WorldWideScience

Sample records for automatic data processing

  1. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by Large Helical Device (LHD) experiments is supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist among the data; i.e., in many cases, the calculation of one dataset requires others. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data is registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and their number can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
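
    A hedged sketch of the dependency-driven calculation described above (dataset names, the registry dictionary, and the calculate function are hypothetical, not the AutoAna code): the transitive dependencies of a requested quantity are collected and calculated in topological order before the quantity itself.

        # Hedged sketch of dependency-aware calculation of diagnostic data
        # (hypothetical registry and calculator names; not the AutoAna implementation).
        from graphlib import TopologicalSorter

        # Which quantities each derived quantity depends on (illustrative names).
        dependencies = {
            "te_profile": ["thomson_raw"],
            "ti_profile": ["cxs_raw"],
            "density_profile": ["interferometer_raw"],
            "transport_coeff": ["te_profile", "ti_profile", "density_profile"],
        }

        def calculate(name, shot):
            """Placeholder for the analysis program that produces one dataset."""
            print(f"calculating {name} for shot {shot}")

        def resolve_and_calculate(target, shot):
            """Calculate `target` and everything it depends on, dependencies first."""
            needed, stack = {target}, [target]
            while stack:  # collect the transitive dependencies
                for dep in dependencies.get(stack.pop(), []):
                    if dep not in needed:
                        needed.add(dep)
                        stack.append(dep)
            graph = {n: dependencies.get(n, []) for n in needed}
            for name in TopologicalSorter(graph).static_order():
                calculate(name, shot)

        resolve_and_calculate("transport_coeff", shot=123456)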

  2. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program, ''CRITEST'', written in PL/1 for the EC computer and intended for automatic processing of the results of radioimmunological research has been developed. The program works under the operating system of the EC computer and runs in a 60 kbyte region. A modified Aitken algorithm was used in writing the program. The program was clinically validated in the determination of a number of hormones: CTH, T4, T3, and TSH. Automatic processing of radioimmunological research data on the computer makes it possible to simplify the labour-consuming analysis and to raise its accuracy.

  3. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  4. The Use of Small Business Administration Section 8(a) Contractors in Automatic Data Processing Acquisitions.

    Science.gov (United States)

    1992-11-25

    Office of the Inspector General report: The Use of Small Business Administration Section 8(a) Contractors in Automatic Data Processing Acquisitions. Abbreviations in the report include OMB (Office of Management and Budget), SADBU (Office of Small and Disadvantaged Business Utilization), and SBA (Small Business Administration).

  3. Identification and structuring of data for automatic processing

    International Nuclear Information System (INIS)

    Wohland, H.; Rexer, G.; Ruehle, R.

    1976-01-01

    The data structure of a technical and scientific application system is described. The description of the structure is divided into different sections in which the user can describe his own data. By fixing a section of this structure, a high degree of automation of the problem-solving process can be achieved while preserving flexibility. (orig.)

  6. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms, and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
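
    As a rough illustration of the greedy strategy named above (a generic sketch, not the paper's implementation), the code below fixes one configurable node at a time, keeping the locally best option under a placeholder fitness function; the node and option structures and the fitness routine are all assumptions.

        # Hedged sketch of a greedy configuration strategy
        # (hypothetical names and placeholder fitness; not the paper's code).
        def fitness(model_config, event_log):
            """Placeholder: score in [0, 1] for how well the configured model
            replays the event log (e.g. a replay-fitness measure)."""
            raise NotImplementedError

        def greedy_configure(configurable_nodes, options_per_node, event_log, base_config):
            """Fix one configurable node at a time, keeping the locally best option."""
            config = dict(base_config)
            for node in configurable_nodes:
                best_option, best_score = None, float("-inf")
                for option in options_per_node[node]:
                    candidate = {**config, node: option}
                    score = fitness(candidate, event_log)
                    if score > best_score:
                        best_option, best_score = option, score
                config[node] = best_option
            return config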

  7. Automatic Geometric Processing for Very High Resolution Optical Satellite Data Based on Vector Roads and Orthophotos

    Directory of Open Access Journals (Sweden)

    Peter Pehani

    2016-04-01

    In response to the increasing need for fast satellite image processing, SPACE-SI developed STORM—a fully automatic image processing chain that performs all processing steps from the input optical images to web-delivered map-ready products for various sensors. This paper focuses on the automatic geometric corrections module and its adaptation to very high resolution (VHR) multispectral images. In the automatic ground control points (GCPs) extraction sub-module, a two-step algorithm that utilizes vector roads as a reference layer and delivers GCPs for high resolution RapidEye images with near-pixel accuracy was initially implemented. Super-fine positioning of individual GCPs onto an aerial orthophoto was introduced for VHR images. The enhanced algorithm is capable of achieving an accuracy of approximately 1.5 pixels on WorldView-2 data. In the case of RapidEye images, the accuracies of the physical sensor model reach sub-pixel values at independent check points. When compared to the reference national aerial orthophoto, the accuracies of WorldView-2 orthoimages automatically produced with the rational function model reach near-pixel values. On a heterogeneous set of 41 RapidEye images the rate of automatic processing reached 97.6%. Image processing times remained under one hour for standard-size images of both sensor types.

  8. Automatic Near-Real-Time Image Processing Chain for Very High Resolution Optical Satellite Data

    Science.gov (United States)

    Ostir, K.; Cotar, K.; Marsetic, A.; Pehani, P.; Perse, M.; Zaksek, K.; Zaletelj, J.; Rodic, T.

    2015-04-01

    In response to the increasing need for automatic and fast satellite image processing, SPACE-SI has developed and implemented a fully automatic image processing chain, STORM, that performs all processing steps from sensor-corrected optical images (level 1) to web-delivered map-ready images and products without operator intervention. Initial development was tailored to high resolution RapidEye images, and all crucial and most challenging parts of the planned full processing chain were developed: a module for automatic image orthorectification based on a physical sensor model and supported by an algorithm for automatic detection of ground control points (GCPs); an atmospheric correction module; a topographic corrections module that combines a physical approach with the Minnaert method and utilizes an anisotropic illumination model; and modules for high level products generation. Various parts of the chain were also implemented for WorldView-2, THEOS, Pleiades, SPOT 6, Landsat 5-8, and PROBA-V. Support for the full-frame sensor currently under development by SPACE-SI is planned. The proposed paper focuses on the adaptation of the STORM processing chain to very high resolution multispectral images. The development concentrated on the sub-module for automatic detection of GCPs. The initially implemented two-step algorithm, which worked only with rasterized vector roads and delivered GCPs with sub-pixel accuracy for the RapidEye images, was improved with the introduction of a third step: super-fine positioning of each GCP based on a reference raster chip. The added step exploits the high spatial resolution of the reference raster to improve the final matching results and to achieve pixel accuracy also on very high resolution optical satellite data.

  9. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
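
    As a hedged illustration of the kind of scaling decision described above (the thresholds and worker-capacity figures are assumptions, not the authors' algorithm), the sketch below sizes a worker pool from the length of the pending geoprocessing queue.

        # Minimal auto-scaling sketch (assumed parameters; not the paper's implementation).
        def target_workers(pending_tasks, tasks_per_worker=4, min_workers=2, max_workers=32):
            """Return the number of workers to run for the current geoprocessing queue."""
            desired = -(-pending_tasks // tasks_per_worker)  # ceiling division
            return max(min_workers, min(max_workers, desired))

        # Example: 50 queued DEM-interpolation tasks suggest scaling out to 13 workers.
        print(target_workers(pending_tasks=50))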

  10. Automatic processing of high-rate, high-density multibeam echosounder data

    Science.gov (United States)

    Calder, B. R.; Mayer, L. A.

    2003-06-01

    Multibeam echosounders (MBES) are currently the best way to determine the bathymetry of large regions of the seabed with high accuracy. They are becoming the standard instrument for hydrographic surveying and are also used in geological studies, mineral exploration and scientific investigation of the earth's crustal deformations and life cycle. The significantly increased data density provided by an MBES has significant advantages in accurately delineating the morphology of the seabed, but comes with the attendant disadvantage of having to handle and process a much greater volume of data. Current data processing approaches typically involve (computer aided) human inspection of all data, with time-consuming and subjective assessment of all data points. As data rates increase with each new generation of instrument and required turn-around times decrease, manual approaches become unwieldy and automatic methods of processing essential. We propose a new method for automatically processing MBES data that attempts to address concerns of efficiency, objectivity, robustness and accuracy. The method attributes each sounding with an estimate of vertical and horizontal error, and then uses a model of information propagation to transfer information about the depth from each sounding to its local neighborhood. Embedded in the survey area are estimation nodes that aim to determine the true depth at an absolutely defined location, along with its associated uncertainty. As soon as soundings are made available, the nodes independently assimilate propagated information to form depth hypotheses which are then tracked and updated on-line as more data is gathered. Consequently, we can extract at any time a "current-best" estimate for all nodes, plus co-located uncertainties and other metrics. The method can assimilate data from multiple surveys, multiple instruments or repeated passes of the same instrument in real-time as data is being gathered. The data assimilation scheme is
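
    A hedged sketch of the information-assimilation idea described above: each estimation node combines incoming soundings by inverse-variance weighting to maintain a current-best depth and its uncertainty. This is a generic formulation under stated assumptions, not the authors' exact algorithm.

        # Minimal sketch of a depth-estimation node assimilating soundings online
        # (generic inverse-variance update; not the authors' exact method).
        class DepthNode:
            def __init__(self):
                self.depth = None      # current-best depth estimate (m)
                self.variance = None   # uncertainty of the estimate (m^2)

            def assimilate(self, sounding_depth, sounding_variance):
                if self.depth is None:
                    self.depth, self.variance = sounding_depth, sounding_variance
                    return
                # Combine the existing estimate and the new sounding,
                # weighting each by the inverse of its variance.
                w_node = 1.0 / self.variance
                w_obs = 1.0 / sounding_variance
                self.depth = (w_node * self.depth + w_obs * sounding_depth) / (w_node + w_obs)
                self.variance = 1.0 / (w_node + w_obs)

        node = DepthNode()
        node.assimilate(52.3, 0.04)   # depth in metres, variance in m^2
        node.assimilate(52.1, 0.09)
        print(round(node.depth, 2), round(node.variance, 3))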

  11. Processing IMS data automatically: A case study of the Chelyabinsk bolide

    Science.gov (United States)

    Arrowsmith, S.; Marcillo, O. E.; Blom, P. S.; Whitaker, R. W.; Randall, G. E.

    2013-12-01

    We present automatic algorithms for detection, association, and location of infrasound events using the International Monitoring System (IMS) infrasound network. Each algorithm is based on probabilistic considerations that formally account for uncertainties at both the station and network levels. Our method is applied to two days of data that include infrasound signals from the Chelyabinsk bolide. We summarize the automatic detections, global association and localization of the bolide and discuss steps we are taking to improve the methodology based on these results.

  12. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale data collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data is growing exponentially over time, this reviewing procedure may not be viable anymore in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state of the art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential
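
    As a hedged illustration of the kinematic triggers mentioned above (a generic example with assumed thresholds, not the project's actual trigger set), the sketch below flags samples where longitudinal deceleration exceeds a harsh-braking threshold.

        # Minimal sketch of a kinematic trigger for harsh braking
        # (generic thresholds; not the euroFOT project's actual triggers).
        def harsh_braking_events(speeds_mps, dt=0.1, threshold_mps2=-4.0):
            """Return sample indices where deceleration is stronger than the threshold."""
            events = []
            for i in range(1, len(speeds_mps)):
                accel = (speeds_mps[i] - speeds_mps[i - 1]) / dt
                if accel < threshold_mps2:
                    events.append(i)
            return events

        # Example: a vehicle braking hard between samples 3 and 5.
        speeds = [20.0, 20.0, 19.8, 18.9, 17.5, 16.0, 15.9]
        print(harsh_braking_events(speeds))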

  13. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provided dose-response curves which showed linearity with the use of the logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach was shown to be useful for the automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of radioimmunoassay data for insulin was found to be satisfactory. (auth.)
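
    The logit-log linearization associated with Rodbard's approach can be sketched as below: the bound fraction is logit-transformed and regressed against log concentration, then the fit is inverted to read off unknowns. This is a generic illustration with made-up calibration values, not the authors' program.

        # Hedged sketch of logit-log linearization of a radioimmunoassay standard curve
        # (generic illustration with made-up values; not the authors' program).
        import numpy as np

        def logit(p):
            return np.log(p / (1.0 - p))

        # Standard curve: known concentrations and bound fractions B/B0 (illustrative values).
        conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # e.g. ng/mL
        b_over_b0 = np.array([0.85, 0.72, 0.55, 0.38, 0.22])

        # Fit logit(B/B0) as a linear function of log(concentration).
        slope, intercept = np.polyfit(np.log(conc), logit(b_over_b0), 1)

        def estimate_concentration(b_over_b0_unknown):
            """Invert the fitted line to estimate the concentration of an unknown sample."""
            return float(np.exp((logit(b_over_b0_unknown) - intercept) / slope))

        print(round(estimate_concentration(0.5), 2))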

  14. On automatic data processing and well-test analysis in real-time reservoir management applications

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Stig

    2011-06-15

    The use of pressure and rate sensors for continuous measurements in oil and gas wells is becoming more common. This provides more and better measurements in real time that can be analyzed to optimize the extraction of oil and gas. One analysis that can provide valuable information on oil and gas production is transient analysis. In transient analysis, the pressure build-up in a well when it is shut in is analyzed, and parameters that describe the flow of oil and gas in the reservoir are estimated. However, it is very time consuming to manage and analyze real-time data, and the result is often that only a limited amount of the available data is analyzed. It is therefore desirable to have more effective methods to analyze real-time data from oil and gas wells. Olsen automated transient analysis in order to extract the information in real-time data in an efficient and labor-saving manner. The analysis must be initialized with well- and reservoir-specific data, but once this is done, the analysis is performed automatically each time the well is shut in. For each shut-in, parameters that describe the flow of oil and gas in the reservoir are estimated. With repeated shut-ins, time series of the estimated parameters emerge. One of the goals of the automated transient analysis is to detect any changes in these time series so that the engineers can focus on the analysis results that deviate from normal. As part of this work it was also necessary to develop automated data filters for noise removal and data compression. The filters are designed to continuously filter the data using methods that are optimized for the typical pressure and rate signals measured in oil and gas wells. In the thesis, Olsen shows examples of the use of automated data filtering and automated transient analysis on both synthetic data and real data from a field in the North Sea. (AG)
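
    A hedged sketch of one step in such an automated workflow: detecting shut-in periods from a rate signal so that the corresponding pressure build-up can be passed to transient analysis. The thresholds and data layout are assumptions, not taken from the thesis.

        # Minimal sketch of shut-in detection from a flow-rate signal
        # (assumed thresholds and data layout; not the thesis implementation).
        def detect_shut_ins(rates, rate_threshold=1.0, min_samples=30):
            """Return (start, end) index pairs where the rate stays below the threshold."""
            shut_ins, start = [], None
            for i, q in enumerate(rates):
                if q < rate_threshold and start is None:
                    start = i
                elif q >= rate_threshold and start is not None:
                    if i - start >= min_samples:
                        shut_ins.append((start, i))
                    start = None
            if start is not None and len(rates) - start >= min_samples:
                shut_ins.append((start, len(rates)))
            return shut_ins

        # Each detected interval can then be matched with the pressure build-up
        # over the same samples and fed to the transient-analysis routine.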

  15. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    Science.gov (United States)

    2017-01-01

    The analysis uses the National Oceanic and Atmospheric Administration (NOAA) tides and currents application programming interface (API): http://tidesandcurrents.noaa.gov/api/. AIS data files, organized by location, were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library (McKinney 2012). Cited works include: McKinney, W. 2012. Python for Data Analysis. Sebastopol, CA: O'Reilly Media, Inc.; Mitchell, K. N. April 2012. A review of coastal navigation asset

  16. From sequencer to supercomputer: an automatic pipeline for managing and processing next generation sequencing data.

    Science.gov (United States)

    Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun

    2012-01-01

    Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.

  17. Automatic data-processing equipment of moon mark of nail for verifying some experiential theory of Traditional Chinese Medicine.

    Science.gov (United States)

    Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan

    2016-04-29

    Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods: inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape of the moon marks on the nails, and changes to it, are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM which lacks support from statistical data. The aim was to verify some experiential theories on moon marks in TCM with automatic data-processing equipment. This paper proposes equipment that utilizes image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with that gathered from the health and mental status questionnaire in each test. The equipment has a simple design, a low cost, and an optimized algorithm. In practice it has been proven to quickly complete automatic acquisition and preservation of key data about moon marks. In the future, some conclusions will likely be obtained from these data; some changes of moon marks related to specific pathological changes will be established with statistical methods.

  18. Automatic processing, quality assurance and serving of real-time weather data

    Science.gov (United States)

    Williams, Matthew; Cornford, Dan; Bastin, Lucy; Jones, Richard; Parker, Stephen

    2011-03-01

    Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real time data such as temperature, rainfall and wind speed can now be obtained for most of the world. Despite the abundance of available data, the production of usable information about the weather in individual local neighbourhoods requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this instance, this allows a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods provided by the INTAMAP project. A simplified example illustrates how the INTAMAP web processing service can be employed as part of a quality control procedure to estimate the bias and residual variance of user contributed temperature observations, using a reference standard based on temperature observations with carefully controlled quality. We also consider how the uncertainty introduced by the interpolation can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
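
    A hedged sketch of the quality-control idea in the example above: compare user-contributed temperature observations against interpolated reference values at the same locations, and estimate the contributor's bias and residual variance. The interpolation call is a placeholder, not the actual INTAMAP web processing service API.

        # Hedged sketch of estimating bias and residual variance of user-contributed
        # observations against an interpolated reference (placeholder interpolator,
        # not the INTAMAP service API).
        import statistics

        def interpolate_reference(location):
            """Placeholder: return the reference temperature predicted at `location`
            by an interpolation service trained on quality-controlled stations."""
            raise NotImplementedError

        def contributor_quality(observations):
            """observations: list of (location, observed_temperature) pairs."""
            residuals = [obs - interpolate_reference(loc) for loc, obs in observations]
            bias = statistics.mean(residuals)
            residual_variance = statistics.pvariance(residuals, mu=bias)
            return bias, residual_variance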

  19. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement is shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.
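
    As a hedged illustration of the quasiconformal kernel idea (a generic formulation, not the paper's exact construction), a base kernel k(x, y) can be reshaped as c(x) c(y) k(x, y) with a positive factor function c; the sketch below builds such a modified Gram matrix over an RBF kernel.

        # Hedged sketch of a quasiconformal transformation of an RBF kernel
        # (generic form c(x) * c(y) * k(x, y); not the paper's exact construction).
        import numpy as np

        def rbf_kernel(X, Y, gamma=0.5):
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def quasiconformal_gram(X, factor):
            """Gram matrix of the quasiconformally transformed kernel.

            `factor` is any callable mapping a sample to a positive scalar
            (e.g. emphasizing the region around boundary samples)."""
            K = rbf_kernel(X, X)
            c = np.array([factor(x) for x in X])
            return (c[:, None] * c[None, :]) * K

        X = np.random.default_rng(0).normal(size=(5, 3))
        K_tilde = quasiconformal_gram(X, factor=lambda x: 1.0 + np.linalg.norm(x))
        print(K_tilde.shape)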

  20. Automatic defect detection for TFT-LCD array process using quasiconformal kernel support vector data description.

    Science.gov (United States)

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement is shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  1. To the problem of topological optimization of data processing and transmission networks in development of the automatic control system ''Atom''

    International Nuclear Information System (INIS)

    Gal'berg, V.P.

    1981-01-01

    Some optimization problems occurring in the development of the automatic control system (ACS) of a commercial amalgamation (ACS-ATOM) are considered, in particular assessments of the economically optimal structure for locating computation centres and means of data transmission.

  2. Selective and validated data processing techniques for performance improvement of automatic lines

    Directory of Open Access Journals (Sweden)

    D’Aponte Francesco

    2016-01-01

    Optimization of the data processing techniques for accelerometers and force transducers made it possible to obtain information about the actions needed to improve the behavior of the cutting stage of a converting machine for diaper production. In particular, different mechanical configurations have been studied and compared in order to reduce the stresses due to the impacts between knives and anvil, to obtain clean and accurate cuts, and to reduce wear of the knives themselves. Reducing the uncertainty of the measurements allowed the best configuration of the pneumatic system that realizes the coupling between anvil and knife to be correctly identified. The size of the pipes, the working pressure, and the type of fluid used in the coupling system have been examined. Experimental results obtained by means of acceleration and force measurements allowed the geometry of the pushing device and the working pressure range of the hydraulic fluid to be identified in a reproducible and coherent way. The remarkable reduction of knife and anvil vibrations is expected to strongly reduce wear of the cutting stage components.

  3. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    Science.gov (United States)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detection of a river's feeding type is a complex and multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path. At the same time it should consider the relationship of the feeding type with the corresponding phase of the water regime. Due to these difficulties and the complexity of the approach, there are many different variants of separating the flow hydrograph into feeding types. The most common method is extraction of a so-called base component which in one way or another reflects the groundwater feeding of the river. In this case, the separation is most often based on the principle of local minima or on graphical separation of this component. However, neither the origin of the water nor the corresponding phase of the water regime is then considered. In this paper, the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia having a pronounced spring flood formed by melt water, and summer-autumn and winter low water which is periodically interrupted by rain or thaw flooding. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin. This technique is considered for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles: • Ground feeding during the passage of the flood peak tends to zero • The beginning of the flood is determined as the exceeding of a critical low-water discharge • Flood periods are determined on the basis of exceeding the critical low-water discharge; they relate to thaw in case of above-zero temperatures • During thaw and rain floods
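
    As a hedged illustration of the local-minima principle mentioned above (a generic baseflow-separation sketch, not the authors' genetic method), the code below picks local minima of a daily discharge series over fixed windows and interpolates between them to form a base component.

        # Minimal local-minima baseflow separation sketch
        # (generic illustration; not the authors' genetic separation method).
        import numpy as np

        def baseflow_local_minima(discharge, window=5):
            """Interpolate between windowed local minima of a daily discharge series."""
            q = np.asarray(discharge, dtype=float)
            idx, vals = [], []
            for start in range(0, len(q), window):
                block = q[start:start + window]
                i_min = start + int(np.argmin(block))
                idx.append(i_min)
                vals.append(q[i_min])
            base = np.interp(np.arange(len(q)), idx, vals)
            return np.minimum(base, q)  # baseflow cannot exceed total flow

        q = [5, 6, 9, 20, 14, 8, 6, 5, 5, 7, 12, 9, 6, 5, 4]
        print(baseflow_local_minima(q, window=5).round(1))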

  4. Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data

    Science.gov (United States)

    Parida, G.; Rajan, K. S.

    2017-05-01

    The current methods of object segmentation, extraction, and classification from aerial LiDAR data are manual and tedious tasks. This work proposes a technique for segmenting objects out of LiDAR data. A bottom-up, geometric rule based approach was used initially to devise a way to segment buildings out of the LiDAR datasets. For curved wall surfaces, comparison of localized surface normals was used to segment buildings. The algorithm has been applied both to synthetic datasets and to a real world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of building objects from a given scene in the case of the synthetic datasets and promising results in the case of real world data. The advantage of the proposed work is that it does not depend on any form of data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training as seen in supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof. The focus on extracting the walls to reconstruct the buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to get 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to get footprints of buildings in urban landscapes, helping in urban planning and the smart cities endeavour.

  5. AUTORED - the JADE automatic data reduction system

    International Nuclear Information System (INIS)

    Whittaker, J.B.

    1984-07-01

    The design, implementation, and operating experience of an automatic data processing system for the reduction of data from the JADE experiment at DESY are described. The central elements are a database and a job submitter, which combine powerfully to minimise the need for manual intervention. (author)

  6. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing basis data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate the provision of large-scale topographic base maps by the Geospatial Information

  7. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations, and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involve both kinds of processes. Automatic processes govern the recognition of advertising stimuli and the relevance decision which determines further higher-level processing...

  8. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics were introduced in our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of the metrics with the delinquency rates of 2016. A one-tailed Student t test was used for statistical analysis. With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completions of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rushing to complete critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
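
    A hedged sketch of the auditing idea described above: compare each task's due time against the current time or its completion timestamp and route late or at-risk tasks to a triage team. The field names, margins, and routing are hypothetical, not taken from the paper.

        # Minimal sketch of a completion-timestamp audit with delinquency alerts
        # (hypothetical field names and margins; not the QMAP implementation).
        from datetime import datetime, timedelta

        def audit_tasks(tasks, now=None, warn_margin=timedelta(hours=24)):
            """tasks: list of dicts with 'name', 'due', and optional 'completed' datetimes.
            Returns (late, at_risk) task-name lists for routing to a triage team."""
            now = now or datetime.now()
            late, at_risk = [], []
            for t in tasks:
                done = t.get("completed")
                if done is not None:
                    if done > t["due"]:
                        late.append(t["name"])
                elif now > t["due"]:
                    late.append(t["name"])
                elif now > t["due"] - warn_margin:
                    at_risk.append(t["name"])
            return late, at_risk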

  9. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer. He outlines the specific role of learning in various manifestations of intelligence. Then, based on Markov algorithm theory, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are then addressed: firstly, an 'algorithm-teacher dialogue' type, sanction-based algorithm which aims at learning how to resolve grammatical ambiguities in submitted texts; secondly, an algorithm related to a documentation system which automatically structures semantic data obtained from a set of texts in order to be able to answer, by reference, any question on the content of these texts.

  10. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files), the infrastructure to store (up to 40 TB between results and intermediate files) and process the data using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
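
    A hedged sketch of the flow-accumulation criterion mentioned above (a generic, simplified D8 illustration on a tiny grid, not the IGN-ES production workflow): compute steepest-descent flow directions, accumulate upstream cell counts, and keep cells whose accumulation exceeds a threshold as river cells.

        # Minimal D8 flow-accumulation sketch for river-cell extraction
        # (generic illustration, diagonal distance weighting omitted for brevity;
        # not the IGN-ES production workflow).
        import numpy as np

        def d8_flow_accumulation(dem):
            rows, cols = dem.shape
            acc = np.ones_like(dem, dtype=float)
            # Visit cells from highest to lowest so upstream counts are final
            # before being passed downstream.
            for flat in np.argsort(dem, axis=None)[::-1]:
                r, c = divmod(flat, cols)
                best, target = 0.0, None
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                            drop = dem[r, c] - dem[rr, cc]
                            if drop > best:
                                best, target = drop, (rr, cc)
                if target is not None:
                    acc[target] += acc[r, c]
            return acc

        dem = np.array([[5.0, 4.5, 4.0],
                        [4.8, 4.0, 3.5],
                        [4.6, 3.8, 3.0]])
        acc = d8_flow_accumulation(dem)
        rivers = acc >= 3  # contributing-cell threshold (illustrative)
        print(rivers.astype(int))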

  11. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files), the infrastructure to store (up to 40 TB between results and intermediate files) and process the data using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; the stability of the software (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  12. Advanced method for automatic processing of seismic and infra-sound data; Methodes avancees de traitement automatique de donnees sismiques et infrasoniques

    Energy Technology Data Exchange (ETDEWEB)

    Cansi, Y.; Crusem, R. [CEA Centre d`Etudes de Limeil, 94 - Villeneuve-Saint-Georges (France)

    1997-11-01

    Governmental organizations have expressed their need for rapid and precise information in the two main fields covered by operational seismology: major earthquake alerts and the detection of nuclear explosions. To satisfy both of these constraints, it is necessary to implement increasingly elaborate automated methods for processing the data. Automatic processing methods are mainly based on the following elementary steps: detection of a seismic signal on a recording; identification of the type of wave associated with the signal; linking the different detected arrivals to the same seismic event; and localization of the source, which also determines the characteristics of the event. Otherwise, two main categories of processing may be distinguished: methods suitable for large-aperture networks, which are characterized by single-channel processing for detection and identification, and antenna-type methods which are based on searching for coherent signals at the scale of the network. Within the two main fields of research mentioned here, our effort has focused on regional-scale seismic waves in relation to large-aperture networks as well as on detection techniques using a mini-network (antenna). We have taken advantage of an extensive set of examples in order to implement an automatic procedure for identifying regional seismic waves on single-channel recordings. With the mini-networks, we have developed a novel method, universally applicable and successfully applied to various types of recording (e.g. seismic, micro-barometric, etc.) and to networks adapted to different wavelength bands. (authors) 7 refs.
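
    For the signal-detection step listed above, a common single-channel approach (offered here as a generic illustration, not necessarily the authors' detector) is the short-term-average over long-term-average (STA/LTA) ratio, which flags samples where recent signal energy rises sharply above the background level.

        # Generic STA/LTA detection sketch for a single-channel recording
        # (a common approach with centred averaging windows for brevity;
        # not necessarily the authors' method).
        import numpy as np

        def sta_lta(trace, sta_len=50, lta_len=500, threshold=3.0):
            """Return sample indices where STA/LTA exceeds the trigger threshold."""
            x2 = np.asarray(trace, dtype=float) ** 2
            sta = np.convolve(x2, np.ones(sta_len) / sta_len, mode="same")
            lta = np.convolve(x2, np.ones(lta_len) / lta_len, mode="same")
            ratio = sta / np.maximum(lta, 1e-12)
            return np.flatnonzero(ratio > threshold)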

  13. Advances in automatic data analysis capabilities

    International Nuclear Information System (INIS)

    Benson, J.; Bipes, T.; Udpa, L.

    2009-01-01

    Utilities perform eddy current tests on nuclear power plant steam generator (SG) tubes to detect degradation. This paper summarizes the Electric Power Research Institute (EPRI) research to develop signal-processing algorithms that automate the analysis of eddy current test data. The research focuses on analyzing rotating probe and array probe data for detecting, classifying, and characterizing degradation in SG tubes. Automated eddy current data analysis systems for bobbin coil probe data have been available for more than a decade. However, automated data analysis systems for rotating and array probes have developed slowly because of the complexities of the inspection parameters associated with the data. Manual analysis of rotating probe data has been shown to be inconsistent and time consuming when flaw depth profiles are generated. Algorithms have been developed for detection of most common steam generator degradation mechanisms. Included in the latest version of the developed software is the ability to perform automated defect profiling, which is useful in tube integrity determinations. Profiling performed manually can be time consuming, whereas automated profiling is performed in a fraction of the time and is much more repeatable. Recent advances in eddy current probe development have resulted in an array probe design capable of high-speed data acquisition over the full length of SG tubes. Probe qualification programs have demonstrated that array probes are capable of providing similar degradation detection capabilities to the rotating probe technology. However, to date, utilities have not used the array probe in the field on a large-scale basis due to the large amount of data analyst resources and time required to process the vast quantity of data generated by the probe. To address this obstacle, EPRI initiated a program to develop automatic data analysis algorithms for rotating and array probes. During the development process for both rotating and array

  14. Control of automatic processes: A parallel distributed-processing model of the stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1988-06-16

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a process and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning.

  15. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemes accounting for geometric and semantic data has been largely motivated by the advances of the last decade in mobile laser scanning technology; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (CityGML) and mobile mapping technology to provide the features that compose the city model. This paper focuses on traffic signs, discussing their characterization using CityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the task of automatic detection and classification.

  16. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values....

  17. Automatic transformations in the inference process

    Energy Technology Data Exchange (ETDEWEB)

    Veroff, R. L.

    1980-07-01

    A technique for incorporating automatic transformations into processes such as the application of inference rules, subsumption, and demodulation provides a mechanism for improving search strategies for theorem proving problems arising from the field of program verification. The incorporation of automatic transformations into the inference process can alter the search space for a given problem, and is particularly useful for problems having broad rather than deep proofs. The technique can also be used to permit the generation of inferences that might otherwise be blocked and to build some commutativity or associativity into the unification process. Appropriate choice of transformations, and new literal clashing and unification algorithms for applying them, showed significant improvement on several real problems according to several distinct criteria. 22 references, 1 figure.

  18. Data processing

    CERN Document Server

    Fry, T F

    2013-01-01

    Data Processing discusses the principles, practices, and associated tools in data processing. The book comprises 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with data processing practice; this part discusses data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer languages and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  19. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

    A description is given of a process for automatically analysing separate signal processing channels in real time, one channel per signal, in a facility with significant background noise in signals that vary in time and come from transducers at selected points for the continuous monitoring of the operating conditions of the various components of the installation. The signals are intended to reveal potential breakdowns, to support conclusions as to the severity of these potential breakdowns, and to indicate to an operator the measures to be taken in consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for the purpose of spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at pre-determined frequencies, the automatic comparison of the energy spectral density data of each channel with pre-determined sets of limits varying with frequency, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and the measures to be taken depending on the set of limits.

  20. Development of automatic techniques for GPS data management

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    It is necessary for a GPS center to establish automation for the effective management of a GPS network, including data gathering, data transformation, data backup, data sending to the IGS (International GPS Service for Geodynamics), and the gathering of precise ephemerides. The operating programs of GPS centers have been adopted at KCSC (Korea Cadastral Survey Corporation), NGI (National Geography Institute), and MOMAF (Ministry of Maritime Affairs and Fisheries) without in-house development of the core techniques. Automatic management of a GPS network consists of GPS data management and data processing. It is also a fundamental technique which should be mastered by every GPS center. Therefore, this study analyzed the Japanese GPS center, which has accomplished automation by modules, considering the applicability for domestic GPS centers.

  1. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using a community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of

  2. Automatic recognition of lactating sow behaviors through depth image processing

    Science.gov (United States)

    Manual observation and classification of animal behaviors is laborious, time-consuming, and of limited ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shiftin...

  3. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    This paper introduces research done on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan where clean-up parcels are least numerous, considering it to be an optimal spatial configuration. The second model drifts the clean-up parcels of a work plan both vertically and horizontally, following a grid pattern with a sampling distance of a fifth of a parcel width, and keeps the most optimal drifted version, again with the aim of reducing the final number of parcel features. The last model aims at drawing a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic, optimized plan generation (parcels and navigation lines). Applying the first model we demonstrated that, depending on the size and geometry of the features of the contaminated area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when 1/ the individual features of the contaminated area present a significant orientation in their geometry (features are long), 2/ the size of the pollution extent features becomes closer to the size of the parcels (scale effect). The second model shows only a 1% difference in the variation of feature number; so this last is less interesting for
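
    A hedged sketch of the orientation test in the first model described above (the parcel size, grid layout, and shapely-based implementation are assumptions, not the paper's actual scripts): rotate the pollution geometry into each candidate orientation, lay a regular parcel grid over its bounding box, and keep the orientation that needs the fewest parcels.

        # Hedged sketch of testing parcel-grid orientations over a pollution polygon
        # (assumed parcel size and shapely-based layout; not the paper's scripts).
        from shapely.geometry import box, Polygon
        from shapely.affinity import rotate

        def count_parcels(pollution, angle_deg, parcel_w=50.0, parcel_h=25.0):
            """Rotate the pollution geometry and count grid parcels intersecting it."""
            geom = rotate(pollution, angle_deg, origin="centroid")
            minx, miny, maxx, maxy = geom.bounds
            count = 0
            y = miny
            while y < maxy:
                x = minx
                while x < maxx:
                    if box(x, y, x + parcel_w, y + parcel_h).intersects(geom):
                        count += 1
                    x += parcel_w
                y += parcel_h
            return count

        pollution = Polygon([(0, 0), (400, 0), (400, 60), (0, 60)])  # illustrative extent
        best_angle = min((0, 90, 45, 135), key=lambda a: count_parcels(pollution, a))
        print(best_angle)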

  4. [Use of the Elektronika-T3-16M special-purpose computer for the automatic processing of cytophotometric and cytofluorimetric data].

    Science.gov (United States)

    Loktionov, A S; Prianishnikov, V A

    1981-05-01

    A system has been proposed to provide the automatic analysis of data on: a) point cytophotometry, b) two-wave cytophotometry, c) cytofluorimetry. The system provides input of the data from a photomultiplier to the specialized "Elektronika-T3-16M" computer together with simultaneous statistical analysis of these data. Information on the programs used is presented. The advantages of the system, compared with some commercially available cytophotometers, are indicated.

  5. Data processing

    International Nuclear Information System (INIS)

    Cousot, P.

    1988-01-01

    The 1988 progress report of the Data Processing laboratory (Polytechnic School, France) is presented. The laboratory research fields are: semantics, testing and semantic analysis of codes, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. The research topics also include Pascal codes, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing and pattern recognition. The published papers, the conference communications and the theses are also included [fr

  6. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically isolates any individual image and computes its area and weighted area. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr

  7. Modeling of a data exchange process in the Automatic Process Control System on the base of the universal SCADA-system

    Science.gov (United States)

    Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.

    2016-04-01

    In the present paper the authors discuss some ways of solving energy saving problems in mechanical engineering. In the authors' opinion, one way of solving this problem is the integrated modernization of the power engineering objects of mechanical engineering companies, intended to increase the efficiency of energy supply control and to improve the commercial accounting of electric energy. The authors have proposed the usage of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we have built a mathematical model of the data exchange process between measuring transformers and a universal SCADA-system. The results of modeling show that the discussed equipment corresponds to the mentioned Standard requirements and that the usage of the universal SCADA-system for these purposes is preferable and economically reasonable. In the modeling the authors have used the following software: MasterScada, Master OPC_DI_61850, OPNET.

  8. Control of automatic processes: A parallel distributed-processing account of the Stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1989-11-22

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a processing pathway and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning. This was accomplished by combining the cascade mechanism described by McClelland (1979) with the back propagation learning algorithm (Rumelhart, Hinton, Williams, 1986). The model is able to simulate performance in the standard Stroop task, as well as aspects of performance in variants of this task which manipulate SOA, response set, and degree of practice. In the discussion we contrast our model with other models, and indicate how it relates to many of the central issues in the literature on attention, automaticity, and interference.
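
    As a toy illustration of the cascade idea (continuous activation building toward a response threshold, with interference from the stronger pathway), the following sketch uses invented weights and parameters rather than those of the published simulations: the word-reading pathway is given a larger weight than the color-naming pathway, so an incongruent word slows color naming.

```python
import math

def cycles_to_threshold(word_in, color_in, attend_color,
                        w_word=2.0, w_color=1.0, tau=0.1, theta=0.5):
    """Cascade-style accumulation toward a response threshold (toy parameters)."""
    a = 0.0
    for t in range(1, 1000):
        net = (w_color * color_in * attend_color          # attended color pathway
               + w_word * word_in * (1.0 - attend_color)) # unattended word pathway
        a = (1 - tau) * a + tau * math.tanh(net)          # cascade update
        if a >= theta:
            return t
    return None

congruent = cycles_to_threshold(word_in=+1, color_in=+1, attend_color=0.9)
incongruent = cycles_to_threshold(word_in=-1, color_in=+1, attend_color=0.9)
print(congruent, incongruent)   # incongruent color naming needs more cycles
```

    More training of the color pathway would correspond to a larger w_color, which shrinks the congruent/incongruent difference and mirrors the practice effects the report simulates.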

  9. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
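
    As a small, self-contained example of the greedy family of methods the book covers, the sketch below builds a pairwise (2-way) covering test suite for a handful of invented parameters. It is a simplified, randomized greedy generator, not any specific algorithm from the book.

```python
import itertools
import random

def pairwise_tests(params, candidates_per_round=30, seed=0):
    """Greedy construction of a test set covering every pair of parameter values.
    params: dict mapping parameter name -> list of possible values."""
    rng = random.Random(seed)
    names = list(params)
    uncovered = {(a, va, b, vb)
                 for a, b in itertools.combinations(names, 2)
                 for va in params[a] for vb in params[b]}
    tests = []
    while uncovered:
        best, best_gain = None, -1
        for _ in range(candidates_per_round):
            cand = {n: rng.choice(params[n]) for n in names}
            gain = sum(cand[a] == va and cand[b] == vb
                       for a, va, b, vb in uncovered)
            if gain > best_gain:
                best, best_gain = cand, gain
        if best_gain <= 0:                       # fallback: cover one pair directly
            a, va, b, vb = next(iter(uncovered))
            best = {n: rng.choice(params[n]) for n in names}
            best[a], best[b] = va, vb
        tests.append(best)
        uncovered = {p for p in uncovered
                     if not (best[p[0]] == p[1] and best[p[2]] == p[3])}
    return tests

suite = pairwise_tests({"os": ["linux", "windows"],
                        "db": ["postgres", "mysql", "sqlite"],
                        "browser": ["firefox", "chrome"]})
print(len(suite), "tests instead of", 2 * 3 * 2, "exhaustive combinations")
```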

  10. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  11. Automatic protein structure solution from weak X-ray data

    Science.gov (United States)

    Skubák, Pavol; Pannu, Navraj S.

    2013-11-01

    Determining new protein structures from X-ray diffraction data at low resolution or with a weak anomalous signal is a difficult and often an impossible task. Here we propose a multivariate algorithm that simultaneously combines the structure determination steps. In tests on over 140 real data sets from the protein data bank, we show that this combined approach can automatically build models where current algorithms fail, including an anisotropically diffracting 3.88 Å RNA polymerase II data set. The method seamlessly automates the process, is ideal for non-specialists and provides a mathematical framework for successfully combining various sources of information in image processing.

  12. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

    The design concept applied in the development of a semi-automatic film processing unit needs creativity and user support in channelling the required information to select the materials and operating system that suit the design produced. Low cost and efficient operation are the challenges that need to be met in step with fast technological advancement. In producing this processing unit, there are a few elements which need to be considered in order to produce a high quality image. Consistent movement and correct time coordination for developing and drying are among the elements which need to be controlled. Other elements which need serious attention are temperature, liquid density and the amount of time for the chemical liquids to react. The subsequent chemical reactions that take place will cause the liquid chemicals to age, and this will adversely affect the quality of the image produced. The unit is also equipped with a liquid chemical drainage system and a disposal chemical tank. This unit would be useful in GP clinics, especially in rural areas which still practice manual film developing and require low operational cost. (Author)

  13. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  14. On the Control of Automatic Processes: A Parallel Distributed Processing Model of the Stroop Effect

    Science.gov (United States)

    1988-06-16

    Indexed text for this record consists of OCR fragments from the report's reference list and cover page: F.N. (1973). The Stroop phenomenon and its use in the study of perceptual, cognitive, and response processes. Memory and Cognition, 1, 106-120; Logan, G.D. (1980). Attention and automaticity in Stroop and priming tasks: Theory and data. Cognitive Psychology, 12, 523-553; On the Control of Automatic Processes: A Parallel Distributed Processing Model of the Stroop Effect, Technical Report AIP-40.

  15. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

    The problems of automatic deciphering of a radiographic picture, the purpose of which is to draw a conclusion about the quality of the inspected product on the basis of the product defect images in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained with visual deciphering

  16. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...

  17. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00473067; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas; Javurek, Tomas

    2017-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence...

  18. Automatic processing of multimodal tomography datasets.

    Science.gov (United States)

    Parsons, Aaron D; Price, Stephen W T; Wadeson, Nicola; Basham, Mark; Beale, Andrew M; Ashton, Alun W; Mosselmans, J Frederick W; Quinn, Paul D

    2017-01-01

    With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data that will be collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of multimodal and multidimensional scientific dataset outputs, such as those from chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.

  19. Data Processing

    Science.gov (United States)

    Grangeat, P.

    A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

  20. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive

  1. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    T. Tsikrika (Theodora); C. Diou; A.P. de Vries (Arjen); A. Delopoulos

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the

  2. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid-technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  3. Multichannel display system with automatic sequential output of analog data

    International Nuclear Information System (INIS)

    Bykovskii, Yu.A.; Gruzinov, A.E.; Lagoda, V.B.

    1989-01-01

    The authors describe a device that, with maximum simplicity and autonomy, permits parallel data display from 16 measuring channels with automatic output to the screen of a storage oscilloscope in ∼ 50 μsec. The described device can be used to study the divergence characteristics of the ion component of plasma sources and in optical and x-ray spectroscopy of pulsed processes. Owing to its compactness and autonomy, the device can be located in the immediate vicinity of the detectors (for example, inside a vacuum chamber), which allows the number of vacuum electrical lead-ins and the induction level to be reduced

  4. AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA

    Science.gov (United States)

    Cheeseman, P. C.

    1994-01-01

    The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5

  5. Multimedia data mining for automatic diabetic retinopathy screening.

    Science.gov (United States)

    Quellec, Gwénolé; Lamard, Mathieu; Cochener, Béatrice; Decencière, Etienne; Lay, Bruno; Chabouis, Agnès; Roux, Christian; Cazuguel, Guy

    2013-01-01

    This paper presents TeleOphta, an automatic system for screening diabetic retinopathy in teleophthalmology networks. Its goal is to reduce the burden on ophthalmologists by automatically detecting non referable examination records, i.e. examination records presenting no image quality problems and no pathological signs related to diabetic retinopathy or any other retinal pathology. TeleOphta is an attempt to put into practice years of algorithmic developments from our groups. It combines image quality metrics, specific lesion detectors and a generic pathological pattern miner to process the visual content of eye fundus photographs. This visual information is further combined with contextual data in order to compute an abnormality risk for each examination record. The TeleOphta system was trained and tested on a large dataset of 25,702 examination records from the OPHDIAT screening network in Paris. It was able to automatically detect 68% of the non referable examination records while achieving the same sensitivity as a second ophthalmologist. This suggests that it could safely reduce the burden on ophthalmologists by 56%.

  6. Beyond behaviorism: on the automaticity of higher mental processes.

    Science.gov (United States)

    Bargh, J A; Ferguson, M J

    2000-11-01

    The first 100 years of experimental psychology were dominated by 2 major schools of thought: behaviorism and cognitive science. Here the authors consider the common philosophical commitment to determinism by both schools, and how the radical behaviorists' thesis of the determined nature of higher mental processes is being pursued today in social cognition research on automaticity. In harmony with "dual process" models in contemporary cognitive science, which equate determined processes with those that are automatic and which require no intervening conscious choice or guidance, as opposed to "controlled" processes which do, the social cognition research on the automaticity of higher mental processes provides compelling evidence for the determinism of those processes. This research has revealed that social interaction, evaluation and judgment, and the operation of internal goal structures can all proceed without the intervention of conscious acts of will and guidance of the process.

  7. Fast and automatic thermographic material identification for the recycling process

    Science.gov (United States)

    Haferkamp, Heinz; Burmester, Ingo

    1998-03-01

    Within the framework of future closed-loop recycling processes, the automatic and economical sorting of plastics is a decisive element. The identification and sorting systems available at present are not yet suitable for sorting technical plastics, since essential demands, such as high recognition reliability and identification rates for the wide variety of technical plastics, cannot be guaranteed. Therefore the Laser Zentrum Hannover e.V., in cooperation with Hoerotron GmbH and Preussag Noell GmbH, has carried out investigations on a rapid thermographic, laser-supported material-identification system for automatic material-sorting systems. The automatic identification of different engineering plastics coming from electronic or automotive waste is possible. Identification rates of up to 10 parts per second are achieved by using fast IR line scanners. The procedure is based on the following principle: within a few milliseconds a spot on the relevant sample is heated by a CO2 laser. The samples' different and specific chemical and physical material properties cause different temperature distributions on their surfaces, which are measured by a fast IR line-scan system. This 'thermal impulse response' is then analyzed by means of a computer system. Investigations have shown that it is possible to analyze more than 18 different sorts of plastics at a frequency of 10 Hz. Crucial for the development of such a system are the rapid processing of imaging data, the minimization of interference caused by oscillating sample geometries, and the handling of the wide range of possible additives in the plastics in question. One possible application area is the sorting of plastics coming from car and electronic waste recycling.
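
    To illustrate the 'thermal impulse response' idea, the sketch below classifies a cooling curve (surface temperature versus time after a short laser pulse) by nearest neighbor against reference curves. All reference materials, decay constants and noise levels are invented for the example; a real system would compare measured IR line-scan signatures.

```python
import numpy as np

t = np.linspace(0.0, 0.1, 50)                    # 100 ms observation window
references = {                                    # invented reference signatures
    "ABS": 30 * np.exp(-t / 0.020),
    "PC":  30 * np.exp(-t / 0.035),
    "PVC": 30 * np.exp(-t / 0.050),
}

def identify(curve):
    """Return the reference material whose cooling curve is closest."""
    return min(references, key=lambda m: np.linalg.norm(curve - references[m]))

rng = np.random.default_rng(0)
measured = 30 * np.exp(-t / 0.034) + rng.normal(0.0, 0.3, t.size)
print(identify(measured))                         # expected: PC
```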

  8. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  9. Automatic rebalancing of data in ATLAS distributed data management

    Science.gov (United States)

    Barisits, M.; Serfon, C.; Garonne, V.; Lassnig, M.; Beermann, T.; Javurek, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence also the workload management and production systems. This contribution describes the concept and architecture behind those components and shows the benefits made by the system.

  10. Digital Data Processing of Images

    African Journals Online (AJOL)

    lend themselves to computer storage, this article will only be concerned with the image enhancement of ... digital computer to quantitate organ function after dynamic studies using the gamma camera will also be ... an on-line computer is necessary for the automatic analysis of data. The facility to view the dynamic process ...

  11. Automatic testing with digital image processing

    International Nuclear Information System (INIS)

    Daum, W.; Rose, P.; Preuss, M.; Builtjes, J.H.

    1987-01-01

    Only a small part of the various applications of image processing in nondestructive materials testing could be presented. Digital image processing is an aid in the evaluation of conventional testing methods as well as in the development of new testing methods. By improving images, it increases the expressiveness of visual evaluations and makes time-consuming evaluation processes easier, especially by means of quantitative image analysis. Image processing contributes a great deal to automation through the possibility of interpreting picture information with the help of the computer. (orig./DG) [de

  12. Automatizations processes influence on organizations structure

    Directory of Open Access Journals (Sweden)

    Vace¾ Rastislav

    2003-09-01

    Full Text Available Is the organization structure influenced by processes? If so, to what extent? Is the approach toward organization structures limited to the aspect of hierarchy? This contribution addresses these and similar questions and describes in detail the management of process uncertainty in dependence on the type of organization structure.

  13. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data....

  14. Estimating spatial travel times using automatic vehicle identification data

    Science.gov (United States)

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...

  15. On the Control of Automatic Processes: A Parallel Distributed Processing Account of the Stroop Effect

    Science.gov (United States)

    1989-11-22

    Indexed text for this record consists of OCR fragments from the report documentation page (subject terms include cognitive psychology, modelling, and the Stroop task) and from the reference list: The Stroop phenomenon and its use in the study of perceptual, cognitive, and response processes. Memory and Cognition, 1, 106-120; Fraisse, P. (1969); Logan, G. D. (1980). Attention and automaticity in Stroop and priming tasks: Theory and data. Cognitive Psychology, 12, 523

  16. Automatic/Control Processing and Attention.

    Science.gov (United States)

    1982-04-01

    Indexed text for this record consists of OCR fragments from the report's distribution list (university psychology departments and ONR liaison offices) and a partial abstract: '... subject control, but requires extensive and consistent training to develop. Controlled processing is a ...'

  17. Data system for automatic flux mapping applications

    International Nuclear Information System (INIS)

    Oates, R.M.; Neuner, J.A.; Couch, R.D. Jr.; Kasinoff, A.M.

    1982-01-01

    This patent discloses interface circuitry for coupling the data from neutron flux detectors in a reactor core to microprocessors. This circuitry minimizes the microprocessor time required to accept data and provides a technique for measuring variable-frequency data from the in-core detectors with a minimum amount of hardware and with crystal-controlled accuracy. A frequency link is employed to transmit data with good isolation, and the information is measured using a programmable timer

  18. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

    A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a comput...

  19. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  20. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

    In this presentation the automated material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; one of the basic tools for the computer optimisation of decommissioning waste processing is the set of integral material and radioactivity flow tools; all calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega is an open, modular system that can be improved; improvement of the module for optimising decommissioning waste processing will be performed within the framework of improving material procedures and scenarios.

  1. Automatic retrieval of bone fracture knowledge using natural language processing.

    Science.gov (United States)

    Do, Bao H; Wu, Andrew S; Maley, Joan; Biswal, Sandip

    2013-08-01

    Natural language processing (NLP) techniques to extract data from unstructured text into formal computer representations are valuable for creating robust, scalable methods to mine data in medical documents and radiology reports. As voice recognition (VR) becomes more prevalent in radiology practice, there is opportunity for implementing NLP in real time for decision-support applications such as context-aware information retrieval. For example, as the radiologist dictates a report, an NLP algorithm can extract concepts from the text and retrieve relevant classification or diagnosis criteria or calculate disease probability. NLP can work in parallel with VR to potentially facilitate evidence-based reporting (for example, automatically retrieving the Bosniak classification when the radiologist describes a kidney cyst). For these reasons, we developed and validated an NLP system which extracts fracture and anatomy concepts from unstructured text and retrieves relevant bone fracture knowledge. We implement our NLP in an HTML5 web application to demonstrate a proof-of-concept feedback NLP system which retrieves bone fracture knowledge in real time.
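
    A toy, rule-based sketch of the extraction step (not the authors' system): it spots fracture mentions and nearby anatomy terms in a dictated sentence and looks the pair up in a small knowledge table. The anatomy list and knowledge entries are invented placeholders for a real terminology and knowledge base.

```python
import re

ANATOMY = {"radius", "ulna", "scaphoid", "tibia", "femur"}
KNOWLEDGE = {  # hypothetical entries standing in for curated fracture knowledge
    ("scaphoid", "fracture"): "Retrieve scaphoid fracture classification and management criteria.",
}

def extract_concepts(report):
    """Return (anatomy, 'fracture') pairs found in the dictated text."""
    pairs = []
    for sentence in re.split(r"[.!?]", report.lower()):
        if "fracture" in sentence:
            for bone in ANATOMY & set(re.findall(r"[a-z]+", sentence)):
                pairs.append((bone, "fracture"))
    return pairs

report = "There is an acute nondisplaced fracture of the scaphoid waist."
for concept in extract_concepts(report):
    print(concept, "->", KNOWLEDGE.get(concept, "no entry"))
```

    In a real deployment this would run alongside voice recognition, so that relevant criteria appear while the radiologist is still dictating.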

  2. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  3. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kind of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase, that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error prone way using different software tools. Thus novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  4. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.

    1994-01-01

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  5. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat,; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  6. Improved automatic tuning of PID controller for stable processes.

    Science.gov (United States)

    Kumar Padhy, Prabin; Majhi, Somanath

    2009-10-01

    This paper presents an improved automatic tuning method for stable processes using a modified relay in the presence of static load disturbances and measurement noise. The modified relay consists of a standard relay in series with a PI controller of unity proportional gain. The integral time constant of the PI controller of the modified relay is chosen so as to ensure a minimum loop phase margin of 30°. A limit cycle is then obtained using the modified relay. Thereafter, the PID controller is designed using the limit cycle output data. The derivative time constant is obtained by maintaining the above-mentioned loop phase margin. Minimizing the distance of the Nyquist curve of the loop transfer function from the imaginary axis of the complex plane gives the proportional gain. The integral time constant of the PID controller is set equal to the integral time constant of the PI controller of the modified relay. The effectiveness of the proposed technique is verified by simulation results.
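
    For orientation, the classic relay-feedback route from a measured limit cycle to PID settings (Åström–Hägglund experiment with Ziegler–Nichols rules) can be sketched as below. This is the standard textbook recipe, shown only to make the limit-cycle idea concrete; the paper's modified-relay, phase-margin-based design is not reproduced here, and the numbers are invented.

```python
import math

def pid_from_limit_cycle(relay_amplitude, cycle_amplitude, cycle_period):
    """Ziegler-Nichols PID settings from a relay-feedback limit cycle."""
    ku = 4.0 * relay_amplitude / (math.pi * cycle_amplitude)  # ultimate gain
    kp = 0.6 * ku
    ti = cycle_period / 2.0       # integral time constant
    td = cycle_period / 8.0       # derivative time constant
    return kp, ti, td

# e.g. relay height 1.0, measured oscillation amplitude 0.35, period 12 s
print(pid_from_limit_cycle(1.0, 0.35, 12.0))
```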

  7. The automatic collection and treatment of data for DNC

    International Nuclear Information System (INIS)

    Song Quanxun

    1991-01-01

    The automatic data collection and treatment for DNC (Delayed Neutron Counting) with S-85 MCA (Multi-Channel Analyzers) and a PDP-11/34 computer is described. The principle and function of the software package are introduced in detail

  8. Experience in automatic processing of 340.000 images from ITEF 3-m magnetic spectrometer

    International Nuclear Information System (INIS)

    Dzhelyadin, R.I.; Dukhovskoj, I.A.; Ivanov, L.V.; Kishkurno, V.V.; Krutenkova, A.P.; Kulikov, V.V.; Lyulevich, V.I.; Polikarpov, V.M.; Radkevich, I.A.; Fedorets, V.S.; Fedotov, O.P.

    1974-01-01

    A number of conclusions were made regarding the automatic processing of 340.000 pictures (1.020.000 frames) developed on a three-meter magnetic spectrometer with spark chambers. Possibilities for time optimization of the automatic processing programs are discussed. The results of processing a series of photographs were analysed to compare the parameters of automatic and semi-automatic processing. Some problems relating to the organization and technology of picture processing are also outlined [ru

  9. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  10. Automatic detection and severity measurement of eczema using image processing.

    Science.gov (United States)

    Alam, Md Nafiul; Munia, Tamanna Tabassum Khan; Tavakolian, Kouhyar; Vasefi, Fartash; MacKinnon, Nick; Fazel-Rezai, Reza

    2016-08-01

    Chronic skin diseases like eczema may lead to severe health and financial consequences for patients if not detected and controlled early. Early measurement of disease severity, combined with a recommendation for skin protection and use of appropriate medication, can prevent the disease from worsening. Current diagnosis can be costly and time-consuming. In this paper, an automatic eczema detection and severity measurement model is presented using modern image processing and computer algorithms. The system can successfully detect regions of eczema and classify the identified region as mild or severe based on image color and texture features. The model then automatically measures the skin parameters used in the most common assessment tool, the "Eczema Area and Severity Index (EASI)," by computing an eczema affected-area score, an eczema intensity score, and a body-region score, allowing both patients and physicians to accurately assess the affected skin.
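
    As a reference point, the EASI value itself combines per-region area and intensity scores with fixed body-region weights. The sketch below uses the standard adult weights; the per-region scores are made-up example values, not output of the described model.

```python
# Standard adult EASI body-region weights
REGION_WEIGHTS = {"head_neck": 0.1, "upper_limbs": 0.2, "trunk": 0.3, "lower_limbs": 0.4}

def easi(region_scores):
    """region_scores: region -> (area_score 0-6, intensity_sum 0-12), where
    intensity_sum = erythema + induration + excoriation + lichenification."""
    return sum(REGION_WEIGHTS[region] * area * intensity
               for region, (area, intensity) in region_scores.items())

example = {"head_neck": (1, 3), "upper_limbs": (2, 4),
           "trunk": (0, 0), "lower_limbs": (3, 5)}
print(easi(example))   # 0.1*1*3 + 0.2*2*4 + 0 + 0.4*3*5 = 7.9
```

    Automating the area and intensity scores from image color and texture, as the paper does, is what turns this arithmetic into a hands-off severity measurement.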

  11. Automatic auditory intelligence: an expression of the sensory-cognitive core of cognitive processes.

    Science.gov (United States)

    Näätänen, Risto; Astikainen, Piia; Ruusuvirta, Timo; Huotilainen, Minna

    2010-09-01

    In this article, we present a new view on the nature of cognitive processes suggesting that there is a common core, viz., automatic sensory-cognitive processes that form the basis for higher-order cognitive processes. It has been shown that automatic sensory-cognitive processes are shared by humans and various other species and occur at different developmental stages and even in different states of consciousness. This evidence, based on the automatic electrophysiological change-detection response mismatch negativity (MMN), its magnetoencephalographic equivalent MMNm, and behavioral data, indicates that in audition surprisingly complex processes occur automatically and mainly in the sensory-specific cortical regions. These processes include, e.g. stimulus anticipation and extrapolation, sequential stimulus-rule extraction, and pattern and pitch-interval encoding. Furthermore, these complex perceptual-cognitive processes, first found in waking adults, occur similarly even in sleeping newborns, anesthetized animals, and deeply sedated adult humans, suggesting that they form the common perceptual-cognitive core of cognitive processes in general. Although the present evidence originates mainly from the auditory modality, it is likely that analogous evidence could be obtained from other sensory modalities when measures corresponding to those used in the study of the auditory modality become available.

  12. Automatic detection of interictal spikes using data mining models.

    Science.gov (United States)

    Valenti, Pablo; Cazamajou, Enrique; Scarpettini, Marcelo; Aizemberg, Ariel; Silva, Walter; Kochen, Silvia

    2006-01-15

    For a prospective candidate for epilepsy surgery, both the ictal and interictal spikes (IS) are studied to determine the localization of the epileptogenic zone. In this work, data mining (DM) classification techniques were utilized to build an automatic detection model. The selected DM algorithms are Decision Trees (J4.8) and the statistical Bayesian classifier (naïve model). The main objective was the detection of IS, isolating them from the EEG's background activity. On the other hand, DM has an attractive advantage in such applications, in that the recognition of epileptic discharges does not need a clear definition of spike morphology. Furthermore, previously 'unseen' patterns can be recognized by the DM model with proper 'training'. The results obtained showed that the efficacy of the selected DM algorithms is comparable to the current visual analysis used by the experts. Moreover, DM analysis is faster than the visual analysis of the EEG, so this tool can assist the experts by facilitating the analysis of a patient's information and reducing the time and effort required in the process.
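
    A minimal sketch of training the two classifier families named in the abstract, on made-up EEG window features (amplitude, sharpness, duration); real work of course uses expert-labelled recordings, and J4.8 is approximated here by scikit-learn's CART decision tree.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Invented feature distributions: background EEG windows vs. interictal spikes
background = rng.normal([30, 0.5, 80], [10, 0.2, 20], size=(200, 3))
spikes = rng.normal([120, 3.0, 40], [30, 0.8, 10], size=(200, 3))
X = np.vstack([background, spikes])
y = np.array([0] * 200 + [1] * 200)

for model in (DecisionTreeClassifier(max_depth=3, random_state=0), GaussianNB()):
    print(type(model).__name__, round(model.fit(X, y).score(X, y), 3))
```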

  13. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    Albrecht, R.W.; Crowe, R.D.; McGuire, J.J.

    1978-01-01

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industries, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from availability of the plant. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose parts monitoring system of the Trojan reactor (1130-MW(e)PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  14. Automatic Multimedia Creation Enriched with Dynamic Conceptual Data

    Directory of Open Access Journals (Sweden)

    Angel Martín

    2012-12-01

    Full Text Available There is a growing gap between multimedia production and context-centric multimedia services. The main problem is the under-exploitation of the content creation design. The idea is to support dynamic content generation adapted to the user or display profile. Our work is an implementation of a web platform for automatic generation of multimedia presentations based on the SMIL (Synchronized Multimedia Integration Language) standard. The system is able to produce rich media with dynamic multimedia content retrieved automatically from different content databases matching the semantic context. For this purpose, we extend the standard interpretation of SMIL tags in order to accomplish a semantic translation of multimedia objects into database queries. This permits services to take advantage of the production process to create customized content enhanced with real-time information fed from databases. The described system has been successfully deployed to create advanced context-centric weather forecasts.
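
    A toy sketch of the underlying idea: a semantic context is resolved into a SMIL sequence of media items "retrieved" for that context. The context-to-items lookup is just a hard-coded dictionary standing in for the content databases, and the element names are kept to plain SMIL.

```python
import xml.etree.ElementTree as ET

MOCK_DB = {"weather/madrid": ["sunny_icon.png", "madrid_map.png"]}  # stand-in for a content DB

def render_presentation(context):
    """Build a minimal SMIL document whose media are chosen by semantic context."""
    smil = ET.Element("smil")
    seq = ET.SubElement(ET.SubElement(smil, "body"), "seq")
    for src in MOCK_DB[context]:
        ET.SubElement(seq, "img", src=src, dur="5s")
    return ET.tostring(smil, encoding="unicode")

print(render_presentation("weather/madrid"))
```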

  15. Cognitive effort and pupil dilation in controlled and automatic processes.

    Science.gov (United States)

    Querino, Emanuel; Dos Santos, Lafaiete; Ginani, Giuliano; Nicolau, Eduardo; Miranda, Débora; Romano-Silva, Marco; Malloy-Diniz, Leandro

    2015-01-01

    The Five Digits Test (FDT) is a Stroop paradigm test that aims to evaluate executive functions. It is composed of four parts, two of which are related to automatic and two of which are related to controlled processes. It is known that pupillary diameter increases as the task's cognitive demand increases. In the present study, we evaluated whether the pupillary diameter could distinguish cognitive effort between automated and controlled cognitive processing during the FDT as the task progressed. As a control task, we used a simple reading paradigm with a similar visual aspect as the FDT. We then divided each of the four parts into two blocks in order to evaluate the differences between the first and second half of the task. Results indicated that, compared to a control task, the FDT required higher cognitive effort for each consecutive part. Moreover, the first half of every part of the FDT induced dilation more than the second. The differences in pupil dilation during the first half of the four FDT parts were statistically significant between the parts 2 and 4 (p=0.023), and between the parts 3 and 4 (p=0.006). These results provide further evidence that cognitive effort and pupil diameter can distinguish controlled from automatic processes.

  16. Automaticity of phonological and semantic processing during visual word recognition.

    Science.gov (United States)

    Pattamadilok, Chotiga; Chanoine, Valérie; Pallier, Christophe; Anton, Jean-Luc; Nazarian, Bruno; Belin, Pascal; Ziegler, Johannes C

    2017-04-01

    Reading involves activation of phonological and semantic knowledge. Yet, the automaticity of the activation of these representations remains subject to debate. The present study addressed this issue by examining how different brain areas involved in language processing responded to a manipulation of bottom-up (level of visibility) and top-down information (task demands) applied to written words. The analyses showed that the same brain areas were activated in response to written words whether the task was symbol detection, rime detection, or semantic judgment. This network included posterior, temporal and prefrontal regions, which clearly suggests the involvement of orthographic, semantic and phonological/articulatory processing in all tasks. However, we also found interactions between task and stimulus visibility, which reflected the fact that the strength of the neural responses to written words in several high-level language areas varied across tasks. Together, our findings suggest that the involvement of phonological and semantic processing in reading is supported by two complementary mechanisms. First, an automatic mechanism that results from a task-independent spread of activation throughout a network in which orthography is linked to phonology and semantics. Second, a mechanism that further fine-tunes the sensitivity of high-level language areas to the sensory input in a task-dependent manner. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Can Automatic Classification Help to Increase Accuracy in Data Collection?

    Directory of Open Access Journals (Sweden)

    Frederique Lang

    2016-09-01

    Full Text Available Purpose: The authors aim to test the performance of a set of machine learning algorithms that could improve the process of data cleaning when building datasets. Design/methodology/approach: The paper is centered on cleaning datasets gathered from publishers and online resources by the use of specific keywords. In this case, we analyzed data from the Web of Science. The accuracy of various forms of automatic classification was tested here in comparison with manual coding in order to determine their usefulness for data collection and cleaning. We assessed the performance of seven supervised classification algorithms (Support Vector Machine (SVM), Scaled Linear Discriminant Analysis, Lasso and elastic-net regularized generalized linear models, Maximum Entropy, Regression Tree, Boosting, and Random Forest) and analyzed two properties: accuracy and recall. We assessed not only each algorithm individually, but also their combinations through a voting scheme. We also tested the performance of these algorithms with different sizes of training data. When assessing the performance of different combinations, we used an indicator of coverage to account for the agreement and disagreement on classification between algorithms. Findings: We found that the performance of the algorithms used varies with the size of the sample for training. However, for the classification exercise in this paper the best performing algorithms were SVM and Boosting. The combination of these two algorithms achieved a high agreement on coverage and was highly accurate. This combination performs well with a small training dataset (10%), which may reduce the manual work needed for classification tasks. Research limitations: The dataset gathered has significantly more records related to the topic of interest compared to unrelated topics. This may affect the performance of some algorithms, especially in their identification of unrelated papers. Practical implications: Although the
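
    A compact sketch of the winning combination (SVM plus boosting behind a soft-voting scheme), run on synthetic data standing in for labelled bibliographic records; the feature extraction from titles and abstracts, and the coverage indicator, are omitted.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for vectorized Web of Science records (related vs. unrelated)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

vote = VotingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("boost", GradientBoostingClassifier(random_state=0))],
    voting="soft")
print("held-out accuracy:", round(vote.fit(X_tr, y_tr).score(X_te, y_te), 3))
```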

  18. Current position on software for the automatic data acquisition system

    International Nuclear Information System (INIS)

    1988-01-01

    This report describes the current concepts for software to control the operation of the Automatic Data Acquisition System (ADAS) proposed for the Deaf Smith County, Texas, Exploratory Shaft Facility (ESF). The purpose of this report is to provide conceptual details of how the ADAS software will execute the data acquisition function, and how the software will make collected information available to the test personnel, the Data Management Group (DMG), and other authorized users. It is not intended that this report describe all of the ADAS functions in exact detail, but the concepts included herein will form the basis for the formal ADAS functional requirements definition document. 5 refs., 14 figs

  19. Automatic rebuilding and optimization of crystallographic structures in the Protein Data Bank.

    NARCIS (Netherlands)

    Joosten, R.P.; Joosten, K.; Cohen, S.X.; Vriend, G.; Perrakis, A.

    2011-01-01

    MOTIVATION: Macromolecular crystal structures in the Protein Data Bank (PDB) are a key source of structural insight into biological processes. These structures, some >30 years old, were constructed with methods of their era. With PDB_REDO, we aim to automatically optimize these structures to better

  20. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
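
    ExcelAutomat itself is a VBA workbook; purely as an illustration of the same parse-and-compile step outside a spreadsheet macro, the following hypothetical Python sketch collects the final "SCF Done" energy from Gaussian-style .log files into a workbook with openpyxl (the file layout and regular expression are assumptions, not part of the tool):

    ```python
    # Hypothetical sketch: compile final SCF energies from Gaussian-style .log files
    # into a spreadsheet with openpyxl; pattern and file layout are assumptions.
    import glob
    import re
    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active
    ws.append(["filename", "scf_energy_hartree"])

    for path in sorted(glob.glob("*.log")):             # hypothetical output files
        energy = None
        with open(path, errors="ignore") as fh:
            for line in fh:
                m = re.search(r"SCF Done:\s+E\(\S+\)\s*=\s*(-?\d+\.\d+)", line)
                if m:
                    energy = float(m.group(1))           # keep the last occurrence
        ws.append([path, energy])

    wb.save("compiled_energies.xlsx")
    ```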

  2. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    Pichara, Karim; Protopapas, Pavlos

    2013-01-01

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same
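
    The record gives no implementation; as a much simpler analogue of classifying catalog objects whose feature vectors contain missing values (plain imputation plus a random forest, not the paper's Bayesian-network inference), one could sketch:

    ```python
    # Simplified analogue: impute missing catalog features, then classify.
    # (The paper itself uses Bayesian networks; this is not that method.)
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score
    from sklearn.datasets import make_classification

    # Synthetic stand-in for a variable-star catalog with missing entries.
    X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
    rng = np.random.default_rng(0)
    X[rng.random(X.shape) < 0.2] = np.nan                # ~20% of the values missing

    model = make_pipeline(IterativeImputer(random_state=0),
                          RandomForestClassifier(random_state=0))
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
    ```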

  3. Using Dual-Task Methodology to Dissociate Automatic from Nonautomatic Processes Involved in Artificial Grammar Learning

    Science.gov (United States)

    Hendricks, Michelle A.; Conway, Christopher M.; Kellogg, Ronald T.

    2013-01-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and…

  4. Process and equipment for automatic measurement of resonant frequencies in seismic detectors

    International Nuclear Information System (INIS)

    Fredriksson, O.A.; Thomas, E.L.

    1977-01-01

    This is a process for the automatic indication of the resonant frequency of one or more detector elements operating inside a geophysical data-gathering system. The detector elements are to be understood as geophones, hydrophones, or groups of both instruments. The invention concerns the creation of a process and of equipment that work with laboratory precision, although they can be used in the field. (orig./RW) [de

  5. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP), (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  6. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP), (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different case studies regarding the analysis of
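
    As a schematic example of the kind of SNP-versus-drug-response association such a tool tests (the counts are made up, not DMET-Analyzer's actual routine), a Fisher exact test on a 2x2 contingency table looks like this:

    ```python
    # Schematic SNP-vs-drug-response association test; the counts are invented.
    from scipy.stats import fisher_exact

    #                      responders  non-responders
    table = [[18, 7],      # SNP variant present
             [11, 24]]     # SNP variant absent

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
    ```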

  7. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards the monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis and also other statistical data about malicious activity are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for the analysis of the gathered data.
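
    A minimal sketch of the centralized aggregation step, assuming each honeypot delivers plain-text log lines of the form timestamp;source_ip;attack_type (the directory layout and line format are hypothetical, not the system described above):

    ```python
    # Aggregate per-attack-type counts from honeypot logs pulled to a central store;
    # the "/var/honeypots" layout and the ";"-separated format are hypothetical.
    import glob
    from collections import Counter

    counts = Counter()
    for path in glob.glob("/var/honeypots/*/attacks.log"):
        with open(path) as fh:
            for line in fh:
                try:
                    _timestamp, _source_ip, attack_type = line.strip().split(";")
                except ValueError:
                    continue                              # skip malformed lines
                counts[attack_type] += 1

    for attack_type, n in counts.most_common():
        print(f"{attack_type}: {n}")
    ```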

  8. Automatic Synthesis and Deployment of Intensional Kahn Process Networks

    Science.gov (United States)

    Peralta, Manuel; Mukhopadhyay, Supratik; Bharadwaj, Ramesh

    In this paper we introduce and study, theoretically, a clean slate "formal" foundational approach for developing and deploying high-assurance distributed embedded systems in mission-critical applications. We propose a simple formal distributed asynchronous framework extending Kahn Process Networks with intensional specification. More precisely, we present a model-driven approach based on a platform-independent language and an intensional specification logic that allows us to synthesize distributed agents that can handle interactions with external resources asynchronously, ensure enforcement of information flow and security policies, and have the ability to deal with failures of resources. Our approach allows rapid development and automated deployment of formally verified embedded networked systems that provide guarantees that clients' requirements will be met and QoS guarantees will be respected. Moreover, it allows modeling (and programming) reliable distributed systems for multi-core hosts. Such a capability makes our framework suitable for next generation grid computing systems where multi-core individual hosts need to be utilized for improving scalability. Given an intensional logical specification of a distributed embedded system, that includes Quality of Service (QoS) requirements, a set of software resources and devices available in a network, and their formal interface specifications, a deductive system can automatically generate distributed extended Kahn processes and their deployment information in such a way that the application requirements - including QoS requirements - are guaranteed to be met. The generated processes use the inputs of the sensors/meters/probes and the management policies of the customer to generate real-time control decisions for managing the system. The processes are deployed automatically on a distributed network involving sensors/meters/probes tracking system parameters, actuators controlling devices, and diverse computing

  9. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows control strategies for supplying drying air under specified conditions of temperature and humidity to be created and improved. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server that allows direct communication between the control unit and the computer used to build the experimental curves.
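
    Purely as an illustration of the control idea (not the laboratory's actual controller), an on-off humidity loop with hysteresis and simple CSV logging could be sketched as follows; read_humidity and set_humidifier are placeholders for the real instrumentation:

    ```python
    # Illustrative on-off humidity control with hysteresis and CSV logging.
    # read_humidity() and set_humidifier() stand in for the real instrumentation.
    import csv
    import random
    import time

    def read_humidity():                  # placeholder for the sensor reading (% RH)
        return 55.0 + random.uniform(-10, 10)

    def set_humidifier(on):               # placeholder for the actuator command
        pass

    SETPOINT, BAND = 60.0, 2.0            # target % RH and hysteresis half-band (assumed)

    with open("humidity_log.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["timestamp", "humidity_percent", "humidifier_on"])
        on = False
        for _ in range(10):               # a few cycles; a real loop runs continuously
            rh = read_humidity()
            if rh < SETPOINT - BAND:
                on = True                  # too dry: switch the humidifier on
            elif rh > SETPOINT + BAND:
                on = False                 # too humid: switch it off
            set_humidifier(on)
            writer.writerow([time.time(), rh, on])
            time.sleep(0.1)                # stands in for the acquisition period
    ```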

  10. Automatic Record of IGO Game by Image Processing

    Science.gov (United States)

    Fukuyama, Tadao; Ogisu, Takahiro; Kim, Jim Woo; Kozo, Okazaki

    Although an IGO game record is a valuable work, producing one takes a great deal of laborious manual work. In this paper, we propose an automatic recording system based on image processing. First, we control the camera shutter by detecting the brightness changes caused by the motion of the player's hand, and capture the n-th image in sequence. The contrast among white stones, black stones and the board depends on the brightness, which is affected by shadows, etc. To cope with this, we use a subtraction image between the board image and the n-th image. The stones are not always placed on the centers of the intersection points and often drift when they are touched by other stones or by the hand, light reflects off their surfaces, and the brightness of the image often changes. Therefore, we estimate the intersection points of the highlighted and distorted board. Stone recognition is based on template matching using RMS errors and correlations, and the results are classified automatically by applying the K-means clustering algorithm. We show the effectiveness of the proposed algorithm through experiments.
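
    A condensed sketch of the matching-and-clustering idea (not the authors' implementation; the patch size and the two summary features are assumptions): patches sampled around the estimated intersection points are summarized, and K-means separates them into black stone, white stone and empty point.

    ```python
    # Illustrative stone classification: sample a patch around each estimated board
    # intersection, summarize it, and let K-means separate black / white / empty.
    import numpy as np
    from sklearn.cluster import KMeans

    def classify_intersections(gray_image, points, half=10):
        """gray_image: 2-D array; points: list of (row, col) intersection estimates."""
        features = []
        for r, c in points:
            patch = gray_image[r - half:r + half, c - half:c + half].astype(float)
            features.append([patch.mean(), patch.std()])   # brightness + texture cue
        features = np.array(features)

        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
        # Order the clusters by mean brightness: darkest = black, brightest = white.
        order = np.argsort([features[labels == k, 0].mean() for k in range(3)])
        names = {order[0]: "black", order[1]: "empty", order[2]: "white"}
        return [names[k] for k in labels]
    ```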

  11. Automatic solar feature detection using image processing and pattern recognition techniques

    Science.gov (United States)

    Qu, Ming

    The objective of the research in this dissertation is to develop a software system to automatically detect and characterize solar flares, filaments and Corona Mass Ejections (CMEs), the core of so-called solar activity. These tools will assist us to predict space weather caused by violent solar activity. Image processing and pattern recognition techniques are applied to this system. For automatic flare detection, the advanced pattern recognition techniques such as Multi-Layer Perceptron (MLP), Radial Basis Function (RBF), and Support Vector Machine (SVM) are used. By tracking the entire process of flares, the motion properties of two-ribbon flares are derived automatically. In the applications of the solar filament detection, the Stabilized Inverse Diffusion Equation (SIDE) is used to enhance and sharpen filaments; a new method for automatic threshold selection is proposed to extract filaments from background; an SVM classifier with nine input features is used to differentiate between sunspots and filaments. Once a filament is identified, morphological thinning, pruning, and adaptive edge linking methods are applied to determine filament properties. Furthermore, a filament matching method is proposed to detect filament disappearance. The automatic detection and characterization of flares and filaments have been successfully applied on Halpha full-disk images that are continuously obtained at Big Bear Solar Observatory (BBSO). For automatically detecting and classifying CMEs, the image enhancement, segmentation, and pattern recognition techniques are applied to Large Angle Spectrometric Coronagraph (LASCO) C2 and C3 images. The processed LASCO and BBSO images are saved to file archive, and the physical properties of detected solar features such as intensity and speed are recorded in our database. Researchers are able to access the solar feature database and analyze the solar data efficiently and effectively. The detection and characterization system greatly improves

  12. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.

  13. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    Full Text Available In the field of noninvasive sensing techniques for civil infrastructures monitoring, this paper addresses the problem of crack detection, in the surface of the French national roads, by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution is about fine-defect detection in pavement surface. The approach is based on a multi-scale extraction and a Markovian segmentation. Third, an evaluation and comparison protocol which has been designed for evaluating this difficult task—the road pavement crack detection—is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.

  14. Enhancement of the automatic ultrasonic signal processing system using digital technology

    International Nuclear Information System (INIS)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S.

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system which can be used in inspection equipment to assess the integrity of the reactor vessel, by enhancing the performance of the ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of the circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment; the development of the signal processing algorithm and the H/W of the data processing system; and the development of the specification for the application programs and system S/W for the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state-of-the-art technology at home and abroad; 2) the development of the H/W and S/W of the data processing system based on these results. In particular, the H/W of the data processing system, which has the advantages of both digital and analog control through real-time digital signal processing, was developed using a DSP which can process the digital signal in real time; in addition, not only the firmware of the data processing system for the peripherals but also the specimen test algorithm for calibration was developed. The application programs and the system S/W of the analysis/evaluation computer were also developed. The developed equipment was verified by performance tests. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for the nuclear industry is expected through further studies on establishing the H/W in real applications and developing the S/W specification of the analysis computer. (author)

  15. Automatic removal of outliers in hydrologic time series and quality control of rainfall data: processing a real-time database of the Local System for Flood Monitoring in Klodzko County, Poland

    Science.gov (United States)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kryza, Maciej; Szymanowski, Mariusz

    2013-04-01

    Real-time hydrological forecasting requires the highest quality of both hydrologic and meteorological data collected in a given river basin. Large outliers may lead to inaccurate predictions, with substantial departures between observations and prognoses even in the short term. Although we need both riverflow and rainfall data to be correct, they cannot be processed in the same way to produce a filtered output. Indeed, a hydrologic time series at a given gauge can be interpolated in the time domain after suspicious values have been detected, provided that no outlier has been detected at the upstream sites. In the case of rainfall data, interpolation is not suitable, as we cannot verify potential outliers at a given site against data from other sites, especially in complex terrain. This is due to the fact that very local convective events may occur, leading to large rainfall peaks over a limited area. Hence, instead of interpolating the data, we rather perform a flagging procedure that only ranks outliers according to the likelihood of occurrence. Following the aforementioned assumptions, we have developed a few modules that serve the purpose of fully automated correction of a database that is updated in real time every 15 minutes; the main objective of the work was to produce a high-quality database for hydrologic rainfall-runoff modeling and ensemble prediction. The database in question is available courtesy of the County Office in Kłodzko (SW Poland), the institution which owns and maintains the Local System for Flood Monitoring in Kłodzko County. The dedicated prediction system, known as HydroProg, is now being built at the University of Wrocław (Poland). Like the entire prediction system, the correction modules work automatically in real time and are developed in the R language. They are plugged into a larger IT infrastructure. Hydrologic time series, which are water levels recorded every 15 minutes at 22 gauges located in Kłodzko County, are
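
    A compact illustration of the two different treatments described above, written with pandas rather than the R modules of HydroProg (the window length and thresholds are invented): water levels are despiked against a rolling median and interpolated, while rainfall values are only flagged, never altered.

    ```python
    # Illustrative treatment of 15-minute series (not the HydroProg modules):
    # water levels -> despike against a rolling median and interpolate;
    # rainfall     -> only flag suspicious values, never interpolate them.
    import numpy as np
    import pandas as pd

    def despike_water_level(series, window=9, n_sigmas=3.0):
        med = series.rolling(window, center=True, min_periods=1).median()
        mad = (series - med).abs().rolling(window, center=True, min_periods=1).median()
        spikes = (series - med).abs() > n_sigmas * 1.4826 * mad
        return series.mask(spikes).interpolate(limit_direction="both"), spikes

    def flag_rainfall(series, max_plausible=30.0):        # mm per 15 min, assumed threshold
        return series > max_plausible                      # rank-only flagging, values kept

    idx = pd.date_range("2013-04-01", periods=96, freq="15min")
    levels = pd.Series(100 + np.random.randn(96).cumsum(), index=idx)
    levels.iloc[40] += 50                                  # synthetic outlier
    clean, flags = despike_water_level(levels)
    print("flagged water-level points:", int(flags.sum()))
    ```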

  16. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    Full Text Available This article discusses the current capabilities of automated processing of image data on the example of using PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or more often on UAVs) is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of such a situation, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeferencing in an external reference frame. In the case of non-metric image application, it is also possible to carry out a self-calibration process at this stage. The image matching algorithm is also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All aforementioned processing steps are implemented in a single program, in contrast to standard commercial software dividing all steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps at predetermined control parameters. The paper presents the practical results of fully automatic generation of orthomosaics for both images obtained by a metric Vexell camera and a block of images acquired by a non-metric UAV system.

  17. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published until now. The method developed at Aalborg University uses the existing topographic database...... by means of spatial resection. This paper describes in details the mentioned procedure as it was used and implemented during tests with two data sets from Denmark. Moreover, the results from a test made with a data set from the Czech Republic are added. It brought a different view to this complex...... of problems with respect to a different landscape and quality of input data. Finally, some ideas for improving and generalising the method are suggested....

  18. Data processing made simple

    CERN Document Server

    Wooldridge, Susan

    2013-01-01

    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

  19. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  20. Automatic processing of unattended object features by functional connectivity

    Directory of Open Access Journals (Sweden)

    Katja Martina Mayer

    2013-05-01

    Full Text Available Observers can selectively attend to object features that are relevant for a task. However, unattended task-irrelevant features may still be processed and possibly integrated with the attended features. This study investigated the neural mechanisms for processing both task-relevant (attended) and task-irrelevant (unattended) object features. The Garner paradigm was adapted for functional magnetic resonance imaging (fMRI) to test whether specific brain areas process the conjunction of features or whether multiple interacting areas are involved in this form of feature integration. Observers attended to shape, colour, or non-rigid motion of novel objects while unattended features changed from trial to trial (change blocks) or remained constant (no-change blocks) during a given block. This block manipulation allowed us to measure the extent to which unattended features affected neural responses which would reflect the extent to which multiple object features are automatically processed. We did not find Garner interference at the behavioural level. However, we designed the experiment to equate performance across block types so that any fMRI results could not be due solely to differences in task difficulty between change and no-change blocks. Attention to specific features localised several areas known to be involved in object processing. No area showed larger responses on change blocks compared to no-change blocks. However, psychophysiological interaction analyses revealed that several functionally-localised areas showed significant positive interactions with areas in occipito-temporal and frontal areas that depended on block type. Overall, these findings suggest that both regional responses and functional connectivity are crucial for processing multi-featured objects.

  1. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    ... events. Due to great variation in events, this method often fails to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.
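
    The record gives no implementation details; a generic modern analogue of such an event detector (not the original network) is a small multilayer perceptron trained on labelled pressure segments, sketched here with synthetic features:

    ```python
    # Generic analogue of a neural-network event detector for manometry traces:
    # fixed-length pressure segments labelled event / non-event feed a small MLP.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import classification_report

    # Synthetic stand-in for feature vectors extracted from pressure segments.
    X, y = make_classification(n_samples=1500, n_features=40, random_state=1)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1)
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))
    ```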

  2. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems have become more and more popular because of their potential for obtaining the point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems have been frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and has major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is the aim. An approach is proposed for automatic 3D building model generation that includes the automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D
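
    The actual classification above is performed with TerraScan; purely as a much-reduced illustration of point-based classification with hierarchical rules (the grid size and thresholds are invented, not the study's values), a sketch could look like this:

    ```python
    # Greatly simplified, rule-based point labelling in the spirit of hierarchical
    # rules (grid size and thresholds are invented, not the study's values).
    import numpy as np

    def classify_points(xyz, cell=2.0, ground_tol=0.3, building_min_h=2.5, max_roughness=0.15):
        """xyz: (N, 3) array of LiDAR points; returns one label per point."""
        labels = np.empty(len(xyz), dtype=object)
        cells = np.floor(xyz[:, :2] / cell).astype(int)        # grid-cell index per point
        for key in {tuple(k) for k in cells}:
            sel = np.all(cells == key, axis=1)
            z = xyz[sel, 2]
            height = z - z.min()                               # height above the local minimum
            roughness = z.std()
            labels[sel] = np.where(height < ground_tol, "ground",
                           np.where((height > building_min_h) & (roughness < max_roughness),
                                    "building", "vegetation"))
        return labels
    ```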

  3. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems have become more and more popular because of their potential for obtaining the point clouds of vegetation and man-made objects on the earth's surface in an accurate and quick way. Nowadays, these airborne systems have been frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and has major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is the aim. An approach is proposed for automatic 3D building model generation that includes the automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve the classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified

  4. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury eShtyrov

    2013-08-01

    Full Text Available Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, as the subjects' attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  5. Automatic Weissenberg data collection system for time-resolved protein crystallography

    CERN Document Server

    Sakabe, N; Higashi, T; Igarashi, N; Suzuki, M; Watanabe, N; Sasaki, K

    2001-01-01

    A totally new type of fully automatic Weissenberg data-collection system called 'Galaxy' was developed and was installed at the Photon Factory. This automatic data collection system consists of a rotated-inclined focusing monochromator, a screenless Weissenberg type camera, an image reader, an eraser, a cassette transportation mechanism, a control console and a safety and high-speed computer network system linking a control console, data processing computers and data servers. The special characteristics of this system are a Weissenberg camera with a fully cylindrical cassette which can be rotated to exchange a frame, a maximum number of 36 images to be recorded in an IP cassette, and a very high speed IP reader with five reading heads. Since the frame exchange time is only a few seconds, this system is applicable for time-resolved protein crystallography at seconds or minutes of time-scale.

  6. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    Science.gov (United States)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic during the last few years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammar and rules. Such segmented data can be extracted e.g. from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, which is a CAD data file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the different segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.

  7. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    Full Text Available The watershed, as the basic territorial unit for the planning and management of water resources, requires proper delimitation of its catchment or drainage area. Faced with this situation, the lack of geographic information on the micro watersheds of the Casacay river hydrographic unit needed to be resolved. For this purpose, the research was aimed at the automatic delimitation of micro watersheds using Geographic Information Systems (GIS) techniques and data from NASA's Shuttle Radar Topographic Mission (SRTM) project at 30 meters spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro watersheds were obtained with their respective codification, allowing continuation of the watershed standardization adopted by Ecuador's Water Secretariat. With the results of the investigation, the watersheds will be updated with more detailed information, promoting the execution of tasks or activities related to the integrated management of the hydrographic unit studied.

  8. Automatic processing of CERN video, audio and photo archives

    International Nuclear Information System (INIS)

    Kwiatek, M

    2008-01-01

    The digitalization of CERN audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to provide long-time coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can be easily configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers with streaming implemented using Windows Media Services

  9. Data process of liquid scintillation counting

    International Nuclear Information System (INIS)

    Ishikawa, Hiroaki; Kuwajima, Susumu.

    1975-01-01

    The use of liquid scintillation counting systems has spread significantly because automatic sample changers and printers have recently come to be incorporated. However, the system will only be fully systematized once automatic data processing and the preparation of the radioactive samples to be measured are realized. Dry or wet oxidation methods are applied in sample preparation when the radioactive materials are hard to dissolve into the scintillator solution. Over the last several years, the automatic sample combustion system, in which dry oxidation is automated, has spread rapidly and contributes greatly to labor saving. Since the printers generally indicate only the counted number, a data processing system has been developed which speeds up the calculation process and automatically corrects for sample quenching in order to obtain the final radioactivity required. Data processing systems are roughly divided into on-line and off-line systems according to whether the computers are connected directly or indirectly, while the hardware is classified into input, calculating and output devices. Also, the calculation to determine sample activity by the external standard method is explained. (Wakatsuki, Y.)
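
    The external-standard calculation mentioned at the end can be illustrated schematically (the calibration numbers below are invented, not the system's actual routine): the external-standard channel ratio is mapped through a quench calibration curve to a counting efficiency, and the activity is the net count rate divided by that efficiency.

    ```python
    # Schematic external-standard quench correction; calibration values are invented.
    import numpy as np

    # Quench calibration curve: external-standard channel ratio -> counting efficiency.
    escr_calib = np.array([0.40, 0.55, 0.70, 0.85, 1.00])
    eff_calib  = np.array([0.30, 0.52, 0.70, 0.84, 0.93])

    def activity_bq(gross_cpm, background_cpm, escr):
        efficiency = np.interp(escr, escr_calib, eff_calib)   # look up the quench curve
        net_cps = (gross_cpm - background_cpm) / 60.0
        return net_cps / efficiency                           # disintegrations per second

    print(f"{activity_bq(gross_cpm=12500, background_cpm=25, escr=0.62):.1f} Bq")
    ```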

  10. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, in the sense of the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our lives better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in the effective management of dynamic and large-scale data, and the efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have made it possible to manage data monitoring, data processing and decision-making in a realistic and effective way. Considering the big size of data, variety of data and frequent chan...

  11. Automatic Extraction of Road Markings from Mobile Laser Scanning Data

    Science.gov (United States)

    Ma, H.; Pei, Z.; Wei, Z.; Zhong, R.

    2017-09-01

    Road markings, as critical features in the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, have important functions in providing guidance and information to moving cars. A mobile laser scanning (MLS) system is an effective way to obtain the 3D information of the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point clouds using a neighborhood elevation consistency method. The basic assumption of the method is that the road surface is smooth: points with a small elevation difference within their neighborhood are considered to be ground points. Then the ground points are partitioned into a set of profiles according to trajectory data. The intensity histogram of the points in each profile is generated to find intensity jumps above a certain threshold, which is inversely related to the laser distance. The separated points are used as seed points for region growing based on intensity so as to obtain complete road markings. We use a point cloud template-matching method to refine the road marking candidates by removing noise clusters with a low correlation coefficient. In an experiment with an MLS point set of about 2 kilometres in a city center, our method provides a promising solution to road markings extraction from MLS data.
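
    The ground-filtering and intensity-thresholding steps can be caricatured in a few lines (real MLS processing is far more involved; the cell size and thresholds below are invented, not taken from the paper):

    ```python
    # Toy version of the ground-filtering and intensity-thresholding steps
    # (cell size and thresholds are invented, not taken from the paper).
    import numpy as np

    def marking_seeds(points, intensity, cell=0.5, dz_max=0.05, intensity_jump=0.2):
        """points: (N, 3) array; intensity: (N,) array scaled to [0, 1]."""
        cells = np.floor(points[:, :2] / cell).astype(int)
        ground = np.zeros(len(points), dtype=bool)
        for key in {tuple(k) for k in cells}:
            sel = np.all(cells == key, axis=1)
            z = points[sel, 2]
            # elevation-consistency test: the road surface is locally smooth
            ground[sel] = (z - np.median(z)) < dz_max
        # road markings are retro-reflective: intensity jump above the background
        background = np.median(intensity[ground])
        return ground & (intensity > background + intensity_jump)
    ```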

  12. AUTOMATIC EXTRACTION OF ROAD MARKINGS FROM MOBILE LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    H. Ma

    2017-09-01

    Full Text Available Road markings, as critical features in the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, have important functions in providing guidance and information to moving cars. A mobile laser scanning (MLS) system is an effective way to obtain the 3D information of the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point clouds using a neighborhood elevation consistency method. The basic assumption of the method is that the road surface is smooth: points with a small elevation difference within their neighborhood are considered to be ground points. Then the ground points are partitioned into a set of profiles according to trajectory data. The intensity histogram of the points in each profile is generated to find intensity jumps above a certain threshold, which is inversely related to the laser distance. The separated points are used as seed points for region growing based on intensity so as to obtain complete road markings. We use a point cloud template-matching method to refine the road marking candidates by removing noise clusters with a low correlation coefficient. In an experiment with an MLS point set of about 2 kilometres in a city center, our method provides a promising solution to road markings extraction from MLS data.

  13. Measuring Service Reliability Using Automatic Vehicle Location Data

    Directory of Open Access Journals (Sweden)

    Zhenliang Ma

    2014-01-01

    Full Text Available Bus service reliability has become a major concern for both operators and passengers. Buffer time measures are believed to be appropriate to approximate passengers' experienced reliability in the context of departure planning. Two issues with regard to buffer time estimation are addressed, namely, performance disaggregation and capturing passengers' perspectives on reliability. A Gaussian mixture models based method is applied to disaggregate the performance data. Based on the mixture model distribution, a reliability buffer time (RBT) measure is proposed from the passengers' perspective. A set of expected reliability buffer time measures is developed for operators by using combinations of RBTs at different spatial-temporal levels. The average and latest trip duration measures are proposed for passengers, which can be used to choose a service mode and determine the departure time. Using empirical data from the automatic vehicle location system in Brisbane, Australia, the existence of mixture service states is verified and the advantage of the mixture distribution model in fitting travel time profiles is demonstrated. Numerical experiments validate that the proposed reliability measure is capable of quantifying service reliability consistently, while the conventional ones may provide inconsistent results. Potential applications for operators and passengers are also illustrated, including reliability improvement and trip planning.
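
    A compact illustration of the mixture-based idea (the component count and the percentiles are assumptions here, not the paper's exact definition): fit a Gaussian mixture to observed travel times and take the buffer as the gap between a high percentile and the median of the fitted distribution.

    ```python
    # Illustrative reliability-buffer-time computation from AVL travel times:
    # fit a 2-component Gaussian mixture and report p95 - p50 of the fitted model.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    travel_times = np.concatenate([rng.normal(22, 2, 800),     # recurrent service state
                                   rng.normal(35, 5, 200)])    # disrupted service state

    gmm = GaussianMixture(n_components=2, random_state=0).fit(travel_times.reshape(-1, 1))
    samples = gmm.sample(100_000)[0].ravel()

    rbt = np.percentile(samples, 95) - np.percentile(samples, 50)
    print(f"reliability buffer time: {rbt:.1f} minutes")
    ```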

  14. Data processing on FPGAs

    CERN Document Server

    Teubner, Jens

    2013-01-01

    Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances-specialized data processing engines, where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) incr

  15. Lateralized automatic auditory processing of phonetic versus musical information: a PET study.

    Science.gov (United States)

    Tervaniemi, M; Medvedev, S V; Alho, K; Pakhomov, S V; Roudas, M S; Van Zuijen, T L; Näätänen, R

    2000-06-01

    Previous positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) studies show that during attentive listening, processing of phonetic information is associated with higher activity in the left auditory cortex than in the right auditory cortex while the opposite is true for musical information. The present PET study determined whether automatically activated neural mechanisms for phonetic and musical information are lateralized. To this end, subjects engaged in a visual word classification task were presented with phonetic sound sequences consisting of frequent (P = 0.8) and infrequent (P = 0.2) phonemes and with musical sound sequences consisting of frequent (P = 0.8) and infrequent (P = 0.2) chords. The phonemes and chords were matched in spectral complexity as well as in the magnitude of frequency difference between the frequent and infrequent sounds (/e/ vs. /o/; A major vs. A minor). In addition, control sequences, consisting of either frequent (/e/; A major) or infrequent sounds (/o/; A minor) were employed in separate blocks. When sound sequences consisted of intermixed frequent and infrequent sounds, automatic phonetic processing was lateralized to the left hemisphere and musical to the right hemisphere. This lateralization, however, did not occur in control blocks with one type of sound (frequent or infrequent). The data thus indicate that automatic activation of lateralized neuronal circuits requires sound comparison based on short-term sound representations.

  16. Using suggestion to modulate automatic processes: from Stroop to McGurk and beyond.

    Science.gov (United States)

    Lifshitz, Michael; Aubert Bonn, Noémie; Fischer, Alexandra; Kashem, Irene Farah; Raz, Amir

    2013-02-01

    Cognitive scientists typically classify cognitive processes as either controlled or automatic. Whereas controlled processes are slow and effortful, automatic processes are fast and involuntary. Over the past decade, we have propelled a research trajectory investigating how top-down influence in the form of suggestion can allow individuals to modulate the automaticity of cognitive processes. Here we present an overarching array of converging findings that collectively indicate that certain individuals can derail involuntary processes, such as reading, by "unringing" the proverbial bell. We examine replications of these effects from both our own laboratory and independent groups, and extend our Stroop findings to several other well-established automatic paradigms, including the McGurk effect. We thus demonstrate how, in the case of highly suggestible individuals, suggestion seems to wield control over a process that is likely even more automatic than the Stroop effect. Finally, we present findings from two novel experimental paradigms exploring the potential of shifting automaticity in the opposite direction - i.e., transforming, without practice, a controlled task into one that is automatic. Drawing on related evidence from the neuroscience of contemplative practices, we discuss how these findings pave the road to a more scientific understanding of voluntary control and automaticity, and expound on their possible experimental and therapeutic applications. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Automatic Indoor Building Reconstruction from Mobile Laser Scanning Data

    Science.gov (United States)

    Xie, L.; Wang, R.

    2017-09-01

    Indoor reconstruction from point clouds is a hot topic in photogrammetry, computer vision and computer graphics. Reconstructing indoor scenes from point clouds is challenging due to complex room floorplans and line-of-sight occlusions. Most existing methods deal with stationary terrestrial laser scanning point clouds or RGB-D point clouds. In this paper, we propose an automatic method for reconstructing indoor 3D building models from mobile laser scanning point clouds. The method includes 2D floorplan generation, 3D building modeling, door detection and room segmentation. The main idea behind our approach is to separate the wall structure into two types, inner walls and outer walls, based on the observed point distribution. We then use a graph-cut-based optimization method to solve the labeling problem and generate the 2D floorplan from the optimization result. Subsequently, we apply an α-shape-based method to detect the doors on the 2D projected point clouds and use the floorplan to segment the individual rooms. The experiments show that this door detection method achieves a recognition rate of 97% and that the room segmentation method attains correct segmentation results. We also evaluate the reconstruction accuracy on synthetic data, which indicates that the accuracy of our method is comparable to the state of the art.

  18. Automatic neural processing of disorder-related stimuli in Social Anxiety Disorder (SAD): Faces and more

    Directory of Open Access Journals (Sweden)

    Claudia eSchulz

    2013-05-01

    It has been proposed that social anxiety disorder (SAD) is associated with automatic information processing biases resulting in hypersensitivity to signals of social threat such as negative facial expressions. However, the nature and extent of automatic processes in SAD at the behavioral and neural level are not entirely clear yet. The present review summarizes neuroscientific findings on automatic processing of facial threat as well as other disorder-related stimuli, such as emotional prosody or negative words, in SAD. We review initial evidence for automatic activation of the amygdala, insula, and sensory cortices, as well as for automatic early electrophysiological components. However, findings vary depending on tasks, stimuli, and neuroscientific methods. Only a few studies set out to examine automatic neural processes directly, and systematic attempts are as yet lacking. We suggest that future studies should (1) use different stimulus modalities, (2) examine different emotional expressions, (3) compare findings in SAD with other anxiety disorders, (4) use more sophisticated experimental designs to investigate features of automaticity systematically, and (5) combine different neuroscientific methods (such as functional neuroimaging and electrophysiology). Finally, an understanding of automatic neural processes could also provide hints for therapeutic approaches.

  19. The Customized Automatic Processing Framework for HY-2A Satellite Marine Advanced Products

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2016-12-01

    HY-2A, as the first Chinese ocean dynamic environment satellite, provides an effective and efficient way of observing ocean properties. However, in the operational stage, some inconveniences of the existing ground application system have appeared. Based on a review of users’ requirements for data services, the Customized Automatic Processing Framework (CAPF) for HY-2A advanced products is proposed and has been developed. As an extension of the existing ground application system, the framework provides interfaces for adding customized algorithms, designing on-demand processing workflows, and scheduling the processing procedures. With the customized processing templates, the framework allows users to easily process the products according to their own expectations, which facilitates the usage of HY-2A satellite advanced products.

  20. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open source, physiological signal processing program, called PhysioNoise, in the python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.

  1. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic B-scan images obtained in non-destructive testing of welds. The goal is to define what image processing techniques can contribute to improve the exploitation of the collected data and, more precisely, how image processing can extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specific features of ultrasonic wave propagation in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed. It is based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new complete and automatic method has been established to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been carried out with standard test pieces.
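
    The co-occurrence-matrix idea above lends itself to a compact illustration. The Python sketch below builds a grey-level co-occurrence matrix for a B-scan-like NumPy array and derives an echo/noise threshold from it; the array sizes, the number of grey levels and the thresholding rule are illustrative assumptions for the example, not the criterion actually derived in the report.

      import numpy as np

      def cooccurrence_matrix(image, levels=32, dx=1, dy=0):
          """Count how often grey level i is followed by grey level j
          at the pixel offset (dx, dy)."""
          # Quantize the image to a small number of grey levels.
          q = (image.astype(np.float64) / image.max() * (levels - 1)).astype(int)
          mat = np.zeros((levels, levels), dtype=np.int64)
          h, w = q.shape
          src = q[0:h - dy, 0:w - dx]
          dst = q[dy:h, dx:w]
          np.add.at(mat, (src.ravel(), dst.ravel()), 1)
          return mat

      def noise_threshold(image, levels=32):
          """Illustrative rule: put the echo/noise threshold just above the grey
          level that dominates the matrix diagonal (low-amplitude noise tends
          to co-occur with itself)."""
          mat = cooccurrence_matrix(image, levels)
          noise_level = int(np.argmax(np.diag(mat)))
          return (noise_level + 1) / levels * image.max()

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          bscan = rng.integers(0, 40, size=(128, 512))   # background noise
          bscan[60:70, 200:260] += 180                    # a synthetic echo
          t = noise_threshold(bscan)
          print("threshold:", t, "echo pixels:", int((bscan > t).sum()))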

  2. Automatic system of production, transfer and processing of coin targets for the production of metallic radioisotopes

    Science.gov (United States)

    Pellicioli, M.; Ouadi, A.; Marchand, P.; Foehrenbacher, T.; Schuler, J.; Dick-Schuler, N.; Brasse, D.

    2017-05-01

    The work presented in this paper gathers three main technical developments aiming at 1) optimizing nuclide production by means of solid targets, 2) automatically transferring coin targets from the vault to the hot cell without human intervention, and 3) automatically processing target dilution and purification in the hot cell. This system has been installed on an ACSI TR24 cyclotron in Strasbourg, France.

  3. The Masked Semantic Priming Effect Is Task Dependent: Reconsidering the Automatic Spreading Activation Process

    Science.gov (United States)

    de Wit, Bianca; Kinoshita, Sachiko

    2015-01-01

    Semantic priming effects are popularly explained in terms of an automatic spreading activation process, according to which the activation of a node in a semantic network spreads automatically to interconnected nodes, preactivating a semantically related word. It is expected from this account that semantic priming effects should be routinely…

  4. Image Processing Method for Automatic Discrimination of Hoverfly Species

    Directory of Open Access Journals (Sweden)

    Vladimir Crnojević

    2014-01-01

    An approach to automatic hoverfly species discrimination based on detection and extraction of vein junctions in the wing venation patterns of insects is presented in the paper. The dataset used in our experiments consists of high-resolution microscopic wing images of several hoverfly species collected over a relatively long period of time at different geographic locations. Junctions are detected using a combination of the well-known HOG (histograms of oriented gradients) and a robust version of the recently proposed CLBP (complete local binary pattern). These features are used to train an SVM classifier to detect junctions in wing images. Once the junctions are identified, they are used to extract statistics characterizing the constellations of these points. Such simple features can be used to automatically discriminate four selected hoverfly species with a polynomial-kernel SVM and achieve high classification accuracy.
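
    As a rough illustration of the junction-detection stage, the Python sketch below extracts HOG descriptors from small image patches and trains a polynomial-kernel SVM to label them as junction or background. The patch size, the alternating labels and the random stand-in images are assumptions made for the example, and the CLBP descriptor used in the paper is omitted.

      import numpy as np
      from skimage.feature import hog
      from sklearn.svm import SVC

      PATCH = 32  # hypothetical patch size around a candidate junction

      def patch_features(patch):
          """HOG descriptor of a grey-level wing-image patch."""
          return hog(patch, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2))

      def train_junction_detector(patches, labels):
          """Polynomial-kernel SVM separating junction from background patches."""
          X = np.array([patch_features(p) for p in patches])
          return SVC(kernel="poly", degree=3).fit(X, labels)

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          # Stand-in data: real patches would be cut from wing photographs.
          patches = [rng.random((PATCH, PATCH)) for _ in range(40)]
          labels = np.array([i % 2 for i in range(40)])  # 1 = junction, 0 = background
          detector = train_junction_detector(patches, labels)
          print(detector.predict([patch_features(patches[0])]))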

  5. A dual growing method for the automatic extraction of individual trees from mobile laser scanning data

    Science.gov (United States)

    Li, Lin; Li, Dalin; Zhu, Haihong; Li, You

    2016-10-01

    Street trees interlaced with other objects in cluttered point clouds of urban scenes inhibit the automatic extraction of individual trees. This paper proposes a method for the automatic extraction of individual trees from mobile laser scanning data, according to the general constitution of trees. Two components of each individual tree, a trunk and a crown, can be extracted by the dual growing method. This method consists of a coarse classification, through which most artifacts are removed; the automatic selection of appropriate seeds for individual trees, by which the common manual initial setting is avoided; a dual growing process that separates one tree from others by circumscribing a trunk within an adaptive growing radius and segmenting a crown in constrained growing regions; and a refining process that separates a single trunk from the other interlaced objects. The method is verified on two datasets with over 98% completeness and over 96% correctness. The low mean absolute percentage errors in capturing the morphological parameters of individual trees indicate that this method can output individual trees with high precision.

  6. Image processing techniques for remote sensing data

    Digital Repository Service at National Institute of Oceanography (India)

    RameshKumar, M.R.

    Digital image processing is used for the improvement of pictorial information for human interpretation and for processing of scene data for autonomous machine perception. The techniques of digital image processing are used for automatic character/pattern recognition, industrial robots for product assembly and inspection, military reconnaissance...

  7. Online data processing system

    International Nuclear Information System (INIS)

    Nakahara, Yoshinori; Yagi, Hideyuki; Yamada, Takayuki

    1979-02-01

    A pulse height analyzer terminal system, PHATS, has been developed for online data processing via the JAERI-TOKAI computer network. The system is controlled by a micro-computer, MICRO-8, which was developed for the JAERI-TOKAI network. The system program consists of two subprograms, the online control system ONLCS and the pulse height analyzer control system PHACS. ONLCS links the terminal with the conversational programming system of the FACOM 230/75 through the JAERI-TOKAI network and controls data processing in TSS and remote batch modes. PHACS is used to control the input/output of data between the pulse height analyzer and cassette-MT or typewriter. This report describes the hardware configuration and the system program in detail. The appendix explains the real-time monitor, the message types, and the PEX-to-PEX and Host-to-Host protocols required for the system programming. (author)

  8. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
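
    The idea of generating randomized but constrained data per student, then grading the resulting regression, can be sketched in a few lines of Python. The parameter ranges, noise level and the 3.3*s/slope detection-limit rule of thumb below are assumptions for illustration and are not the Goodle GMS implementation.

      import numpy as np
      from scipy import stats

      def make_student_dataset(seed, n=8, conc_max=10.0, noise=0.02):
          """Generate one student's calibration set (concentration vs. signal)
          with a randomized slope and intercept drawn within simple constraints."""
          rng = np.random.default_rng(seed)
          slope = rng.uniform(0.05, 0.15)       # hypothetical sensitivity range
          intercept = rng.uniform(0.0, 0.05)    # hypothetical blank-signal range
          conc = np.linspace(0.0, conc_max, n)
          signal = slope * conc + intercept + rng.normal(0.0, noise, n)
          return conc, signal

      def quality_parameters(conc, signal):
          """Least-squares calibration line plus basic figures of merit."""
          fit = stats.linregress(conc, signal)
          residuals = signal - (fit.slope * conc + fit.intercept)
          s_res = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))
          return {"slope": fit.slope, "intercept": fit.intercept,
                  "r2": fit.rvalue ** 2,
                  "LOD": 3.3 * s_res / fit.slope}   # common rule-of-thumb limit

      if __name__ == "__main__":
          conc, signal = make_student_dataset(seed=42)
          print(quality_parameters(conc, signal))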

  9. Intelligent radar data processing

    Science.gov (United States)

    Holzbaur, Ulrich D.

    The application of artificial intelligence principles to the processing of radar signals is considered theoretically. The main capabilities required are learning and adaptation in a changing environment, processing and modeling information (especially dynamics and uncertainty), and decision-making based on all available information (taking its reliability into account). For the application to combat-aircraft radar systems, the tasks include the combination of data from different types of sensors, reacting to electronic counter-countermeasures, evaluation of how much data should be acquired (energy and radiation management), control of the radar, tracking, and identification. Also discussed are related uses such as monitoring the avionics systems, supporting pilot decisions with respect to the radar system, and general applications in radar-system R&D.

  10. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Modern information and communication technologies together with intelligent devices provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data does not represent a challenge nowadays, but issues remain in relation to storing and processing increasing amounts of data. In this paper we have investigated the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to a volume of information resources, and to their velocity and variety, far beyond the capabilities of commonly used software for storing, processing and managing data. In our case study, the Apache™ Hadoop® Big Data platform was used for processing data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load calculated on the Big Data platform were presented using tables and graphs in the Microsoft Office Excel tool. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in the monitoring of traffic management systems.

  11. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and the Self-Organizing Map (SOM) clustering algorithm. We present a technique for the automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustworthy and robust early failure detection systems. (author)

  12. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Directory of Open Access Journals (Sweden)

    Boris Kotchoubey

    2012-08-01

    This study compared automatic and controlled cognitive processes that underlie event-related potential (ERP) effects during speech perception. Sentences were presented to native French speakers, and the final word could be congruent or incongruent and was presented at one of four levels of degradation (using modulation with pink noise): no degradation, mild degradation (2 levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent words minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from the results of experiments that used word-pair or word-list paradigms.

  13. Automatic tracking of wake vortices using ground-wind sensor data

    Science.gov (United States)

    1977-01-03

    Algorithms for automatic tracking of wake vortices using ground-wind anemometer data are developed. Methods of bad-data suppression, track initiation, and track termination are included. An effective sensor-failure detection and identification ...

  14. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  15. Semi-Automatic Selection of Ground Control Points for High Resolution Remote Sensing Data in Urban Areas

    Directory of Open Access Journals (Sweden)

    Gulbe Linda

    2016-12-01

    The geometrical accuracy of remote sensing data is often ensured by geometrical transforms based on Ground Control Points (GCPs). Manual selection of GCPs is a time-consuming process, which requires some sort of automation. Therefore, the aim of this study is to present and evaluate a methodology for easier, semi-automatic selection of ground control points for urban areas. A custom line scanning algorithm was implemented and applied to the data in order to extract potential GCPs for an image analyst. The proposed method was tested for classical orthorectification and a special object polygon transform. Results are convincing and show that in the test case the semi-automatic methodology is able to correct the locations of 70 % (thermal data) to 80 % (orthophoto images) of buildings. A geometrical transform for subimages of approximately 3 hectares with approximately 12 automatically found GCPs resulted in an RMSE of approximately 1 meter with a standard deviation of 1.2 meters.

  16. An Automatic Framework Using Space-Time Processing and TR-MUSIC for Subsurface and Through-Wall Multitarget Imaging

    Directory of Open Access Journals (Sweden)

    Si-hao Tan

    2012-01-01

    We present an automatic framework combining space-time signal processing with Time Reversal electromagnetic (EM) inversion for subsurface and through-wall multitarget imaging using electromagnetic waves. This framework is composed of a frequency-wavenumber (FK) filter to suppress the direct wave and medium bounce, an FK migration algorithm to automatically estimate the number of targets and identify target regions, which can be used to reduce the computational complexity of the subsequent imaging algorithm, and an EM inversion algorithm using Time Reversal Multiple Signal Classification (TR-MUSIC) to reconstruct hidden objects. The feasibility of the framework is demonstrated with simulated data generated by GPRMAX.

  17. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad

    2017-09-27

    Optimizing the performance of big-data streaming applications has become a daunting and time-consuming task: parameters may be tuned from a space of hundreds or even thousands of possible configurations. In this paper, we present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing three benchmark applications in Apache Storm. Our results show that a hill-climbing algorithm that uses a new heuristic sampling approach based on Latin Hypercube provides the best results. Our gray-box algorithm provides comparable results while being two to five times faster.
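
    The gray-box details are specific to the paper, but the basic search loop it builds on is easy to sketch. The Python fragment below draws Latin Hypercube starting points over a two-parameter configuration space and runs a simple hill climber from each; the toy cost function, parameter bounds, step size and iteration budget are stand-ins for an actual benchmark run, not the authors' framework.

      import numpy as np

      def latin_hypercube(n_samples, bounds, rng):
          """Latin Hypercube sample of n_samples points within per-dimension bounds."""
          d = len(bounds)
          u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
               + rng.random((n_samples, d))) / n_samples
          lo, hi = np.array(bounds).T
          return lo + u * (hi - lo)

      def hill_climb(cost, start, bounds, step, iters, rng):
          """Greedy local search: try a random neighbour, keep it if it is better."""
          best, best_cost = np.array(start, float), cost(start)
          for _ in range(iters):
              cand = np.clip(best + rng.normal(0, step, best.size),
                             [b[0] for b in bounds], [b[1] for b in bounds])
              c = cost(cand)
              if c < best_cost:
                  best, best_cost = cand, c
          return best, best_cost

      if __name__ == "__main__":
          # Toy cost standing in for "run the streaming benchmark, measure latency".
          cost = lambda x: (x[0] - 8) ** 2 + (x[1] - 64) ** 2
          bounds = [(1, 16), (8, 256)]          # e.g. executor count, buffer size
          rng = np.random.default_rng(0)
          starts = latin_hypercube(5, bounds, rng)
          results = [hill_climb(cost, s, bounds, step=4.0, iters=200, rng=rng)
                     for s in starts]
          print(min(results, key=lambda r: r[1]))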

  18. Study on automatic ECT data evaluation by using neural network

    International Nuclear Information System (INIS)

    Komatsu, H.; Matsumoto, Y.; Badics, Z.; Aoki, K.; Nakayasu, F.; Hashimoto, M.; Miya, K.

    1994-01-01

    For the in-service inspection of steam generator (SG) tubing in Pressurized Water Reactor (PWR) plants, eddy current testing (ECT) has been widely used at each outage. At present, ECT data evaluation is mainly performed by ECT data analysts, and therefore it has the following problems: only the ECT signal configuration on the impedance trajectory is used in the evaluation; it is an enormously time-consuming process; and the evaluation result is influenced by the ability and experience of the analyst. In particular, it is difficult to identify the true defect signal hidden in background signals such as lift-off noise and deposit signals. In this work, the authors studied the possibility of applying a neural network to ECT data evaluation. It was demonstrated that the neural network is effective for identifying the nature of defects, by selecting several optimum input parameters to categorize the raw ECT signals

  19. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  20. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    Science.gov (United States)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now

  1. Automatic Detection and Recognition of Pig Wasting Diseases Using Sound Data in Audio Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Yongwha Chung

    2013-09-01

    Automatic detection of pig wasting diseases is an important issue in the management of group-housed pigs. Further, respiratory diseases are one of the main causes of mortality among pigs and of loss of productivity in intensive pig farming. In this study, we propose an efficient data mining solution for the detection and recognition of pig wasting diseases using sound data in audio surveillance systems. In this method, we extract the Mel Frequency Cepstrum Coefficients (MFCC) from sound data with an automatic pig sound acquisition process, and use a hierarchical two-level structure: the Support Vector Data Description (SVDD) and the Sparse Representation Classifier (SRC) as an early anomaly detector and a respiratory disease classifier, respectively. Our experimental results show that this new method can be used to detect pig wasting diseases both economically (even a cheap microphone can be used) and accurately (94% detection and 91% classification accuracy), either as a standalone solution or to complement known methods to obtain a more accurate solution.
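
    The anomaly-detection stage of such a pipeline can be approximated in a few lines of Python. The sketch below computes clip-level MFCC features with librosa and fits a one-class model of "normal" sounds; OneClassSVM is used here only as a readily available stand-in for SVDD, and the synthetic clips, sample rate and nu parameter are assumptions for the example.

      import numpy as np
      import librosa
      from sklearn.svm import OneClassSVM

      def mfcc_features(y, sr, n_mfcc=13):
          """Mean MFCC vector of one sound clip (a simple clip-level descriptor)."""
          mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
          return mfcc.mean(axis=1)

      def train_anomaly_detector(normal_clips, sr):
          """One-class model of 'normal' pig sounds; OneClassSVM stands in for
          the SVDD stage described in the paper."""
          X = np.array([mfcc_features(y, sr) for y in normal_clips])
          return OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(X)

      if __name__ == "__main__":
          sr = 16000
          rng = np.random.default_rng(0)
          # Synthetic stand-ins for recorded clips; real input would be microphone audio.
          normal = [rng.normal(0, 0.1, sr) for _ in range(20)]
          detector = train_anomaly_detector(normal, sr)
          test = rng.normal(0, 0.5, sr)                      # louder, 'abnormal' clip
          print(detector.predict([mfcc_features(test, sr)]))  # -1 flags an anomaly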

  2. Automatic decision support system based on SAR data for oil spill detection

    Science.gov (United States)

    Mera, David; Cotos, José M.; Varela-Pet, José; Rodríguez, Pablo G.; Caro, Andrés

    2014-11-01

    Global trade is mainly supported by maritime transport, which generates significant pollution problems. Thus, effective surveillance and intervention means are necessary to ensure a proper response to environmental emergencies. Synthetic Aperture Radar (SAR) has been established as a useful tool for detecting hydrocarbon spillages on the ocean surface. Several decision support systems have been based on this technology. This paper presents an automatic oil spill detection system based on SAR data, which was developed on the basis of confirmed spillages and adapted to a busy international shipping route off the Galician coast (northwest Iberian Peninsula). The system was supported by an adaptive segmentation process based on wind data as well as a shape-oriented characterization algorithm. Moreover, two classifiers were developed and compared. Image testing revealed up to 95.1% candidate labeling accuracy. Shared-memory parallel programming techniques were used to develop the algorithms, improving the system processing time by more than 25%.

  3. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  4. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    Science.gov (United States)

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially those clinical applications requiring high accuracy of sequencing data, usually have to face the troubles caused by unavoidable sequencing errors. Several tools have been proposed to profile the sequencing quality, but few of them can quantify or correct the sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Different from most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and furthermore it gives a novel function to correct wrong bases in the overlapping regions. Another new feature is to detect and visualise sequencing bubbles, which can be commonly found on the flowcell lanes and may raise sequencing errors. Besides normal per cycle quality and base content plotting, AfterQC also provides features like polyX (a long sub-sequence of a same base X) filtering, automatic trimming and K-MER based strand bias profiling. For each single or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates sequencer's bubble effects, trims reads at front and tail, detects the sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support, it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder for all included FastQ files to be processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data
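
    The overlap analysis that this approach builds on can be illustrated with a short, self-contained Python sketch: align a read against the reverse complement of its mate, find the longest low-mismatch overlap, and resolve disagreements in favour of the higher base quality. The overlap-length threshold, mismatch allowance and quality handling here are simplified assumptions, not AfterQC's actual implementation.

      def reverse_complement(seq):
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def find_overlap(read1, read2, min_len=10, max_mismatch=2):
          """Longest suffix/prefix overlap between read1 and the reverse
          complement of read2, allowing a few mismatches."""
          rc2 = reverse_complement(read2)
          for olen in range(min(len(read1), len(rc2)), min_len - 1, -1):
              a, b = read1[-olen:], rc2[:olen]
              if sum(x != y for x, y in zip(a, b)) <= max_mismatch:
                  return olen
          return 0

      def correct_overlap(read1, qual1, read2, qual2):
          """Within the overlap, keep the base with the higher quality score
          (a simplified version of overlap-based error correction)."""
          olen = find_overlap(read1, read2)
          rc2, rq2 = reverse_complement(read2), qual2[::-1]
          r1 = list(read1)
          for i in range(olen):
              p1 = len(read1) - olen + i
              if r1[p1] != rc2[i] and rq2[i] > qual1[p1]:
                  r1[p1] = rc2[i]
          return "".join(r1), olen

      if __name__ == "__main__":
          r1 = "ACGTACGTTTGGCCAA"
          r2 = reverse_complement("ACGTTTGGCCAA")   # overlaps the last 12 bases
          fixed, olen = correct_overlap(r1, [30] * len(r1), r2, [35] * len(r2))
          print(olen, fixed)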

  5. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Science.gov (United States)

    Teßmann, Matthias; Mohr, Stephan; Gayetskyy, Svitlana; Haßler, Ulf; Hanke, Randolf; Greiner, Günther

    2010-12-01

    Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require a destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as imaging method together with subsequent image analysis for evaluation. The image analysis consists of an iterative process where single fibers are detected automatically in each iteration step after having applied image enhancement algorithms. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and the robustness of the segmentation method is demonstrated by applying it to artificially generated test data and selected real components.

  6. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Directory of Open Access Journals (Sweden)

    Günther Greiner

    2010-01-01

    Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require a destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as imaging method together with subsequent image analysis for evaluation. The image analysis consists of an iterative process where single fibers are detected automatically in each iteration step after having applied image enhancement algorithms. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and the robustness of the segmentation method is demonstrated by applying it to artificially generated test data and selected real components.

  7. [Use of nondeclarative and automatic memory processes in motor learning: how to mitigate the effects of aging].

    Science.gov (United States)

    Chauvel, Guillaume; Maquestiaux, François; Didierjean, André; Joubert, Sven; Dieudonné, Bénédicte; Verny, Marc

    2011-12-01

    Does normal aging inexorably lead to diminished motor learning abilities? This article provides an overview of the literature on the question, with particular emphasis on the functional dissociation between two sets of memory processes: declarative, effortful processes, and non-declarative, automatic processes. There is abundant evidence suggesting that aging does impair learning when past memories of former actions are required (episodic memory) and recollected through controlled processing (working memory). However, other studies have shown that aging does not impair learning when motor actions are performed non verbally and automatically (tapping procedural memory). These findings led us to hypothesize that one can minimize the impact of aging on the ability to learn new motor actions by favouring procedural learning. Recent data validating this hypothesis are presented. Our findings underline the importance of developing new motor learning strategies, which "bypass" declarative, effortful memory processes.

  8. Automatic detection of micronuclei by cell microscopic image processing.

    Science.gov (United States)

    Bahreyni Toossi, Mohammad Taghi; Azimian, Hosein; Sarrafzadeh, Omid; Mohebbi, Shokoufeh; Soleymanifard, Shokouhozaman

    2017-12-01

    With the development and application of ionizing radiation in medicine, the effects of radiation on human health are receiving more and more attention. Ionizing radiation can lead to various forms of cytogenetic damage, including increased frequencies of micronuclei (MNi) and chromosome abnormalities. The cytokinesis-block micronucleus (CBMN) assay is a widely used method for measuring MNi to determine chromosome mutations or genome instability in cultured human lymphocytes. The visual scoring of MNi is time-consuming, and scorer fatigue can lead to inconsistency. In this work, we designed software for the scoring of the in vitro CBMN assay for biomonitoring on Giemsa-stained slides that overcomes many previous limitations. Automatic scoring proceeds in four stages as follows. First, overall segmentation of nuclei is done. Then, binucleated (BN) cells are detected. Next, the entire cell is estimated for each BN cell, as it is assumed that there is no detectable cytoplasm. Finally, MNi are detected within each BN cell. The designed software is even able to detect BN cells with vague cytoplasm and MNi in peripheral blood smears. Our system was tested on a self-provided dataset and achieved high sensitivities of about 98% and 82% in recognizing BN cells and MNi, respectively. Moreover, in our study less than 1% false positives were observed, which makes our system reliable for practical MNi scoring. Copyright © 2017 Elsevier B.V. All rights reserved.
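
    The first two stages of such a pipeline can be sketched with off-the-shelf image-processing primitives. The Python example below segments dark nuclei by Otsu thresholding and connected-component labelling, then pairs nearby, similarly sized nuclei as binucleated-cell candidates; the synthetic test image, size limits and pairing rule are illustrative assumptions, not the criteria used in the published software.

      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      def segment_nuclei(gray, min_area=50):
          """Global Otsu threshold plus connected-component labelling;
          Giemsa-stained nuclei are assumed darker than the background."""
          mask = gray < threshold_otsu(gray)
          labelled = label(mask)
          return [r for r in regionprops(labelled) if r.area >= min_area]

      def find_binucleated(nuclei, max_centroid_dist=50.0, area_ratio=0.7):
          """Toy pairing rule for candidate BN cells: two nuclei of similar size
          lying close together (the real criteria are more elaborate)."""
          pairs = []
          for i, a in enumerate(nuclei):
              for b in nuclei[i + 1:]:
                  d = np.hypot(a.centroid[0] - b.centroid[0],
                               a.centroid[1] - b.centroid[1])
                  if d < max_centroid_dist and min(a.area, b.area) / max(a.area, b.area) > area_ratio:
                      pairs.append((a, b))
          return pairs

      if __name__ == "__main__":
          img = np.full((200, 200), 200, dtype=np.uint8)   # bright background
          img[50:80, 50:80] = 40                            # two dark "nuclei"
          img[50:80, 90:120] = 40
          nuclei = segment_nuclei(img)
          print(len(nuclei), "nuclei,", len(find_binucleated(nuclei)), "BN candidate(s)")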

  9. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator’s system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach used to issue returns, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; however, this is inconvenient, and problems are reported slowly. Therefore, we propose an “automatic logging analysis system” (ALAS) to construct a convenient test environment and, using a record analysis (log parser) program, automate the parsing of log files and have questions automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
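
    Computing an MTBF indicator from uploaded device logs is straightforward once the log lines are parsed. The Python sketch below assumes a hypothetical "timestamp level message" line format, extracts the failure timestamps and averages the gaps between consecutive failures; the format, log level and sample data are invented for illustration and are not the format used by ALAS.

      import re
      from datetime import datetime

      # Hypothetical log format: "2024-01-05 13:22:10 ERROR modem reset"
      LINE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (\w+) (.*)$")

      def parse_failures(lines, level="ERROR"):
          """Extract timestamps of failure records from raw log lines."""
          out = []
          for line in lines:
              m = LINE.match(line.strip())
              if m and m.group(2) == level:
                  out.append(datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S"))
          return sorted(out)

      def mtbf_hours(failures):
          """Mean time between consecutive failures, in hours."""
          if len(failures) < 2:
              return None
          gaps = [(b - a).total_seconds() for a, b in zip(failures, failures[1:])]
          return sum(gaps) / len(gaps) / 3600.0

      if __name__ == "__main__":
          log = [
              "2024-01-05 10:00:00 INFO boot completed",
              "2024-01-05 13:00:00 ERROR modem reset",
              "2024-01-06 01:00:00 ERROR app crash",
              "2024-01-06 13:00:00 ERROR watchdog timeout",
          ]
          print("MTBF (h):", mtbf_hours(parse_failures(log)))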

  10. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Directory of Open Access Journals (Sweden)

    A. Bellakaout

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data for extracting ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of different derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because the manual process is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed to segment the uniform surfaces into building roofs, roads and ground; this second phase of classification is also based on topological relationships and height variation analysis. The proposed approach has been tested on two areas: the first is a housing complex and the second is a primary school. The proposed approach led to successful classification results for the building, vegetation and road classes.

  11. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Science.gov (United States)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data for extracting ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of different derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because the manual process is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed to segment the uniform surfaces into building roofs, roads and ground; this second phase of classification is also based on topological relationships and height variation analysis. The proposed approach has been tested on two areas: the first is a housing complex and the second is a primary school. The proposed approach led to successful classification results for the building, vegetation and road classes.

  12. Automatic rebuilding and optimization of crystallographic structures in the Protein Data Bank.

    Science.gov (United States)

    Joosten, Robbie P; Joosten, Krista; Cohen, Serge X; Vriend, Gert; Perrakis, Anastassis

    2011-12-15

    Macromolecular crystal structures in the Protein Data Bank (PDB) are a key source of structural insight into biological processes. These structures, some >30 years old, were constructed with methods of their era. With PDB_REDO, we aim to automatically optimize these structures to better fit their corresponding experimental data, passing the benefits of new methods in crystallography on to a wide base of non-crystallographer structure users. We developed new algorithms to allow automatic rebuilding and remodeling of main chain peptide bonds and side chains in crystallographic electron density maps, and incorporated these and further enhancements in the PDB_REDO procedure. Applying the updated PDB_REDO to the oldest, but also to some of the newest models in the PDB, corrects existing modeling errors and brings these models to a higher quality, as judged by standard validation methods. The PDB_REDO database and links to all software are available at http://www.cmbi.ru.nl/pdb_redo. r.joosten@nki.nl; a.perrakis@nki.nl Supplementary data are available at Bioinformatics online.

  13. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    Science.gov (United States)

    Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin

    2017-08-01

    Pavement markings provide an important foundation, as they help to keep road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information. Mobile LiDAR systems can directly obtain the three-dimensional (3D) coordinates of an object, thus providing spatial data and the intensity of 3D objects in a fast and efficient way. The RGB attribute information of data points can be obtained from the panoramic camera in the system. In this paper, we present a novel processing method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. The method uses the differential grayscale of RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We used point cloud density to remove noise and morphological operations to eliminate errors. In the application, we tested our method on different sections of roads in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method of this research can be applied to extract pavement markings from huge point cloud data produced by mobile LiDAR.
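
    The core of intensity-based marking extraction can be sketched on a rasterized intensity grid. The Python example below thresholds high-intensity returns, removes speckle with a morphological opening and discards tiny regions; the threshold rule, structuring element and synthetic grid are assumptions for illustration, not the thresholds or attributes used in the paper.

      import numpy as np
      from scipy import ndimage

      def extract_markings(intensity_grid, min_region=20):
          """Toy intensity-based marking extraction on a rasterized point-cloud
          intensity grid: keep strong returns, clean up speckle, drop small blobs."""
          # Paint markings are strong reflectors; threshold well above the median.
          thresh = np.median(intensity_grid) + 2 * intensity_grid.std()
          mask = intensity_grid > thresh
          mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
          labelled, n = ndimage.label(mask)
          sizes = ndimage.sum(mask, labelled, index=range(1, n + 1))
          keep = [i + 1 for i, s in enumerate(sizes) if s >= min_region]
          return np.isin(labelled, keep)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          grid = rng.normal(30, 5, (100, 100))       # asphalt background returns
          grid[45:55, 10:90] += 60                   # a synthetic lane marking
          marking = extract_markings(grid)
          print("marking pixels:", int(marking.sum()))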

  14. Automatic Creation of quality multi-word Lexica from noisy text data

    OpenAIRE

    Frontini, Francesca; Quochi, Valeria; Rubino, Francesco

    2012-01-01

    This paper describes the design of a tool for the automatic creation of multi-word lexica that is deployed as a web service and runs on automatically web-crawled data within the framework of the PANACEA platform. The main purpose of our task is to provide a (computationally "light") tool that creates a full high quality lexical resource of multi-word items. Within the platform, this tool is typically inserted in a work flow whose first step is automatic web-crawling. Therefore, the input data...

  15. A Hybrid Intelligent Search Algorithm for Automatic Test Data Generation

    Directory of Open Access Journals (Sweden)

    Ying Xing

    2015-01-01

    The increasing complexity of large-scale real-world programs necessitates the automation of software testing. As a basic problem in software testing, the automation of path-wise test data generation is especially important; it is in essence a constraint optimization problem solved by search strategies. Therefore, the constraint processing efficiency of the selected search algorithm is a key factor. To increase search efficiency, a hybrid intelligent algorithm is proposed that efficiently searches the solution space of potential test data by making full use of both global and local search methods. Branch and bound is adopted for global search, which gives definite results at relatively low cost. In the search procedure for each variable, hill climbing is adopted for local search, enhanced with initial values selected heuristically based on a monotonicity analysis of the branching conditions. The two are tightly integrated by an efficient ordering method and a backtracking operation. To facilitate the search methods, the solution space is represented as a state space. Experimental results show that the proposed method outperformed other methods used in test data generation. The heuristic initial value selection strategy improves the search efficiency greatly and makes the search basically backtrack-free. The results also demonstrate that the proposed method is applicable in engineering.
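
    The local-search half of such a hybrid can be illustrated with a tiny branch-coverage example. The Python sketch below searches two integer inputs so that a target branch condition is taken, using the classic branch-distance fitness and a ±1 neighbourhood; the condition, bounds and iteration budget are invented for the example, and the global branch-and-bound stage is omitted.

      import random

      def branch_distance(x, y):
          """Distance-to-true for the target branch `if x * 2 == y and y > 10`;
          zero means the generated inputs cover the branch."""
          return abs(x * 2 - y) + max(0, 11 - y)

      def hill_climb(bounds=(-100, 100), iters=20000, seed=0):
          """Local search over two integer inputs, accepting any single-variable
          move that does not increase the branch distance."""
          rng = random.Random(seed)
          inputs = [rng.randint(*bounds), rng.randint(*bounds)]
          best = branch_distance(*inputs)
          for _ in range(iters):
              if best == 0:
                  break
              cand = list(inputs)
              cand[rng.randrange(2)] += rng.choice([-1, 1])
              d = branch_distance(*cand)
              if d <= best:
                  inputs, best = cand, d
          return tuple(inputs), best

      if __name__ == "__main__":
          data, dist = hill_climb()
          print("test data:", data, "branch distance:", dist)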

  16. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    Bianchini, Ricardo M.; Estevez, Jorge; Vollmer, Alberto E.; Iglicki, Flora A.

    1999-01-01

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  17. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data.

    Science.gov (United States)

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  18. Automatic Data Filter Customization Using a Genetic Algorithm

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as the Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can be simply filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire data and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values to save needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
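
    A stripped-down version of the left/right threshold idea is easy to express in Python. The sketch below evolves a (left, right) interval per feature dimension, scoring each candidate filter by how many "bad" runs it rejects versus "good" runs it loses; the population size, mutation scale, penalty weight and toy data are arbitrary choices for illustration, not the JPL configuration.

      import numpy as np

      def fitness(thresholds, X, good):
          """Reward filters that reject many bad runs while keeping good ones.
          `thresholds` holds a (left, right) pair per input dimension."""
          lo, hi = thresholds[:, 0], thresholds[:, 1]
          kept = np.all((X >= lo) & (X <= hi), axis=1)
          return np.sum(~kept & ~good) - 5 * np.sum(~kept & good)

      def evolve(X, good, pop=40, gens=60, seed=0):
          rng = np.random.default_rng(seed)
          d = X.shape[1]
          lo, hi = X.min(0)[:, None], X.max(0)[:, None]      # shape (d, 1)
          population = rng.uniform(lo, hi, size=(pop, d, 2))
          population.sort(axis=2)                            # ensure left <= right
          for _ in range(gens):
              scores = np.array([fitness(ind, X, good) for ind in population])
              parents = population[np.argsort(scores)[-pop // 2:]]
              children = parents + rng.normal(0, 0.05, parents.shape)  # mutation
              children.sort(axis=2)
              population = np.concatenate([parents, children])
          scores = np.array([fitness(ind, X, good) for ind in population])
          return population[scores.argmax()]

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          X = rng.random((500, 3))                # 3 toy sounding features
          good = X[:, 0] < 0.7                    # runs with feature 0 >= 0.7 "fail"
          print(np.round(evolve(X, good), 2))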

  19. Profiling animal toxicants by automatically mining public bioassay data: a big data approach for computational toxicology.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high throughput screening techniques has resulted in an enormous amount of publicly available bioassay data having been generated for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities.

  20. Automatic data-driven real-time segmentation and recognition of surgical workflow.

    Science.gov (United States)

    Dergachyova, Olga; Bouget, David; Huaulmé, Arnaud; Morandi, Xavier; Jannin, Pierre

    2016-06-01

    With the intention of extending the perception and action of surgical staff inside the operating room, the medical community has expressed a growing interest towards context-aware systems. Requiring an accurate identification of the surgical workflow, such systems make use of data from a diverse set of available sensors. In this paper, we propose a fully data-driven and real-time method for segmentation and recognition of surgical phases using a combination of video data and instrument usage signals, exploiting no prior knowledge. We also introduce new validation metrics for assessment of workflow detection. The segmentation and recognition are based on a four-stage process. Firstly, during the learning time, a Surgical Process Model is automatically constructed from data annotations to guide the following process. Secondly, data samples are described using a combination of low-level visual cues and instrument information. Then, in the third stage, these descriptions are employed to train a set of AdaBoost classifiers capable of distinguishing one surgical phase from others. Finally, AdaBoost responses are used as input to a Hidden semi-Markov Model in order to obtain a final decision. On the MICCAI EndoVis challenge laparoscopic dataset we achieved a precision and a recall of 91 % in classification of 7 phases. Compared to the analysis based on one data type only, a combination of visual features and instrument signals allows better segmentation, reduction of the detection delay and discovery of the correct phase order.
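
    The per-phase classification step of such a pipeline can be sketched with standard tools. The Python example below trains one AdaBoost classifier per surgical phase on frame-level feature vectors and smooths the frame-by-frame decisions with a sliding-window majority vote; the toy features, window size and the vote itself are assumptions for the example (the vote merely stands in for the Hidden semi-Markov Model used in the paper).

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier

      def train_phase_classifiers(X, phases, n_phases):
          """One AdaBoost classifier per phase, trained one-vs-rest on frame
          features (visual cues plus instrument-usage signals in the paper)."""
          return [AdaBoostClassifier(n_estimators=50).fit(X, (phases == p).astype(int))
                  for p in range(n_phases)]

      def predict_phases(classifiers, X, window=9):
          """Pick the phase whose classifier responds most strongly, then smooth
          the sequence with a sliding-window majority vote."""
          scores = np.stack([clf.decision_function(X) for clf in classifiers], axis=1)
          raw = scores.argmax(axis=1)
          half = window // 2
          return np.array([np.bincount(raw[max(0, t - half):t + half + 1]).argmax()
                           for t in range(len(raw))])

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          phases = np.repeat(np.arange(3), 100)                 # 3 phases, 100 frames each
          X = rng.normal(phases[:, None], 1.0, size=(300, 5))   # toy frame features
          clfs = train_phase_classifiers(X, phases, n_phases=3)
          pred = predict_phases(clfs, X)
          print("frame accuracy:", float((pred == phases).mean()))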

  1. Automatic digital document processing and management problems, algorithms and techniques

    CERN Document Server

    Ferilli, Stefano

    2011-01-01

    This text reviews the issues involved in handling and processing digital documents. Examining the full range of a document's lifetime, this book covers acquisition, representation, security, pre-processing, layout analysis, understanding, analysis of single components, information extraction, filing, indexing and retrieval. This title: provides a list of acronyms and a glossary of technical terms; contains appendices covering key concepts in machine learning, and providing a case study on building an intelligent system for digital document and library management; discusses issues of security,

  2. Process Concepts for Semi-automatic Dismantling of LCD Televisions

    OpenAIRE

    Elo, Kristofer; Sundin, Erik

    2014-01-01

    There is a large variety of electrical and electronic equipment products, for example liquid crystal display television sets (LCD TVs), in the waste stream today. Many LCD TVs contain mercury, which is a challenge to treat at recycling plants. Two processes currently used to recycle LCD TVs are automated shredding and manual disassembly. This paper aims to present concepts for semi-automated dismantling processes for LCD TVs in order to achieve higher productivity and flexibility, and in tu...

  3. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    Science.gov (United States)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original data.
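
    Once the noise components have been labelled, the cleanup itself is a regression step. The numpy sketch below shows that final step only, with hypothetical array names and shapes; FIX's own feature extraction and multi-level classifier are not reproduced here.

      import numpy as np

      # Hypothetical shapes: data is time x voxels, mixing holds the ICA component
      # time series (time x components), and noise_idx lists the components that
      # the classifier labelled as noise.
      n_t, n_vox, n_comp = 200, 5000, 30
      data = np.random.rand(n_t, n_vox)
      mixing = np.random.rand(n_t, n_comp)
      noise_idx = [3, 7, 11]

      def regress_out_noise(data, mixing, noise_idx, aggressive=False):
          """Remove structured noise from fMRI data given noise component indices."""
          if aggressive:
              # Aggressive cleanup: regress the noise time series out directly.
              confounds = mixing[:, noise_idx]
              beta, *_ = np.linalg.lstsq(confounds, data, rcond=None)
              return data - confounds @ beta
          # Soft cleanup: fit all component time series together, then subtract
          # only the part of the fit explained by the noise components.
          beta, *_ = np.linalg.lstsq(mixing, data, rcond=None)
          return data - mixing[:, noise_idx] @ beta[noise_idx, :]

      cleaned = regress_out_noise(data, mixing, noise_idx)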

  4. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  5. Automatic parallelization of nested loop programs for non-manifest real-time stream processing applications

    NARCIS (Netherlands)

    Bijlsma, T.

    2011-01-01

    This thesis is concerned with the automatic parallelization of real-time stream processing applications, such that they can be executed on embedded multiprocessor systems. Stream processing applications can be found in the video and channel decoding domain. These applications often have temporal

  6. Automatic metaphor processing in adults with Asperger syndrome: a metaphor interference effect task.

    Science.gov (United States)

    Hermann, Ismene; Haser, Verena; van Elst, Ludger Tebartz; Ebert, Dieter; Müller-Feldmeth, Daniel; Riedel, Andreas; Konieczny, Lars

    2013-11-01

    This paper investigates automatic processing of novel metaphors in adults with Asperger Syndrome (AS) and typically developing controls. We present an experiment combining a semantic judgment task and a recognition task. Four types of sentences were compared: Literally true high-typical sentences, literally true low-typical sentences, apt metaphors, and scrambled metaphors (literally false sentences which are not readily interpretable as metaphors). Participants were asked to make rapid decisions about the literal truth of such sentences. The results revealed that AS and control participants showed significantly slower RTs for metaphors than for scrambled metaphors and made more mistakes in apt metaphoric sentences than in scrambled metaphors. At the same time, there was higher recognition of apt metaphors compared with scrambled metaphors. The findings indicate intact automatic metaphor processing in AS and replicate previous findings on automatic metaphor processing in typically developing individuals.

  7. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: that of their acquisition and that of their processing. A study devoted to the latter aspect only was presented. It was necessary to satisfy two important conditions: the treatment procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible; to allow dissemination of the method the equipment used had to be relatively modest and inexpensive. A simple method, considering the curve processing as a process identification, was developed and should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained though not condemning the method itself. The curve processing question raised the problem of their acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve processing combination where the overlapping between the two aspects of the problem is accounted for [fr

  8. Application of parallel processing for automatic inspection of printed circuits

    International Nuclear Information System (INIS)

    Lougheed, R.M.

    1986-01-01

    Automated visual inspection of printed electronic circuits is a challenging application for image processing systems. Detailed inspection requires high speed analysis of gray scale imagery along with high quality optics, lighting, and sensing equipment. A prototype system has been developed and demonstrated at the Environmental Research Institute of Michigan (ERIM) for inspection of multilayer thick-film circuits. The central problem of real-time image processing is solved by a special-purpose parallel processor which includes a new high-speed Cytocomputer. In this chapter the inspection process and the algorithms used are summarized, along with the functional requirements of the machine vision system. Next, the parallel processor is described in detail and then performance on this application is given

  9. Automatic Modelling of Rubble Mound Breakwaters from LIDAR Data

    Science.gov (United States)

    Bueno, M.; Díaz-Vilariño, L.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P.

    2015-08-01

    Maintenance of rubble mound breakwaters is critical to the protection of beaches and ports. LiDAR systems provide accurate point clouds of the emerged part of the structure that can be modelled to make the data more useful and easier to handle. This work introduces a methodology for the automatic modelling of breakwaters with armour units of cube shape. The algorithm is divided into three main steps: normal vector computation, plane segmentation, and cube reconstruction. Plane segmentation uses the normal orientation of the points and the edge length of the cube. Cube reconstruction uses the intersection of three perpendicular planes and the edge length. Three point clouds cropped from the main point cloud of the structure are used for the tests. The number of cubes detected is around 56 % for two of the point clouds and 32 % for the third, relative to the total number of physical cubes. Accuracy assessment is done by comparison with manually drawn cubes, calculating the differences between the vertices; these differences range between 6.4 cm and 15 cm. Computing time ranges between 578.5 s and 8018.2 s and increases with the number of cubes and the requirements of collision detection.
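
    The cube reconstruction step above rests on a small piece of geometry: a cube vertex is the common point of three mutually perpendicular face planes. A minimal Python sketch of that intersection is given below; the plane parameters and the edge length are illustrative values, not data from the paper.

      import numpy as np

      def plane_intersection(normals, offsets):
          """Intersect three planes n_i . x = d_i and return their common point
          (a cube vertex when the planes are three mutually perpendicular faces)."""
          N = np.asarray(normals, dtype=float)   # 3 x 3 matrix of plane normals
          d = np.asarray(offsets, dtype=float)
          return np.linalg.solve(N, d)

      # Illustrative face planes of an axis-aligned cube.
      normals = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
      offsets = [2.0, 3.0, 1.5]                  # n . x = d for each face
      vertex = plane_intersection(normals, offsets)        # -> [2.0, 3.0, 1.5]

      # With the known edge length, the opposite corner follows by shifting each
      # face plane along its normal (the edge length here is an assumed value).
      edge = 2.0
      opposite = plane_intersection(normals, [d + edge for d in offsets])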

  10. Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework.

    Science.gov (United States)

    Zito, Tiziano; Wilbert, Niko; Wiskott, Laurenz; Berkes, Pietro

    2008-01-01

    Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The new implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.

  11. Modular toolkit for Data Processing (MDP): a Python data processing framework

    Directory of Open Access Journals (Sweden)

    Tiziano Zito

    2009-01-01

    Full Text Available Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The new implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
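
    A minimal usage sketch of the API described above is given below, assuming a standard MDP installation; the choice of a PCA node followed by a Slow Feature Analysis node, and the data, are purely illustrative.

      import numpy as np
      import mdp

      # Illustrative data: 1000 observations of 20 variables.
      x = np.random.rand(1000, 20)

      # Combine processing units (nodes) into a feed-forward sequence (a Flow):
      # reduce to 5 principal components, then extract 3 slowly varying features.
      flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),
                       mdp.nodes.SFANode(output_dim=3)])
      flow.train(x)      # trainable nodes are trained one after the other
      y = flow(x)        # execute the whole sequence
      print(y.shape)     # -> (1000, 3)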

  12. SEMI-AUTOMATIC CO-REGISTRATION OF PHOTOGRAMMETRIC AND LIDAR DATA USING BUILDINGS

    Directory of Open Access Journals (Sweden)

    C. Armenakis

    2012-07-01

    Full Text Available In this work, the co-registration steps between LiDAR and photogrammetric DSM 3D data are analyzed and a solution based on automated plane matching is proposed and implemented. For a robust 3D geometric transformation both planes and points are used. Initially planes are chosen as the co-registration primitives. To confine the search space for the plane matching, a sequential automatic building matching is performed first. For matching buildings from the LiDAR and the photogrammetric data, a similarity objective function is formed based on the roof height difference (RHD, the 3D histogram of the building attributes, and the building boundary area of a building. A region growing algorithm based on a Triangulated Irregular Network (TIN is implemented to extract planes from both datasets. Next, an automatic successive process for identifying and matching corresponding planes from the two datasets has been developed and implemented. It is based on the building boundary region and determines plane pairs through a robust matching process thus eliminating outlier pairs. The selected correct plane pairs are the input data for the geometric transformation process. The 3D conformal transformation method in conjunction with the attitude quaternion is applied to obtain the transformation parameters using the normal vectors of the corresponding plane pairs. Following the mapping of one dataset onto the coordinate system of the other, the Iterative Closest Point (ICP algorithm is then applied, using the corresponding building point clouds to further refine the transformation solution. The results indicate that the combination of planes and points improve the co-registration outcomes.

  13. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low cost and large scale nanostructure fabrication. This technique is based on a contact molding-demolding process, that can produce number of defects such as incomplete filling, negative patterns, sticking. In this paper, microscopic imaging combined to a specific processing algorithm is used to detect numerically defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. Results are independent on the device which captures the image (optical, confocal or electron microscope). The use of numerical images allows the possibility to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.

  14. Image processing tool for automatic feature recognition and quantification

    Science.gov (United States)

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system includes reading of an input file, preprocessing the input file while preserving metadata such as scale information and then detecting features of the input file. In one version the detection first uses an edge detector followed by identification of features using a Hough transform. The output of the process is identified elements within the image.

  15. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  16. automatic data collection design for neural networks detection

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. Automated data collection is necessary to alleviate problems inherent in data collection for investigation of management frauds. Once we have gathered a realistic data, several methods then exist for proper analysis and detection of anomalous transactions. However, in Nigeria, collecting fraudulent data is ...

  17. Automatic Data Collection Design for Neural Networks Detection of ...

    African Journals Online (AJOL)

    Automated data collection is necessary to alleviate problems inherent in data collection for investigation of management frauds. Once we have gathered a realistic data, several methods then exist for proper analysis and detection of anomalous transactions. However, in Nigeria, collecting fraudulent data is relatively difficult ...

  18. Automatic optimized discovery, creation and processing of astronomical catalogs

    NARCIS (Netherlands)

    Buddelmeijer, Hugo; Boxhoorn, Danny; Valentijn, Edwin A.

    We present the design of a novel way of handling astronomical catalogs in Astro-WISE in order to achieve the scalability required for the data produced by large scale surveys. A high level of automation and abstraction is achieved in order to facilitate interoperation with visualization software for

  19. An Automatic Image Processing Workflow for Daily Magnetic Resonance Imaging Quality Assurance.

    Science.gov (United States)

    Peltonen, Juha I; Mäkelä, Teemu; Sofiev, Alexey; Salli, Eero

    2017-04-01

    The performance of magnetic resonance imaging (MRI) equipment is typically monitored with a quality assurance (QA) program. The QA program includes various tests performed at regular intervals. Users may execute specific tests, e.g., daily, weekly, or monthly. The exact interval of these measurements varies according to the department policies, machine setup and usage, manufacturer's recommendations, and available resources. In our experience, a single image acquired before the first patient of the day offers a low effort and effective system check. When this daily QA check is repeated with identical imaging parameters and phantom setup, the data can be used to derive various time series of the scanner performance. However, daily QA with manual processing can quickly become laborious in a multi-scanner environment. Fully automated image analysis and results output can positively impact the QA process by decreasing reaction time, improving repeatability, and by offering novel performance evaluation methods. In this study, we have developed a daily MRI QA workflow that can measure multiple scanner performance parameters with minimal manual labor required. The daily QA system is built around a phantom image taken by the radiographers at the beginning of day. The image is acquired with a consistent phantom setup and standardized imaging parameters. Recorded parameters are processed into graphs available to everyone involved in the MRI QA process via a web-based interface. The presented automatic MRI QA system provides an efficient tool for following the short- and long-term stability of MRI scanners.
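
    The abstract does not list the exact parameters extracted from the daily phantom image, so the sketch below only illustrates the idea with one common check: estimating phantom SNR from the day's image and appending it to a per-scanner time series. The file names, ROI sizes and CSV log are hypothetical stand-ins for the paper's web-based interface.

      import csv
      import datetime
      import numpy as np
      import pydicom   # assumed available for reading the daily phantom image

      def daily_snr(dicom_path: str) -> float:
          """SNR estimate: mean signal in a central phantom ROI divided by the
          standard deviation of a background (air) corner region."""
          img = pydicom.dcmread(dicom_path).pixel_array.astype(float)
          h, w = img.shape
          signal = img[h // 2 - 20:h // 2 + 20, w // 2 - 20:w // 2 + 20].mean()
          noise = img[:30, :30].std()
          return signal / noise if noise > 0 else float("nan")

      def append_qa_record(scanner_id: str, dicom_path: str, log_csv: str) -> None:
          """Append today's result to the scanner's QA time series."""
          with open(log_csv, "a", newline="") as f:
              csv.writer(f).writerow([scanner_id,
                                      datetime.date.today().isoformat(),
                                      round(daily_snr(dicom_path), 2)])

      # append_qa_record("MRI-1", "phantom_today.dcm", "qa_mri1.csv")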

  20. Development and Utility of Automatic Language Processing Technologies. Volume 2

    Science.gov (United States)

    2014-04-01

    and understand ongoing situations, anticipate new situations that will require responses, and where possible influence the outcomes. Much of the...of musical notation ♫ and ♪ in the TED Talk data when a song is performed. For example: ♫ And that is all ♫ ♫ that love’s about ♫ ♫ And we’ll recall...so that concatenation will only apply when the specified language is Chinese, Japanese, or Korean . For example, the English sentence below contains

  1. Automatic concrete cracks detection and mapping of terrestrial laser scan data

    Directory of Open Access Journals (Sweden)

    Mostafa Rabah

    2013-12-01

    This paper presents a method for automatic detection and mapping of concrete cracks from data obtained during a laser scanning survey. The method proceeds in three steps: shading correction of the original image, crack detection, and finally crack mapping and processing. The detected crack is defined in a pixel coordinate system. To remap the crack into the referenced coordinate system, reverse engineering is used: a hybrid concept combines the terrestrial laser-scanner point cloud with the corresponding camera image, i.e. a conversion from the pixel coordinate system to the terrestrial laser-scanner or global coordinate system. The results of the experiment show that the mean differences between the terrestrial laser scan and the total station are about 30.5, 16.4 and 14.3 mm in the x, y and z directions, respectively.

  2. Automatic electricity markets data extraction for realistic multi-agent simulations

    DEFF Research Database (Denmark)

    Pereira, Ivo F.; Sousa, Tiago M.; Praca, Isabel

    2014-01-01

    This paper presents the development of a tool that provides a database with available information from real electricity markets, ensuring the required updating mechanisms. Some important characteristics of this tool are: capability of collecting, analyzing, processing and storing real electricity markets data available on-line; capability of dealing with different file formats and types, some of them inserted by the user, resulting from information obtained not on-line but based on the possible collaboration with market entities; definition and implementation of database gathering information from different market sources, even including different market types; machine learning approach for automatic definition of downloads periodicity of new information available on-line. This is a crucial tool to go a step forward in electricity markets simulation, since the integration of this database...

  3. Native language shapes automatic neural processing of speech.

    Science.gov (United States)

    Intartaglia, Bastien; White-Schwoch, Travis; Meunier, Christine; Roman, Stéphane; Kraus, Nina; Schön, Daniele

    2016-08-01

    The development of the phoneme inventory is driven by the acoustic-phonetic properties of one's native language. Neural representation of speech is known to be shaped by language experience, as indexed by cortical responses, and recent studies suggest that subcortical processing also exhibits this attunement to native language. However, most work to date has focused on the differences between tonal and non-tonal languages that use pitch variations to convey phonemic categories. The aim of this cross-language study is to determine whether subcortical encoding of speech sounds is sensitive to language experience by comparing native speakers of two non-tonal languages (French and English). We hypothesized that neural representations would be more robust and fine-grained for speech sounds that belong to the native phonemic inventory of the listener, and especially for the dimensions that are phonetically relevant to the listener such as high frequency components. We recorded neural responses of American English and French native speakers, listening to natural syllables of both languages. Results showed that, independently of the stimulus, American participants exhibited greater neural representation of the fundamental frequency compared to French participants, consistent with the importance of the fundamental frequency to convey stress patterns in English. Furthermore, participants showed more robust encoding and more precise spectral representations of the first formant when listening to the syllable of their native language as compared to non-native language. These results align with the hypothesis that language experience shapes sensory processing of speech and that this plasticity occurs as a function of what is meaningful to a listener. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Automatic Extraction of Road Surface and Curbstone Edges from Mobile Laser Scanning Data

    Science.gov (United States)

    Miraliakbari, A.; Hahn, M.; Sok, S.

    2015-05-01

    We present a procedure for automatic extraction of the road surface from geo-referenced mobile laser scanning data. The basic assumption of the procedure is that the road surface is smooth and limited by curbstones. Two variants of jump detection are investigated for detecting curbstone edges, one based on height differences, the other based on histograms of the height data. Region growing algorithms are proposed which use the irregular laser point cloud. Two- and four-neighbourhood growing strategies utilize the two height criteria for examining the neighborhood. Both height criteria rely on an assumption about the minimum height of a low curbstone. Road boundaries with lower or no jumps will not stop the region growing process. In contrast, objects on the road can terminate the process. Therefore further processing, such as bridging gaps between detected road boundary points and the removal of wrongly detected curbstone edges, is necessary. Road boundaries are finally approximated by splines. Experiments are carried out with a ca. 2 km network of small streets located in the neighbourhood of the University of Applied Sciences in Stuttgart. For accuracy assessment of the extracted road surfaces, ground truth measurements are digitized manually from the laser scanner data. For completeness and correctness of the region growing results, values between 92% and 95% are achieved.
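
    A stripped-down Python sketch of the growing step is given below: neighbours within a search radius join the road region unless the height jump to the current point exceeds the assumed minimum curbstone height. The threshold, the radius and the synthetic point cloud are assumptions for illustration, not values from the paper.

      import numpy as np
      from collections import deque
      from scipy.spatial import cKDTree

      MIN_CURB_HEIGHT = 0.05   # assumed minimum curbstone jump in metres

      def grow_road_surface(points: np.ndarray, seed_idx: int, radius: float = 0.5):
          """Grow the road surface from a seed point on the carriageway.
          A neighbour joins the region unless the height difference to the
          current point exceeds the minimum curbstone height."""
          tree = cKDTree(points[:, :2])       # neighbourhood search in the x-y plane
          in_region = np.zeros(len(points), dtype=bool)
          in_region[seed_idx] = True
          queue = deque([seed_idx])
          while queue:
              i = queue.popleft()
              for j in tree.query_ball_point(points[i, :2], r=radius):
                  if not in_region[j] and abs(points[j, 2] - points[i, 2]) < MIN_CURB_HEIGHT:
                      in_region[j] = True
                      queue.append(j)
          return np.where(in_region)[0]

      # Hypothetical laser point cloud (x, y, z) and a seed chosen on the road.
      cloud = np.random.rand(10000, 3) * np.array([50.0, 50.0, 0.02])
      road_point_indices = grow_road_surface(cloud, seed_idx=0)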

  5. Automatic image processing solutions for MRI-guided minimally invasive intervention planning

    NARCIS (Netherlands)

    Noorda, YH

    2016-01-01

    In this thesis, automatic image processing methods are discussed for the purpose of improving treatment planning of MRI-guided minimally invasive interventions. Specifically, the following topics are addressed: rib detection in MRI, liver motion modeling in MRI and MR-CT registration of planning

  6. The Development of Automatic and Controlled Inhibitory Retrieval Processes in True and False Recall

    Science.gov (United States)

    Knott, Lauren M.; Howe, Mark L.; Wimmer, Marina C.; Dewhurst, Stephen A.

    2011-01-01

    In three experiments, we investigated the role of automatic and controlled inhibitory retrieval processes in true and false memory development in children and adults. Experiment 1 incorporated a directed forgetting task to examine controlled retrieval inhibition. Experiments 2 and 3 used a part-set cue and retrieval practice task to examine…

  7. SYSTEM OF THE AUTOMATIC DOCUMENT PROCESSING IN SHELL OF BASE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Valeriy Yu. Bykov

    2010-09-01

    Full Text Available The article presents an automatic document processing system created with Microsoft Access, a component of Microsoft Office. The system can be used at schools and educational institutions with a single computer or with a system of 15 computers connected in a local network.

  8. Children's and Adults' Automatic Processing of Proportion in a Stroop-Like Task

    Science.gov (United States)

    Yang, Ying; Hu, Qingfen; Wu, Di; Yang, Shuqi

    2015-01-01

    This current study examined human children's and adults' automatic processing of proportion using a Stroop-like paradigm. Preschool children and university students compared the areas of two sectors that varied not only in absolute areas but also in the proportions they occupied in their original rounds. A congruity effect was found in both age…

  9. Automatic Synthesis of Panoramic Radiographs from Dental Cone Beam Computed Tomography Data.

    Directory of Open Access Journals (Sweden)

    Ting Luo

    Full Text Available In this paper, we propose an automatic method of synthesizing panoramic radiographs from dental cone beam computed tomography (CBCT) data for directly observing the whole dentition without the superimposition of other structures. This method consists of three major steps. First, the dental arch curve is generated from the maximum intensity projection (MIP) of the 3D CBCT data. Then, based on this curve, the long axial curves of the upper and lower teeth are extracted to create a 3D panoramic curved surface describing the whole dentition. Finally, the panoramic radiograph is synthesized by developing this 3D surface. Both open-bite shaped and closed-bite shaped dental CBCT datasets were applied in this study, and the resulting images were analyzed to evaluate the effectiveness of this method. With the proposed method, a single-slice panoramic radiograph can clearly and completely show the whole dentition without the blur and superimposition of other dental structures. Moreover, thickened panoramic radiographs can also be synthesized with increased slice thickness to show more features, such as the mandibular nerve canal. One feature of the proposed method is that it is automatically performed without human intervention. Another feature of the proposed method is that it requires thinner panoramic radiographs to show the whole dentition than those produced by other existing methods, which contributes to the clarity of the anatomical structures, including the enamel, dentine and pulp. In addition, this method can rapidly process common dental CBCT data. The speed and image quality of this method make it an attractive option for observing the whole dentition in a clinical setting.
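
    The first step above, generating the arch curve from a maximum intensity projection, is easy to illustrate. The sketch below assumes a CBCT volume already loaded as a numpy array indexed (axial slice, row, column); the threshold percentile and the parabolic curve fit are simplifications for illustration only.

      import numpy as np

      # Hypothetical CBCT volume, indexed as (axial slice, row, column).
      volume = np.random.rand(300, 400, 400)

      # Maximum intensity projection along the axial direction: for each (row,
      # column) position keep the brightest voxel, so the dense dental arch
      # stands out and its curve can be traced in 2D.
      mip = volume.max(axis=0)

      # Rough arch-curve estimate: keep the brightest pixels and fit a parabola
      # y = a*x**2 + b*x + c through them.
      ys, xs = np.nonzero(mip > np.percentile(mip, 99))
      arch_coeffs = np.polyfit(xs, ys, deg=2)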

  10. MOPITT Near Real-Time Data for LANCE: Automatic Quality Assurance and Comparison to Operational Products

    Science.gov (United States)

    Martinez-Alonso, S.; Deeter, M. N.; Worden, H. M.; Ziskin, D.

    2017-12-01

    Terra-MOPITT (the Measurements of Pollution in the Troposphere instrument) near real-time (NRT) carbon monoxide (CO) products have been selected for distribution through NASA's LANCE (the Land, Atmosphere Near Real-Time Capability for EOS). MOPITT version 7 NRT data will be made publicly available within 3 hours from observation. The retrieval process is the same for both MOPITT NRT and operational products, albeit for the former it is constrained to use ancillary data available within the latency time. Among other requirements, LANCE NRT products must be examined for quality assurance (QA) purposes and relative errors between NRT and operational products must be quantified. Here we present an algorithm for automatic MOPITT NRT QA aimed to identify artifacts and separate those from anomalously high but real CO values. The algorithm is based on a comparison to the statistics of MOPITT operational products. We discuss the algorithm's performance when tested by applying it to three MOPITT datasets: a known (and corrected) artifact in version 4 operational data, anomalously high CO values in operational data during the 2015 Indonesia fires, and actual NRT data. Last, we describe results from a quantitative comparison between MOPITT NRT data and their operational counterparts.

  11. Development of Automatic Visceral Fat Volume Calculation Software for CT Volume Data

    Directory of Open Access Journals (Sweden)

    Mitsutaka Nemoto

    2014-01-01

    Full Text Available Objective. To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. Methods. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Results. Automatic visceral fat volume calculation results of all 24 data sets were obtained successfully and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. Conclusions. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was proved to have high accuracy.
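
    A minimal sketch of the thresholding part of such a calculation is shown below. The Hounsfield-unit window is a commonly quoted adipose-tissue range, not necessarily the one used by the authors, and the intra-abdominal mask, which the paper derives automatically by morphological analysis, is taken here as given.

      import numpy as np

      FAT_HU_RANGE = (-190, -30)   # commonly used adipose-tissue window (assumption)

      def visceral_fat_volume_ml(ct_hu: np.ndarray, visceral_mask: np.ndarray,
                                 voxel_spacing_mm=(1.0, 1.0, 1.0)) -> float:
          """Visceral fat volume in millilitres from a CT volume in Hounsfield units."""
          fat = (ct_hu >= FAT_HU_RANGE[0]) & (ct_hu <= FAT_HU_RANGE[1]) & visceral_mask
          voxel_ml = float(np.prod(voxel_spacing_mm)) / 1000.0   # mm^3 -> ml
          return fat.sum() * voxel_ml

      # Illustrative call with synthetic data.
      hu = np.random.randint(-1000, 1000, size=(50, 256, 256))
      mask = np.zeros_like(hu, dtype=bool)
      mask[:, 64:192, 64:192] = True
      print(round(visceral_fat_volume_ml(hu, mask, (2.5, 0.8, 0.8)), 1), "ml")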

  12. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from FAERS, Yellow Cards and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Out of these, 6% of drug label changes were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well-placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
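
    The abstract does not name the statistical method used for signal detection; a standard choice in pharmacovigilance is a disproportionality measure such as the proportional reporting ratio (PRR), sketched below from a 2x2 table of report counts. The counts and the decision threshold are illustrative only.

      def proportional_reporting_ratio(a: int, b: int, c: int, d: int) -> float:
          """PRR from a 2x2 table of spontaneous reports:
             a = reports of the event for the drug of interest
             b = reports of other events for the drug of interest
             c = reports of the event for all other drugs
             d = reports of other events for all other drugs"""
          return (a / (a + b)) / (c / (c + d))

      # Illustrative counts, e.g. pooled from FAERS plus text-mined case reports.
      a, b, c, d = 12, 388, 150, 45000
      prr = proportional_reporting_ratio(a, b, c, d)
      # A common rule of thumb flags a signal when PRR >= 2 with at least 3 reports.
      signal = prr >= 2.0 and a >= 3
      print(round(prr, 2), signal)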

  13. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    First and second order Rayleigh and Raman scatter is a common problem when fitting Parallel Factor Analysis (PARAFAC) to fluorescence excitation-emission data (EEM). The scatter does not contain any relevant chemical information and does not conform to the low-rank trilinear model. The scatter...

  14. Reduction to spark coordinates of data generated by automatic measurement of spark chamber film

    International Nuclear Information System (INIS)

    Maybury, R.; Hart, J.C.

    1976-09-01

    The initial stage in the data reduction for film from two spark chamber experiments is described. The film was automatically measured at the Rutherford Laboratory. The data from these measurements were reduced to a series of spark coordinates for each gap of the spark chambers. Quality control checks are discussed. (author)

  15. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  16. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple-single subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  17. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  18. Automatic parallelization of nested loop programs for non-manifest real-time stream processing applications

    OpenAIRE

    Bijlsma, T.

    2011-01-01

    This thesis is concerned with the automatic parallelization of real-time stream processing applications, such that they can be executed on embedded multiprocessor systems. Stream processing applications can be found in the video and channel decoding domain. These applications often have temporal requirements and can contain non-manifest conditions and expressions. For non-manifest conditions and expressions the results cannot be evaluated at compile time. Current parallelization approaches ha...

  19. Parallelization and automatic data distribution for nuclear reactor simulations

    Energy Technology Data Exchange (ETDEWEB)

    Liebrock, L.M. [Liebrock-Hicks Research, Calumet, MI (United States)]

    1997-07-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine can not run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed of light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed.

  20. Parallelization and automatic data distribution for nuclear reactor simulations

    International Nuclear Information System (INIS)

    Liebrock, L.M.

    1997-01-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine can not run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed of light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed

  1. Automatic car driving detection using raw accelerometry data.

    Science.gov (United States)

    Strączkiewicz, M; Urbanek, J K; Fadel, W F; Crainiceanu, C M; Harezlak, J

    2016-09-21

    Measuring physical activity using wearable devices has become increasingly popular. Raw data collected from such devices is usually summarized as 'activity counts', which combine information of human activity with environmental vibrations. Driving is a major sedentary activity that artificially increases the activity counts due to various car and body vibrations that are not connected to human movement. Thus, it has become increasingly important to identify periods of driving and quantify the bias induced by driving in activity counts. To address these problems, we propose a detection algorithm of driving via accelerometry (DADA), designed to detect time periods when an individual is driving a car. DADA is based on detection of vibrations generated by a moving vehicle and recorded by an accelerometer. The methodological approach is based on short-time Fourier transform (STFT) applied to the raw accelerometry data and identifies and focuses on frequency vibration ranges that are specific to car driving. We test the performance of DADA on data collected using wrist-worn ActiGraph devices in a controlled experiment conducted on 24 subjects. The median area under the receiver-operating characteristic curve (AUC) for predicting driving periods was 0.94, indicating an excellent performance of the algorithm. We also quantify the size of the bias induced by driving and obtain that per unit of time the activity counts generated by driving are, on average, 16% of the average activity counts generated during walking.
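
    A bare-bones version of the frequency-domain step can be sketched with scipy. The sampling rate, window length, driving-specific frequency band and power threshold below are assumptions for illustration; the paper derives its band and decision rule from the controlled experiment.

      import numpy as np
      from scipy.signal import stft

      FS = 100.0                   # accelerometer sampling rate in Hz (assumed)
      DRIVING_BAND = (15.0, 35.0)  # vibration band attributed to driving (assumed)

      def detect_driving(acc_xyz: np.ndarray, threshold: float = 1e-3) -> np.ndarray:
          """One boolean per STFT window: True where band power suggests driving."""
          magnitude = np.linalg.norm(acc_xyz, axis=1)           # combine the three axes
          f, t, Z = stft(magnitude - magnitude.mean(), fs=FS, nperseg=int(10 * FS))
          band = (f >= DRIVING_BAND[0]) & (f <= DRIVING_BAND[1])
          band_power = (np.abs(Z[band, :]) ** 2).mean(axis=0)   # mean power per window
          return band_power > threshold

      # Hypothetical 10 minutes of raw wrist accelerometry (x, y, z).
      acc = np.random.randn(int(600 * FS), 3) * 0.01
      driving_windows = detect_driving(acc)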

  2. Maritime over the Horizon Sensor Integration: High Frequency Surface-Wave-Radar and Automatic Identification System Data Integration Algorithm.

    Science.gov (United States)

    Nikolic, Dejan; Stojkovic, Nikola; Lekic, Nikola

    2018-04-09

    Obtaining the complete operational picture of the maritime situation in the Exclusive Economic Zone (EEZ), which lies over the horizon (OTH), requires the integration of data obtained from various sensors. These sensors include: high frequency surface-wave-radar (HFSWR), satellite automatic identification system (SAIS) and land automatic identification system (LAIS). The algorithm proposed in this paper utilizes radar tracks obtained from the network of HFSWRs, already processed by a multi-target tracking algorithm, and associates SAIS and LAIS data to the corresponding radar tracks, thus forming an integrated data pair. During the integration process, all HFSWR targets in the vicinity of AIS data are evaluated and the one which has the highest matching factor is used for data association. On the other hand, if there are multiple AIS reports in the vicinity of a single HFSWR track, the algorithm still makes only one data pair, which consists of the AIS and HFSWR data with the highest mutual matching factor. During the design and testing, special attention is given to the latency of AIS data, which can be very high in the EEZs of developing countries. The algorithm is designed, implemented and tested in a real working environment. The testing environment is located in the Gulf of Guinea and includes a network of two HFSWRs, several coastal sites with LAIS receivers and SAIS data supplied by an SAIS data provider.
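
    The exact form of the matching factor is not given in the abstract. The sketch below uses a simple heuristic that rewards small position, speed and course differences between a radar track and an AIS report, and keeps only the highest-scoring pair per report; the scale constants and field names are assumptions.

      import math

      def matching_factor(track, ais, d0=5000.0, v0=5.0, c0=30.0):
          """Heuristic closeness of an HFSWR track and an AIS report.
          Each input is a dict with 'x', 'y' in metres, 'speed' in m/s and
          'course' in degrees; d0, v0, c0 are assumed tolerance scales."""
          d = math.hypot(track["x"] - ais["x"], track["y"] - ais["y"])
          dv = abs(track["speed"] - ais["speed"])
          dc = abs((track["course"] - ais["course"] + 180.0) % 360.0 - 180.0)
          return math.exp(-d / d0) * math.exp(-dv / v0) * math.exp(-dc / c0)

      def associate(tracks, ais_reports, min_factor=0.1):
          """Pair each AIS report with the radar track of highest matching factor,
          producing at most one integrated data pair per report."""
          pairs = []
          for ais in ais_reports:
              best = max(tracks, key=lambda t: matching_factor(t, ais))
              factor = matching_factor(best, ais)
              if factor >= min_factor:
                  pairs.append((best, ais, factor))
          return pairs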

  3. An algorithm for discovering Lagrangians automatically from data

    Directory of Open Access Journals (Sweden)

    Daniel J.A. Hills

    2015-11-01

    Full Text Available An activity fundamental to science is building mathematical models. These models are used to both predict the results of future experiments and gain insight into the structure of the system under study. We present an algorithm that automates the model building process in a scientifically principled way. The algorithm can take observed trajectories from a wide variety of mechanical systems and, without any other prior knowledge or tuning of parameters, predict the future evolution of the system. It does this by applying the principle of least action and searching for the simplest Lagrangian that describes the system’s behaviour. By generating this Lagrangian in a human interpretable form, it can also provide insight into the workings of the system.

  4. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    Moraes M.F.D.

    1997-01-01

    Full Text Available We describe a low-cost, high quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving for other purposes, is used to record the circadian activity rhythm pattern of rats with time in an automated computerized fashion using minimal cost computer equipment (IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC which records the time (hours-minutes-seconds) when the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as Fast Fourier Transform (FFT) or cosinor. In order to demonstrate method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). Results show a biological validation of the method since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.

  5. Necessary Processing of Personal Data

    DEFF Research Database (Denmark)

    Tranberg, Charlotte Bagger

    2006-01-01

    The Data Protection Directive prohibits processing of sensitive data (racial or ethnic origin, political, religious or philosophical convictions, trade union membership and information on health and sex life). All other personal data may be processed, provided processing is deemed necessary...... Handelsgesellschaft. The aim of this article is to clarify the necessity requirement of the Data Protection Directive in terms of the general principle of proportionality. The usefulness of the principle of proportionality as the standard by which processing of personal data may be weighed is illustrated by the Peck...

  6. Preclinical Biokinetic Modelling of Tc-99m Radiopharmaceuticals Obtained from Semi-Automatic Image Processing.

    Science.gov (United States)

    Cornejo-Aragón, Luz G; Santos-Cuevas, Clara L; Ocampo-García, Blanca E; Chairez-Oria, Isaac; Diaz-Nieto, Lorenza; García-Quiroz, Janice

    2017-01-01

    The aim of this study was to develop a semi-automatic image processing algorithm (AIPA) based on the simultaneous information provided by X-ray and radioisotopic images to determine the biokinetic models of Tc-99m radiopharmaceuticals from quantification of image radiation activity in murine models. These radioisotopic images were obtained by a CCD (charge-coupled device) camera coupled to an ultrathin phosphor screen in a preclinical multimodal imaging system (Xtreme, Bruker). The AIPA consisted of different image processing methods for background, scattering and attenuation correction in the activity quantification. A set of parametric identification algorithms was used to obtain the biokinetic models that characterize the interaction between different tissues and the radiopharmaceuticals considered in the study. The set of biokinetic models corresponded to the Tc-99m biodistribution observed in different ex vivo studies. This fact confirmed the contribution of the semi-automatic image processing technique developed in this study.

  7. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Full Text Available Contemporary organizations face big data security challenges in the cyber environment due to modern threats and a business working model that relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Current data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to address the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system can learn from previous experience, thus improving itself and reducing the risk of wrong data classification.

  8. Cognitive regulation of smoking behavior within a cigarette: Automatic and nonautomatic processes.

    Science.gov (United States)

    Motschman, Courtney A; Tiffany, Stephen T

    2016-06-01

    There has been limited research on cognitive processes governing smoking behavior in individuals who are tobacco dependent. In a replication (Baxter & Hinson, 2001) and extension, this study examined the theory (Tiffany, 1990) that drug use may be controlled by automatic processes that develop over repeated use. Heavy and occasional cigarette smokers completed a button-press reaction time (RT) task while concurrently smoking a cigarette, pretending to smoke a lit cigarette, or not smoking. Slowed RT during the button-press task indexed the cognitive disruption associated with nonautomatic control of behavior. Occasional smokers' RTs were slowed when smoking or pretending to smoke compared with when not smoking. Heavy smokers' RTs were slowed when pretending to smoke versus not smoking; however, their RTs were similarly fast when smoking compared with not smoking. The results indicated that smoking behavior was more highly regulated by controlled, nonautomatic processes among occasional smokers and by automatic processes among heavy smokers. Patterns of RT across the interpuff interval indicated that occasional smokers were significantly slowed in anticipation of and immediately after puffing onset, whereas heavy smokers were only slowed significantly after puffing onset. These findings suggest that the entirety of the smoking sequence becomes automatized, with the behaviors leading up to puffing becoming more strongly regulated by automatic processes with experience. These results have relevance to theories on the cognitive regulation of cigarette smoking and support the importance of interventions that focus on routinized behaviors that individuals engage in during and leading up to drug use. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Combining MEDLINE and publisher data to create parallel corpora for the automatic translation of biomedical text.

    Science.gov (United States)

    Yepes, Antonio Jimeno; Prieur-Gaston, Elise; Névéol, Aurélie

    2013-04-30

    Most of the institutional and research information in the biomedical domain is available in the form of English text. Even in countries where English is an official language, such as the United States, language can be a barrier for accessing biomedical information for non-native speakers. Recent progress in machine translation suggests that this technique could help make English texts accessible to speakers of other languages. However, the lack of adequate specialized corpora needed to train statistical models currently limits the quality of automatic translations in the biomedical domain. We show how a large-sized parallel corpus can automatically be obtained for the biomedical domain, using the MEDLINE database. The corpus generated in this work comprises article titles obtained from MEDLINE and abstract text automatically retrieved from journal websites, which substantially extends the corpora used in previous work. After assessing the quality of the corpus for two language pairs (English/French and English/Spanish) we use the Moses package to train a statistical machine translation model that outperforms previous models for automatic translation of biomedical text. We have built translation data sets in the biomedical domain that can easily be extended to other languages available in MEDLINE. These sets can successfully be applied to train statistical machine translation models. While further progress should be made by incorporating out-of-domain corpora and domain-specific lexicons, we believe that this work improves the automatic translation of biomedical texts.

  10. Automatic detection of alpine rockslides in continuous seismic data using hidden Markov models

    Science.gov (United States)

    Dammeier, Franziska; Moore, Jeffrey R.; Hammer, Conny; Haslinger, Florian; Loew, Simon

    2016-02-01

    Data from continuously recording permanent seismic networks can contain information about rockslide occurrence and timing complementary to eyewitness observations and thus aid in construction of robust event catalogs. However, detecting infrequent rockslide signals within large volumes of continuous seismic waveform data remains challenging and often requires demanding manual intervention. We adapted an automatic classification method using hidden Markov models to detect rockslide signals in seismic data from two stations in central Switzerland. We first processed 21 known rockslides, with event volumes spanning 3 orders of magnitude and station event distances varying by 1 order of magnitude, which resulted in 13 and 19 successfully classified events at the two stations. Retraining the models to incorporate seismic noise from the day of the event improved the respective results to 16 and 19 successful classifications. The missed events generally had low signal-to-noise ratio and small to medium volumes. We then processed nearly 14 years of continuous seismic data from the same two stations to detect previously unknown events. After postprocessing, we classified 30 new events as rockslides, of which we could verify three through independent observation. In particular, the largest new event, with estimated volume of 500,000 m3, was not generally known within the Swiss landslide community, highlighting the importance of regional seismic data analysis even in densely populated mountainous regions. Our method can be easily implemented as part of existing earthquake monitoring systems, and with an average event detection rate of about two per month, manual verification would not significantly increase operational workload.
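    As a rough illustration of the classification idea described above, the sketch below trains a Gaussian hidden Markov model on spectral features of known event waveforms and then scores sliding windows of continuous data, flagging high-likelihood segments. The features, window lengths, threshold, and use of the hmmlearn package are illustrative assumptions; the published method additionally models the background noise and is considerably more elaborate.

```python
# Sketch of HMM-based event detection, assuming the hmmlearn package is available.
# Features, window lengths and the decision threshold are illustrative choices only.
import numpy as np
from hmmlearn import hmm

def spectral_features(trace: np.ndarray, win: int = 256) -> np.ndarray:
    """Log energies in 8 coarse frequency bands per window -> (n_frames, 8) matrix."""
    frames = trace[: len(trace) // win * win].reshape(-1, win)
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    bands = spec[:, 1:129].reshape(spec.shape[0], 8, 16).sum(axis=2)
    return np.log(bands + 1e-12)

rng = np.random.default_rng(0)
# Train on features of known event waveforms (synthetic placeholders here).
training = [spectral_features(rng.standard_normal(60_000)) for _ in range(3)]
model = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
model.fit(np.vstack(training), lengths=[len(t) for t in training])

# Scan continuous data with a sliding window and flag high-likelihood segments.
continuous = rng.standard_normal(600_000)
step, span = 30_000, 60_000
for start in range(0, len(continuous) - span, step):
    frames = spectral_features(continuous[start:start + span])
    score = model.score(frames) / len(frames)       # per-frame log likelihood
    if score > -20.0:                               # illustrative threshold
        print(f"candidate event near sample {start}, score {score:.1f}")
```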

  11. Evaluation of missing data techniques for in-car automatic speech recognition

    OpenAIRE

    Wang, Y.; Vuerinckx, R.; Gemmeke, J.F.; Cranen, B.; Hamme, H. Van

    2009-01-01

    Wang Y., Vuerinckx R., Gemmeke J., Cranen B., Van hamme H., ''Evaluation of missing data techniques for in-car automatic speech recognition'', Proceedings NAG/DAGA 2009 - international conference on acoustics, 4 pp., March 23-26, 2009, Rotterdam, The Netherlands.

  12. Energy balance of a glacier surface: analysis of Automatic Weather Station data from the Morteratschgletscher, Switzerland

    NARCIS (Netherlands)

    Oerlemans, J.; Klok, E.J.

    2002-01-01

    We describe and analyze a complete 1-yr data set from an automatic weather station (AWS) located on the snout of the Morteratschgletscher, Switzerland. The AWS stands freely on the glacier surface and measures pressure, windspeed, wind direction, air temperature and humidity, incoming and

  13. Automatic data acquisition system of environmental radiation monitor with a personal computer

    International Nuclear Information System (INIS)

    Ohkubo, Tohru; Nakamura, Takashi.

    1984-05-01

    The automatic data acquisition system of an environmental radiation monitor was developed at low cost using a PET personal computer. The count pulses from eight monitors installed at four site boundaries were transmitted to a radiation control room by a signal transmission device and analyzed by the computer via a 12-channel scaler and a PET-CAMAC interface for graphic display and printing. (author)

  14. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  15. Semi-Automatic Detection of Indigenous Settlement Features on Hispaniola through Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Till F. Sonnemann

    2017-12-01

    Full Text Available Satellite imagery has had limited application in the analysis of pre-colonial settlement archaeology in the Caribbean; visible evidence of wooden structures perishes quickly in tropical climates. Only slight topographic modifications remain, typically associated with middens. Nonetheless, surface scatters, as well as the soil characteristics they produce, can serve as quantifiable indicators of an archaeological site, detectable by analyzing remote sensing imagery. A variety of pre-processed, very diverse data sets went through a process of image registration, with the intention of combining multispectral bands to feed two different semi-automatic direct detection algorithms: a posterior probability approach and a frequentist approach. Two 5 × 5 km² areas in the northwestern Dominican Republic with diverse environments, sufficient imagery coverage, and a representative number of known indigenous site locations each served for one approach. Buffers around the locations of known sites, as well as areas with no likely archaeological evidence, were used as samples. The resulting maps offer quantifiable statistical outcomes for locations with pixel value combinations similar to those of the identified sites, indicating a higher probability of archaeological evidence. These trials remain experimental and largely unvalidated, as they have not yet been ground-truthed, but they show the variable potential of this method in diverse environments.

  16. Choice architectural nudge interventions to promote vegetable consumption based on automatic processes decision-making

    DEFF Research Database (Denmark)

    Skov, Laurits Rohden; Friis Rasmussen, Rasmus; Møller Andersen, Pernille

    2014-01-01

    Objective: To test the effectiveness of three types of choice architectural nudges to promote vegetable consumption among Danish people. The experiment aims at providing evidence on the influence of the automatic processing system in the food choice situation at an all-you-can-eat buffet serving. [...] but promoted health by decreasing total energy intake, which suggests that visual variety of fruit and greens prompts a healthy-eater subconscious behaviour.

  17. Automatic Ability Attribution after Failure: A Dual Process View of Achievement Attribution

    OpenAIRE

    Sakaki, Michiko; Murayama, Kou

    2013-01-01

    Causal attribution has been one of the most influential frameworks in the literature of achievement motivation, but previous studies considered achievement attribution as relatively deliberate and effortful processes. In the current study, we tested the hypothesis that people automatically attribute their achievement failure to their ability, but reduce the ability attribution in a controlled manner. To address this hypothesis, we measured participants’ causal attribution belief for their tas...

  18. Digital Data Processing of Images

    African Journals Online (AJOL)

    Digital data processing was investigated to perform image processing. Image smoothing and restoration were explored and promising results obtained. The use of the computer, not only as a data management device, but as an important tool to render quantitative information, was illustrated by lung function determination.

  19. Multibeam sonar backscatter data processing

    Science.gov (United States)

    Schimel, Alexandre C. G.; Beaudoin, Jonathan; Parnum, Iain M.; Le Bas, Tim; Schmidt, Val; Keith, Gordon; Ierodiaconou, Daniel

    2018-01-01

    Multibeam sonar systems now routinely record seafloor backscatter data, which are processed into backscatter mosaics and angular responses, both of which can assist in identifying seafloor types and morphology. Those data products are obtained from the multibeam sonar raw data files through a sequence of data processing stages that follows a basic plan, but the implementation of which varies greatly between sonar systems and software. In this article, we provide a comprehensive review of this backscatter data processing chain, with a focus on the variability in the possible implementation of each processing stage. Our objective for undertaking this task is twofold: (1) to provide an overview of backscatter data processing for the consideration of the general user and (2) to provide suggestions to multibeam sonar manufacturers, software providers and the operators of these systems and software for eventually reducing the lack of control, uncertainty and variability associated with current data processing implementations and the resulting backscatter data products. One such suggestion is the adoption of a nomenclature for increasingly refined levels of processing, akin to the nomenclature adopted for satellite remote-sensing data deliverables.

  20. Graphic data output system for the automatic control systems of heavy ion accelerator

    International Nuclear Information System (INIS)

    Angelov, A.Kh.; Inkin, V.D.; Lysyakov, V.N.

    1978-01-01

    The principles and specific solutions underlying the task of expanding the automatic control system of a heavy ion accelerator (HIA ACS) are described. The HIA ACS structure is based on the possibility of switching a second TPA/1 computer into the system. The HIA ACS is realized in the form of two coupled systems with distinct functional specializations: a system for data acquisition and processing (system A) and a system for data display (system B). System B makes it possible to considerably increase the degree of controllability of the entire ACS by an operator. Based on the obtained information, the operator can preset a controlling action in system A. A large set of external devices permits the operator to perform single measurements of the value of a selected parameter, set new values for the selected parameter, and plot functional dependences such as Z=F(x) and Z=F(x,y). A raster-type alphanumeric memory display, a teletype and a plotter serve for the visual presentation of information. An indicator-setting terminal acts simultaneously as an indicator and a control unit

  1. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J., E-mail: jesus.vega@ciemat.e [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Murari, A. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Ratta, G.A.; Gonzalez, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Dormido-Canto, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Dpto. Informatica y Automatica, UNED, Madrid (Spain)

    2010-07-15

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  2. Automatic Classification of Extensive Aftershock Sequences Using Empirical Matched Field Processing

    Science.gov (United States)

    Gibbons, Steven J.; Harris, David B.; Kværna, Tormod; Dodge, Douglas A.

    2013-04-01

    The aftershock sequences that follow large earthquakes create considerable problems for data centers attempting to produce comprehensive event bulletins in near real-time. The greatly increased number of events which require processing can overwhelm analyst resources and reduce the capacity for analyzing events of monitoring interest. This exacerbates a potentially reduced detection capability at key stations, due to the noise generated by the sequence, and a deterioration in the quality of the fully automatic preliminary event bulletins caused by the difficulty in associating the vast numbers of closely spaced arrivals over the network. Considerable success has been enjoyed by waveform correlation methods for the automatic identification of groups of events belonging to the same geographical source region, facilitating the more time-efficient analysis of event ensembles as opposed to individual events. There are, however, formidable challenges associated with the automation of correlation procedures. The signal generated by a very large earthquake seldom correlates well enough with the signals generated by far smaller aftershocks for a correlation detector to produce statistically significant triggers at the correct times. Correlation between events within clusters of aftershocks is significantly better, although the issues of when and how to initiate new pattern detectors are still being investigated. Empirical Matched Field Processing (EMFP) is a highly promising method for detecting event waveforms suitable as templates for correlation detectors. EMFP is a quasi-frequency-domain technique that calibrates the spatial structure of a wavefront crossing a seismic array in a collection of narrow frequency bands. The amplitude and phase weights that result are applied in a frequency-domain beamforming operation that compensates for scattering and refraction effects not properly modeled by plane-wave beams. It has been demonstrated to outperform waveform correlation as a
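    The calibration-and-beamforming idea described above can be sketched as follows: per-channel, per-frequency weights are derived from a reference event and then applied as a frequency-domain beamformer to new array segments. The normalization and detection statistic below are simplified assumptions for illustration, not the published implementation.

```python
# Sketch of the empirical matched field idea: per-channel, per-frequency weights are
# calibrated from a reference event and applied as a frequency-domain beamformer.
# Normalization and the detection statistic are simplified illustrative choices.
import numpy as np

def channel_spectra(array_data: np.ndarray) -> np.ndarray:
    """FFT of each channel: (n_channels, n_samples) -> (n_channels, n_freqs)."""
    return np.fft.rfft(array_data, axis=1)

def calibrate_weights(reference_event: np.ndarray, bands: slice) -> np.ndarray:
    """Empirical steering weights: conjugated, unit-magnitude spectra of the reference."""
    spectra = channel_spectra(reference_event)[:, bands]
    return np.conj(spectra) / (np.abs(spectra) + 1e-12)

def emfp_statistic(segment: np.ndarray, weights: np.ndarray, bands: slice) -> float:
    """Coherent beam power over channels in each band, normalized and band-averaged."""
    spectra = channel_spectra(segment)[:, bands]
    beam = np.abs(np.sum(weights * spectra, axis=0)) ** 2
    incoherent = np.sum(np.abs(spectra) ** 2, axis=0) + 1e-12
    return float(np.mean(beam / (incoherent * spectra.shape[0])))

rng = np.random.default_rng(1)
reference = rng.standard_normal((9, 4096))                 # 9-element array, reference event
weights = calibrate_weights(reference, slice(20, 200))
print(emfp_statistic(reference, weights, slice(20, 200)))  # high for the calibrating event
print(emfp_statistic(rng.standard_normal((9, 4096)), weights, slice(20, 200)))  # lower for noise
```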

  3. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in., 0.00254 mm) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the application of a process acceptance technique (PAT) to Swiss automatic screw machine processes. PAT is a statistical approach developed for the purpose of accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor a determination has been made of the conditions under which PAT can benefit a controlled process and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation

  4. NMRFx Processor: a cross-platform NMR data processing program

    International Nuclear Information System (INIS)

    Norris, Michael; Fetler, Bayard; Marchant, Jan; Johnson, Bruce A.

    2016-01-01

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.
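    For the non-uniformly sampled case mentioned above, iterative soft thresholding alternates between shrinking the spectrum and re-imposing the measured time-domain samples. The toy one-dimensional sketch below illustrates that class of algorithm; it is not the NMRFx implementation, and the threshold schedule is an illustrative choice.

```python
# Toy sketch of iterative soft thresholding (IST) for a non-uniformly sampled signal.
# This is a 1-D illustration of the algorithm class, not the NMRFx implementation.
import numpy as np

def ist_reconstruct(measured: np.ndarray, mask: np.ndarray, n_iter: int = 100) -> np.ndarray:
    """Fill in unsampled points of a sparse-spectrum signal by iterative soft thresholding."""
    estimate = np.where(mask, measured, 0.0 + 0.0j)
    thresh = np.max(np.abs(np.fft.fft(estimate)))
    for _ in range(n_iter):
        spectrum = np.fft.fft(estimate)
        mag = np.abs(spectrum)
        # soft threshold: shrink every spectral point toward zero by `thresh`
        spectrum = np.where(mag > thresh, (mag - thresh) / np.maximum(mag, 1e-12) * spectrum, 0)
        estimate = np.fft.ifft(spectrum)
        estimate[mask] = measured[mask]     # re-impose the measured samples
        thresh *= 0.9                       # relax the threshold each iteration
    return estimate

# Example: exactly sparse two-peak spectrum, 25% of time-domain points sampled.
n = 512
t = np.arange(n)
signal = np.exp(2j * np.pi * 56 * t / n) + 0.5 * np.exp(2j * np.pi * 118 * t / n)
rng = np.random.default_rng(2)
mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, size=n // 4, replace=False)] = True
recovered = ist_reconstruct(signal, mask)
print("relative error:", np.linalg.norm(recovered - signal) / np.linalg.norm(signal))
```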

  5. NMRFx Processor: a cross-platform NMR data processing program.

    Science.gov (United States)

    Norris, Michael; Fetler, Bayard; Marchant, Jan; Johnson, Bruce A

    2016-08-01

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  6. NMRFx Processor: a cross-platform NMR data processing program

    Energy Technology Data Exchange (ETDEWEB)

    Norris, Michael; Fetler, Bayard [One Moon Scientific, Inc. (United States); Marchant, Jan [University of Maryland Baltimore County, Howard Hughes Medical Institute (United States); Johnson, Bruce A., E-mail: bruce.johnson@asrc.cuny.edu [One Moon Scientific, Inc. (United States)

    2016-08-15

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  7. Automatic identification of non-reflective subsurface targets in radar sounder data based on morphological profile

    Science.gov (United States)

    Khodadadzadeh, Mahdi; Ilisei, Ana-Maria; Bruzzone, Lorenzo

    2017-10-01

    The amount of radar sounder data, which are used to analyze the subsurface of icy environments (e.g., Poles of Earth and Mars), is dramatically increasing from both airborne campaigns at the ice sheets and satellite missions on other planetary bodies. However, the main approach to the investigation of such data is by visual interpretation, which is subjective and time consuming. Moreover, the few available automatic techniques have been developed for analyzing highly reflective subsurface targets, e.g., ice layers, basal interface. Besides the highly reflective targets, glaciologists have also shown great interest in the analysis of non-reflective targets, such as the echo-free zone in ice sheets, and the reflection-free zone in the subsurface of the South Pole of Mars. However, in the literature, there is no dedicated automatic technique for the analysis of non-reflective targets. To address this limitation, we propose an automatic classification technique for the identification of non-reflective targets in radar sounder data. The method is made up of two steps, i.e., i) feature extraction, which is the core of the method, and ii) automatic classification of subsurface targets. We initially prove that the commonly employed features for the analysis of the radar signal (e.g., statistical and texture-based features) are ineffective for the identification of non-reflective targets. Thus, for feature extraction, we propose to exploit structural information based on the morphological closing profile. We show the effectiveness of such features in discriminating non-reflective targets from other ice subsurface targets. In the second step, a random forest classifier is used to perform the automatic classification. Our experimental results, conducted using two data sets from Central Antarctica and the South Pole of Mars, point out the effectiveness of the proposed technique for the accurate identification of non-reflective targets.
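    The two-step structure described above, a morphological closing profile as per-pixel features followed by a random forest, can be sketched as below. The radargram, labels, and window sizes are synthetic placeholders, not the authors' data or parameters.

```python
# Sketch of a morphological closing profile used as per-pixel features for a random
# forest. The radargram, labels and structuring-element sizes are synthetic placeholders.
import numpy as np
from scipy.ndimage import grey_closing
from sklearn.ensemble import RandomForestClassifier

def closing_profile(radargram: np.ndarray, sizes=(3, 7, 15, 31)) -> np.ndarray:
    """Stack of grey-level closings with growing structuring elements -> (H, W, n_sizes)."""
    return np.stack([grey_closing(radargram, size=s) for s in sizes], axis=-1)

rng = np.random.default_rng(3)
radargram = rng.gamma(shape=2.0, scale=1.0, size=(128, 256))   # fake echo-power image
radargram[80:, :] *= 0.05                                      # a "non-reflective" lower zone
labels = np.zeros(radargram.shape, dtype=int)
labels[80:, :] = 1                                             # 1 = non-reflective target

features = closing_profile(radargram).reshape(-1, 4)
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(features, labels.ravel())
predicted = clf.predict(features).reshape(radargram.shape)
print("training accuracy:", (predicted == labels).mean())
```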

  8. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
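    The interpreted execution model described above can be illustrated with a toy graph whose nodes hold processing functions and whose wires carry upstream outputs to downstream inputs; the executor simply visits nodes in dependency order. The sketch is in Python for brevity, whereas the original system is .NET-based, and all node names are illustrative.

```python
# Toy sketch of an interpreted process-graph executor: nodes connected by "wires"
# are run in dependency order, each consuming the outputs of its input nodes.
from graphlib import TopologicalSorter

class Node:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)

def run_graph(nodes):
    """Execute nodes in topological order, wiring each node's inputs to upstream outputs."""
    by_name = {n.name: n for n in nodes}
    order = TopologicalSorter({n.name: set(n.inputs) for n in nodes}).static_order()
    results = {}
    for name in order:
        node = by_name[name]
        results[name] = node.func(*(results[i] for i in node.inputs))
    return results

# Example graph: load -> filter -> (stats, export)
graph = [
    Node("load", lambda: [3.0, 4.0, 100.0, 5.0]),
    Node("filter", lambda xs: [x for x in xs if x < 50], inputs=["load"]),
    Node("stats", lambda xs: sum(xs) / len(xs), inputs=["filter"]),
    Node("export", lambda xs, mean: f"{len(xs)} points, mean={mean:.1f}",
         inputs=["filter", "stats"]),
]
print(run_graph(graph)["export"])
```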

  9. Automatic extraction of road features in urban environments using dense ALS data

    Science.gov (United States)

    Soilán, Mario; Truong-Hong, Linh; Riveiro, Belén; Laefer, Debra

    2018-02-01

    This paper describes a methodology that automatically extracts semantic information from urban ALS data for urban parameterization and road network definition. First, building façades are segmented from the ground surface by combining knowledge-based information with both voxel and raster data. Next, heuristic rules and unsupervised learning are applied to the ground surface data to distinguish sidewalk and pavement points as a means for curb detection. Then, radiometric information is employed for road marking extraction. Using high-density ALS data from Dublin, Ireland, this fully automatic workflow was able to generate an F-score close to 95% for pavement and sidewalk identification with a resolution of 20 cm and better than 80% for road marking detection.

  10. Wave data processing toolbox manual

    Science.gov (United States)

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive and further analysis of the data becomes cumbersome. More imperative is that these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox congregates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain freely-available software. Another important advantage is that a metadata
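    The congregation step described above, writing burst statistics and deployment metadata into a single self-describing NetCDF file, might look roughly like the sketch below. The variable names, units, and attributes are illustrative assumptions, and the example requires the netCDF4 Python package.

```python
# Sketch of writing burst wave statistics plus metadata into one NetCDF file, the
# kind of product described above. Variable names, units and attributes are hypothetical.
import numpy as np
from netCDF4 import Dataset

n_bursts = 48
time = np.arange(n_bursts, dtype="f8") * 3600.0                       # seconds since deployment start
hsig = np.abs(np.random.default_rng(4).normal(1.2, 0.3, n_bursts))    # significant wave height

with Dataset("wave_statistics.nc", "w", format="NETCDF4") as nc:
    nc.title = "Processed wave statistics (example)"
    nc.instrument = "ADCP (hypothetical deployment)"
    nc.createDimension("time", n_bursts)
    t = nc.createVariable("time", "f8", ("time",))
    t.units = "seconds since deployment start"
    t[:] = time
    h = nc.createVariable("significant_wave_height", "f4", ("time",))
    h.units = "m"
    h[:] = hsig
print("wrote wave_statistics.nc")
```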

  11. Process Mining Online Assessment Data

    Science.gov (United States)

    Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…

  12. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    Science.gov (United States)

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-12-01

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects in more distant pairs in both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  13. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic Differentiation is a relatively recent technique for the differentiation of functions that is applied directly to the source code that computes the function, written in standard programming languages. This technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The values of the derivatives obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
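    A minimal illustration of the principle described above, applying the calculus rules mechanically to code rather than to a formula, is forward-mode differentiation with dual numbers. The sketch below is generic and hypothetical; it is not one of the AD tools discussed in the article, and the model function is invented for illustration.

```python
# Minimal forward-mode automatic differentiation with dual numbers: the product and
# chain rules are applied to each operation in the code, giving exact (to roundoff)
# derivatives. Generic sketch, not the AD tools discussed in the article.
import math

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule applied mechanically to each multiplication in the code
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def dual_exp(x: "Dual") -> "Dual":
    return Dual(math.exp(x.value), math.exp(x.value) * x.deriv)   # chain rule

def reactor_rate(temperature):
    """Toy model function written as ordinary code: r = 2 * T * exp(0.01 * T)."""
    return 2.0 * temperature * dual_exp(0.01 * temperature)

x = Dual(300.0, 1.0)           # seed derivative d(temperature)/d(temperature) = 1
r = reactor_rate(x)
print(r.value, r.deriv)        # function value and exact derivative dr/dT
```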

  14. Automatic calculation of tree diameter from stereoscopic image pairs using digital image processing.

    Science.gov (United States)

    Yi, Faliu; Moon, Inkyu

    2012-06-20

    Automatic operations play an important role in societies by saving time and improving efficiency. In this paper, we apply the digital image processing method to the field of lumbering to automatically calculate tree diameters in order to reduce culler work and enable a third party to verify tree diameters. To calculate the cross-sectional diameter of a tree, the image was first segmented by the marker-controlled watershed transform algorithm based on the hue saturation intensity (HSI) color model. Then, the tree diameter was obtained by measuring the area of every isolated region in the segmented image. Finally, the true diameter was calculated by multiplying the diameter computed in the image and the scale, which was derived from the baseline and disparity of correspondence points from stereoscopic image pairs captured by rectified configuration cameras.
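    The final measurement step described above can be written out directly: the stereo baseline and disparity give the distance to the cut face and hence a metric scale, and the diameter follows from the segmented area under the assumption of a roughly circular cross-section. The camera parameters and measured quantities below are illustrative values only.

```python
# Sketch of the measurement step: a metric scale is derived from the stereo baseline
# and disparity, and the diameter follows from the segmented cross-section area
# assuming an approximately circular shape. All numbers are illustrative.
import math

focal_px = 1500.0        # focal length in pixels (rectified cameras)
baseline_m = 0.25        # distance between the two camera centres
disparity_px = 90.0      # disparity of corresponding points on the cross-section

depth_m = focal_px * baseline_m / disparity_px        # distance to the cut face
metres_per_pixel = depth_m / focal_px                 # scale at that distance

area_px = 23000.0                                     # pixels inside the segmented region
diameter_px = 2.0 * math.sqrt(area_px / math.pi)      # equivalent circular diameter
diameter_m = diameter_px * metres_per_pixel
print(f"estimated diameter: {100 * diameter_m:.1f} cm")
```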

  15. The automatic conservative: ideology-based attentional asymmetries in the processing of valenced information.

    Directory of Open Access Journals (Sweden)

    Luciana Carraro

    Full Text Available Research has widely explored the differences between conservatives and liberals, and it has also recently been demonstrated that conservatives display different reactions toward valenced stimuli. However, previous studies have not yet fully illuminated the cognitive underpinnings of these differences. In the current work, we argued that political ideology is related to selective attention processes, so that negative stimuli are more likely to automatically grab the attention of conservatives as compared to liberals. In Experiment 1, we demonstrated that negative (vs. positive) information impaired the performance of conservatives, more than liberals, in an Emotional Stroop Task. This finding was confirmed in Experiment 2 and in Experiment 3 employing a Dot-Probe Task, demonstrating that threatening stimuli were more likely to attract the attention of conservatives. Overall, the results support the conclusion that people embracing conservative views of the world display automatic selective attention for negative stimuli.

  16. ACTIV - a program for automatic processing of gamma-ray spectra

    International Nuclear Information System (INIS)

    Zlokazov, V.B.

    1982-01-01

    Program ACTIV is intended for precise analysis of γ-ray and X-ray spectra and allows the user to carry out the full cycle of automatic processing of a series of spectra, i.e. calibration, automatic peak search, determination of peak positions and areas, identification of the radioisotopes and the transformation of the areas found into masses of isotopes in the irradiated sample. ACTIV uses a complex mathematical technique and is oriented mainly to large computers, but using overlaid loading, it can also be run on small computers like the PDP 11/70. Compared with other similar programs, ACTIV has some advantages in the accuracy of peak shape description and in the reliability of the peak search and its least-squares analysis. The program can be used for the purpose of activation analysis. The program can analyze spectra with poor statistics and with broad and narrow peaks. (orig.)
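    The least-squares peak analysis mentioned above can be illustrated by fitting a Gaussian on a linear background and converting the fitted parameters to a peak area, as in the sketch below. ACTIV's actual peak-shape model, search logic, and overlapping-peak handling are more elaborate; this is only an illustration on synthetic counts.

```python
# Sketch of a least-squares peak fit: a Gaussian on a linear background fitted to
# synthetic counts, with the peak area computed from the fitted parameters.
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, amplitude, centre, sigma, bg0, bg1):
    return amplitude * np.exp(-0.5 * ((ch - centre) / sigma) ** 2) + bg0 + bg1 * ch

channels = np.arange(200.0)
rng = np.random.default_rng(5)
true = peak_model(channels, 800.0, 97.0, 3.5, 50.0, 0.2)
counts = rng.poisson(true).astype(float)

p0 = [counts.max() - counts.min(), channels[np.argmax(counts)], 3.0, counts.min(), 0.0]
popt, pcov = curve_fit(peak_model, channels, counts, p0=p0, sigma=np.sqrt(counts + 1.0))

amplitude, centre, sigma = popt[:3]
area = amplitude * abs(sigma) * np.sqrt(2.0 * np.pi)      # counts under the Gaussian
print(f"peak at channel {centre:.2f}, area {area:.0f} counts")
```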

  17. VACTIV: A graphical dialog based program for an automatic processing of line and band spectra

    Science.gov (United States)

    Zlokazov, V. B.

    2013-05-01

    and estimation of parameters of interest. VACTIV can run on any standard modern laptop. Reasons for the new version: At the time of its creation (1999) VACTIV was seemingly the first attempt to apply the newest programming languages and styles to systems of spectrum analysis. Its goal was to both get a convenient and efficient technique for data processing, and to elaborate the formalism of spectrum analysis in terms of classes, their properties, their methods and events of an object-oriented programming language. Summary of revisions: Compared with ACTIV, VACTIV preserves all the mathematical algorithms, but provides the user with all the benefits of an interface, based on a graphical dialog. It allows him to make a quick intervention in the work of the program; in particular, to carry out the on-line control of the fitting process: depending on the intermediate results and using the visual form of data representation, to change the conditions for the fitting and so achieve the optimum performance, selecting the optimum strategy. To find the best conditions for the fitting one can compress the spectrum, delete the blunders from it, smooth it using a high-frequency spline filter and build the background using a low-frequency spline filter; use not only automatic methods for the blunder deletion, the peak search, the peak model forming and the calibration, but also use manual mouse clicking on the spectrum graph. Restrictions: To enhance the reliability and portability of the program the majority of the most important arrays have a static allocation; all the arrays are allocated with a surplus, and the total pool of the program is restricted only by the size of the computer virtual memory. A spectrum has the static size of 32 K real words. The maximum size of the least-square matrix is 314 (the maximum number of fitted parameters per one analyzed spectrum interval, not for the whole spectrum), from which it follows that the maximum number of peaks in one spectrum

  18. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and adapt a method for the automatic determination (dosing) of neptunium to the industrial process of separation and purification of plutonium 238, taking information quality and economic aspects into account. After recalling some generalities on the production of plutonium 238 and on the plutonium-neptunium separation process, the author addresses the determination of neptunium. The adopted measurement technique is spectrophotometry (of neptunium and of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical analyzer and discusses the complex (stoichiometry, form) as well as some aspects of neptunium determination (redox reactions, process control) [fr

  19. Automatic segmentation of the wire frame of stent grafts from CT data.

    Science.gov (United States)

    Klein, Almar; van der Vliet, J Adam; Oostveen, Luuk J; Hoogeveen, Yvonne; Kool, Leo J Schultze; Renema, W Klaas Jan; Slump, Cornelis H

    2012-01-01

    Endovascular aortic repair (EVAR) is an established technique, which uses stent grafts to treat aortic aneurysms in patients at risk of aneurysm rupture. Late stent graft failure is a serious complication in endovascular repair of aortic aneurysms. Better understanding of the motion characteristics of stent grafts will be beneficial for designing future devices. In addition, analysis of stent graft movement in individual patients in vivo can be valuable for predicting stent graft failure in these patients. To be able to gather information on stent graft motion in a quick and robust fashion, we propose an automatic method to segment stent grafts from CT data, consisting of three steps: the detection of seed points, finding the connections between these points to produce a graph, and graph processing to obtain the final geometric model in the form of an undirected graph. Using annotated reference data, the method was optimized and its accuracy was evaluated. The experiments were performed using data containing the AneuRx and Zenith stent grafts. The algorithm is robust to noise and small variations in the used parameter values, does not require much memory by modern standards, and is fast enough to be used in a clinical setting (65 and 30 s for the two stent types, respectively). Further, it is shown that the resulting graphs have a 95% (AneuRx) and 92% (Zenith) correspondence with the annotated data. The geometric model produced by the algorithm allows incorporation of high-level information and material properties. This enables us to study the in vivo motions and forces that act on the frame of the stent. We believe that such studies will provide new insights into the behavior of the stent graft in vivo, enable the detection and prediction of stent failure in individual patients, and can help in designing better stent grafts in the future. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. High School Library Data Processing

    Directory of Open Access Journals (Sweden)

    Betty Flora

    1969-03-01

    Full Text Available The planning and operation of an automated high school library system that utilizes an IBM 1401 data processing system installed for teaching purposes are described. Book ordering, shelf listing and circulation have been computerized.

  1. Standardization of GPS data processing

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    A nationwide GPS network with about 60 permanent GPS stations has been constructed in Korea since the late 1990s. To use GPS in a variety of application areas, such as crustal deformation, positioning, or monitoring of the upper atmosphere, it is necessary to be able to process the data precisely. Precise processing is demanding because it requires understanding the characteristics of the parameters to be estimated, resolving the integer ambiguities, and analyzing many error sources; within Korea, the Korea Astronomy Observatory currently has this capability. There are three reliable GPS data processing software packages in the world: Bernese (University of Berne), GIPSY-OASIS (JPL), and GAMIT (MIT). These packages allow us to achieve millimeter accuracy in the horizontal position and about 1 cm accuracy vertically, even for regional networks with a diameter of several thousand kilometers. We established our standard GPS data processing procedure using Bernese as the main tool and GIPSY-OASIS as a secondary tool

  2. Gigabit Per Second Data Processing

    Data.gov (United States)

    National Aeronautics and Space Administration — Solve the existing problem of handling the on-board, real time, memory intensive processing of the Gb/s data stream of the scientific instrument. Examine and define...

  3. Automatic registration of imaging mass spectrometry data to the Allen Brain Atlas transcriptome

    Science.gov (United States)

    Abdelmoula, Walid M.; Carreira, Ricardo J.; Shyti, Reinald; Balluff, Benjamin; Tolner, Else; van den Maagdenberg, Arn M. J. M.; Lelieveldt, B. P. F.; McDonnell, Liam; Dijkstra, Jouke

    2014-03-01

    Imaging Mass Spectrometry (IMS) is an emerging molecular imaging technology that provides spatially resolved information on biomolecular structures; each image pixel effectively represents a molecular mass spectrum. By combining histological images and IMS images, neuroanatomical structures can be distinguished based on their biomolecular features as opposed to morphological features. The combination of IMS data with spatially resolved gene expression maps of the mouse brain, as provided by the Allen Mouse Brain Atlas, would enable comparative studies of spatial metabolic and gene expression patterns in life-sciences research and biomarker discovery. As such, it would be highly desirable to spatially register IMS slices to the Allen Brain Atlas (ABA). In this paper, we propose a multi-step automatic registration pipeline to register ABA histology to IMS images. A key novelty of the method is the selection of the best reference section from the ABA, based on pre-processed histology sections. First, we extract a hippocampus-specific geometrical feature from the given experimental histological section to initially localize it among the ABA sections. Then, feature-based linear registration is applied to the initially localized section and its two neighbors in the ABA to select the most similar reference section. A non-rigid registration yields a one-to-one mapping of the experimental IMS slice to the ABA. The pipeline was applied to 6 coronal sections from two mouse brains, showing high anatomical correspondence and demonstrating the feasibility of complementing biomolecule distributions from individual mice with the genome-wide ABA transcriptome.

  4. Automatic extraction of faults and fractal analysis from remote sensing data

    Directory of Open Access Journals (Sweden)

    R. Gloaguen

    2007-01-01

    Full Text Available Object-based classification is a promising technique for image classification. Unlike pixel-based methods, which only use the measured radiometric values, the object-based techniques can also use shape and context information of scene textures. These extra degrees of freedom provided by the objects allow the automatic identification of geological structures. In this article, we present an evaluation of object-based classification in the context of extraction of geological faults. Digital elevation models and radar data of an area near Lake Magadi (Kenya) have been processed. We then determine the statistics of the fault populations. The fractal dimensions of the fault populations are similar to fractal dimensions directly measured on remote sensing images of the study area using power spectra (PSD) and variograms. These methods allow unbiased statistics of faults and help us to understand the evolution of the fault systems in extensional domains. Furthermore, the direct analysis of image texture is a good indicator of the fault statistics and allows us to classify the intensity and type of deformation. We propose that extensional fault networks can be modeled by an iterated function system (IFS).
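    For completeness, the sketch below shows a standard box-counting estimate of the fractal dimension of a binary fault map. The article itself relies on power spectra (PSD) and variograms; box counting is shown here only as a compact alternative estimator, applied to synthetic data.

```python
# Box-counting estimate of the fractal dimension of a binary fault map: count occupied
# boxes at several box sizes and fit the slope of log(count) versus log(1/size).
import numpy as np

def box_counting_dimension(binary_map: np.ndarray, box_sizes=(2, 4, 8, 16, 32)) -> float:
    counts = []
    for s in box_sizes:
        h = binary_map.shape[0] // s * s
        w = binary_map.shape[1] // s * s
        blocks = binary_map[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))   # boxes containing faults
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Synthetic "fault map": a few straight line segments rasterized on a grid.
grid = np.zeros((256, 256), dtype=bool)
for offset in (40, 90, 150, 210):
    rows = np.arange(256)
    cols = np.clip(rows // 2 + offset - 60, 0, 255)
    grid[rows, cols] = True
print(f"box-counting dimension ≈ {box_counting_dimension(grid):.2f}")
```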

  5. A Method of Generating Indoor Map Spatial Data Automatically from Architectural Plans

    Directory of Open Access Journals (Sweden)

    SUN Weixin

    2016-06-01

    Full Text Available Taking architectural plans as the data source, we propose a method that can automatically generate indoor map spatial data. First, referring to the spatial data demands of indoor maps, we analyze the basic characteristics of architectural plans and introduce the concepts of wall segment, adjoining node and adjoining wall segment, on the basis of which the basic flow of automatic indoor map spatial data generation is established. Then, according to the adjoining relation between wall lines at intersections with columns, we construct a repair method for wall connectivity in relation to columns. Using gradual expansion and graphic reasoning to judge the local feature type of the wall symbol on both sides of a door or window, and by updating the enclosing rectangle of the door or window, we develop a repair method for wall connectivity in relation to doors and windows as well as a method to transform doors and windows into indoor map point features. Finally, on the basis of the geometric relation between the median lines of adjoining wall segments, a wall center-line extraction algorithm is presented. Taking an exhibition hall's architectural plan as an example, we performed experiments; the results show that the proposed methods are applicable to various complex situations and extract indoor map spatial data effectively.

  6. Processing of soil survey data

    NARCIS (Netherlands)

    Bregt, A.K.

    1992-01-01

    This thesis focuses on processing soil survey data into user-specific information. Within this process four steps are distinguished: collection, storage, analysis and presentation. A review of each step is given, and detailed research on important aspects of the steps are

  7. Automatic Generation and Validation of an ITER Neutronics Model from CAD Data

    International Nuclear Information System (INIS)

    Tsige-Tamirat, H.; Fischer, U.; Serikov, A.; Stickel, S.

    2006-01-01

    Quality assurance rules require consistency between the geometry model used in neutronics Monte Carlo calculations and the underlying engineering CAD model. This can be ensured by automatically converting the CAD geometry data into the representation used by Monte Carlo codes such as MCNP. Suitable conversion algorithms have been previously developed at FZK and were implemented into an interface program. This paper describes the application of the interface program to a CAD model of a 40 degree ITER torus sector for the generation of a neutronics geometry model for MCNP. A CAD model provided by ITER consisting of all significant components was analyzed, pre-processed, and converted into MCNP geometry representation. The analysis and pre-processing steps include checking the adequacy of the CAD model for neutronics calculations in terms of geometric representation and complexity, and making the corresponding corrections. This step is followed by the conversion of the CAD model into MCNP geometry, including error detection and correction as well as the completion of the model by voids. The conversion process does not introduce any approximations, so the resulting MCNP geometry is fully equivalent to the original CAD geometry. However, there is a moderate increase in complexity measured in terms of the number of cells and surfaces. The validity of the converted geometry model was shown by comparing the results of stochastic MCNP volume calculations and the volumes provided by the CAD kernel of the interface programme. Furthermore, successful MCNP test calculations have been performed to verify the converted ITER model in application calculations. (author)

  8. [Perspectives of an electronic data processing-controlled anesthesia protocol].

    Science.gov (United States)

    Martens, G; Naujoks, B

    1987-10-01

    There are two ways to introduce electronic data processing in anesthesia recording, which should be combined in the future: (1) computer-aided data collection (during anesthesia) and (2) data analysis. Both procedures have their own advantages and disadvantages. The first step in data collection is a system whereby the on-line registered data are automatically plotted and the discrete data are noted by hand (semi-automatic recording). The second step is to keep the minutes on a display screen instead of on paper, thus producing a protocol in digital form (automatic recording). We discuss the problems of these computer-aided recording systems and future trends, in particular the problems caused by the "human-computer interface" and by uncertainty with respect to the validity of the stored data. For computer-aided data analysis of anesthesia records, one has to select appropriate data in order to build up data bases. This selection is necessary whether the protocol is in analogical or in digital form, and we attempt to develop some general rules, the concrete selection depends, of course, on the aim of the evaluation. As an example we discuss evaluations for administrative purposes. Evaluations for scientific questions are even more affected by the quality of data definitions, and the efforts involved in data management are considerably higher. At the end of this paper we sketch a hybrid information system for computer-aided anesthesia recording that combines data collection and data analysis.

  9. BRICORK: an automatic machine with image processing for the production of corks

    Science.gov (United States)

    Davies, Roger; Correia, Bento A. B.; Carvalho, Fernando D.; Rodrigues, Fernando C.

    1991-06-01

    The production of cork stoppers from raw cork strip is a manual and labour-intensive process in which a punch-operator quickly inspects all sides of the cork strip for defects and decides where to punch out stoppers. He then positions the strip underneath a rotating tubular cutter and punches out the stoppers one at a time. This procedure is somewhat subjective and prone to error, being dependent on the judgement and accuracy of the operator. This paper describes the machine being developed jointly by Mecanova, Laboratorio Nacional de Engenharia e Tecnologia (LNETI) and Empresa de Investigação e Desenvolvimento de Electrónica SA (EID) which automatically processes cork strip introduced by an unskilled operator. The machine uses both image processing and laser inspection techniques to examine the strip. Defects in the cork are detected and categorised in order to determine regions where stoppers may be punched. The precise locations are then automatically optimised for best usage of the raw material (quantity and quality of stoppers). In order to achieve the required speed of production these image processing techniques may be implemented in hardware. The paper presents results obtained using the vision system software under development together with descriptions of both the image processing and mechanical aspects of the proposed machine.

  10. Literacy acquisition reduces the influence of automatic holistic processing of faces and houses.

    Science.gov (United States)

    Ventura, Paulo; Fernandes, Tânia; Cohen, Laurent; Morais, José; Kolinsky, Régine; Dehaene, Stanislas

    2013-10-25

    Writing was invented too recently to have influenced the human genome. Consequently, reading acquisition must rely on partial recycling of pre-existing brain systems. Prior fMRI evidence showed that in literates a left-hemispheric visual region increases its activation to written strings relative to illiterates and reduces its response to faces. Increasing literacy also leads to a stronger right-hemispheric lateralization for faces. Here, we evaluated whether this reorganization of the brain's face system has behavioral consequences for the processing of non-linguistic visual stimuli. Three groups of adult illiterates, ex-illiterates and literates were tested with the sequential composite face paradigm that evaluates the automaticity with which faces are processed as wholes. Illiterates were consistently more holistic than participants with reading experience in dealing with faces. A second experiment replicated this effect with both faces and houses. Brain reorganization induced by literacy seems to reduce the influence of automatic holistic processing of faces and houses by enabling the use of a more analytic and flexible processing strategy, at least when holistic processing is detrimental to the task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Development and evaluation of an automatic acne lesion detection program using digital image processing.

    Science.gov (United States)

    Min, Seonguk; Kong, Hyoun-joong; Yoon, Chiyul; Kim, Hee Chan; Suh, Dae Hun

    2013-02-01

    Existing acne grading methods, which depend on overall impression, require a long training period and there is a high degree of variability among raters, including trained dermatologists. The use of lesion counts provides fair reproducibility but the method is time consuming. New technologies in photographic equipment and software allow solutions to the problem of acne evaluation. This study was conducted to develop an automatic acne lesion detection program and to evaluate its usefulness. We established imaging conditions that optimize the characterization of acne lesions and developed the counting program. Twenty-five volunteers with acne lesions were enrolled. Automated lesion counting for five subtypes of acne (papule, nodule, pustule, whitehead comedone, and blackhead comedone) was performed with image processing. The usefulness of the automatic lesion count program was assessed by a comparison with manual counting performed by an expert dermatologist. In this comparison, the sensitivity and positive predictive value of the lesion-counting program were greater than 70% for papules, nodules, pustules, and whitehead comedones, and the program's findings were well correlated with manual counts for papules, nodules, pustules, and whitehead comedones (r > 0.9). The automatic lesion-counting program can be a useful tool for acne severity evaluation. © 2012 John Wiley & Sons A/S.

  12. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  13. Fully automatic characterization and data collection from crystals of biological macromolecules

    International Nuclear Information System (INIS)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W.

    2015-01-01

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention

  14. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    Science.gov (United States)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as the chief causes of, and solutions to, climate, biogeochemical, and hydrological processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in its urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential for eastern cities, which are characterized by smaller patches, greater fragmentation, and a lower fraction of natural cover within the urban landscape than cities in the West. Segmentation of urban land from other land-cover types using remote sensing imagery can be done by standard classification processes as well as by a logic rule based on spectral indices and their derivations. Efforts to establish such a logic rule that needs no threshold, so that mapping is fully automatic, are highly worthwhile. Existing automatic methods are reviewed, and then the proposed approach is introduced, including the calculation of a new index and an improved logic rule. Following this, the existing automatic methods as well as the proposed approach are compared in a common context. Afterwards, the proposed approach is tested separately on cities of large, medium, and small scale in East Asia selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas, even in more complex eastern cities. Key words: urban extraction; automatic method; logic rule; LANDSAT images; East Asia. [Attached figure: The proposed approach applied to extraction of urban built-up areas in Guangzhou, China.]
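
    The record does not reproduce the authors' new index or logic rule, so the following is only a generic sketch of a threshold-free, index-comparison rule of the kind described, assuming NumPy and co-registered LANDSAT surface-reflectance bands (band roles and all values here are invented): built-up pixels are flagged where a built-up index (NDBI) exceeds a vegetation index (NDVI).

```python
import numpy as np

def urban_mask(nir, swir, red):
    """Flag built-up pixels with a threshold-free logic rule.

    nir, swir, red: 2D float arrays of surface reflectance (hypothetical bands).
    Built-up areas tend to reflect more in SWIR than NIR (NDBI > 0)
    and to have NDBI larger than the vegetation index NDVI.
    """
    eps = 1e-6  # avoid division by zero
    ndbi = (swir - nir) / (swir + nir + eps)   # built-up index
    ndvi = (nir - red) / (nir + red + eps)     # vegetation index
    # Logic rule: no user-chosen threshold, only a comparison of indices.
    return (ndbi > ndvi) & (ndbi > 0)

# Example on random reflectance values standing in for real bands.
rng = np.random.default_rng(0)
nir, swir, red = (rng.uniform(0.0, 0.6, (100, 100)) for _ in range(3))
mask = urban_mask(nir, swir, red)
print("built-up fraction:", mask.mean())
```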

  15. Reliability Engineering for ATLAS Petascale Data Processing on the Grid

    CERN Document Server

    Golubkov, D V; The ATLAS collaboration; Vaniachine, A V

    2012-01-01

    The ATLAS detector is in its third year of continuous LHC running taking data for physics analysis. A starting point for ATLAS physics analysis is reconstruction of the raw data. First-pass processing takes place shortly after data taking, followed later by reprocessing of the raw data with updated software and calibrations to improve the quality of the reconstructed data for physics analysis. Data reprocessing involves a significant commitment of computing resources and is conducted on the Grid. The reconstruction of one petabyte of ATLAS data with 1B collision events from the LHC takes about three million core-hours. Petascale data processing on the Grid involves millions of data processing jobs. At such scales, the reprocessing must handle a continuous stream of failures. Automatic job resubmission recovers transient failures at the cost of CPU time used by the failed jobs. Orchestrating ATLAS data processing applications to ensure efficient usage of tens of thousands of CPU-cores, reliability engineering ...
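
    As a generic illustration of the resubmission policy mentioned above (not ATLAS production code), the sketch below retries a job on transient failures and tracks the CPU time lost to the failed attempts; the `run_job` callable and `TransientError` class are hypothetical.

```python
import time

class TransientError(Exception):
    """A failure expected to disappear on retry (e.g. a flaky worker node)."""

def run_with_resubmission(run_job, max_attempts=3, backoff_s=60):
    """Re-run a job on transient failures; give up after max_attempts."""
    wasted_cpu_s = 0.0
    for attempt in range(1, max_attempts + 1):
        start = time.time()
        try:
            return run_job(), wasted_cpu_s
        except TransientError:
            # CPU time of the failed attempt is lost: the price of recovery.
            wasted_cpu_s += time.time() - start
            if attempt == max_attempts:
                raise
            time.sleep(backoff_s)  # wait before resubmitting
```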

  16. Automatic Web Data Extraction Based on Genetic Algorithms and Regular Expressions

    Science.gov (United States)

    Barrero, David F.; Camacho, David; R-Moreno, María D.

    Data Extraction from the World Wide Web is a well known, unsolved, and critical problem when complex information systems are designed. These problems are related to the extraction, management and reuse of the huge amount of Web data available. These data usually have high heterogeneity and volatility and low quality (i.e., format and content mistakes), so it is quite hard to build reliable systems. This chapter proposes an Evolutionary Computation approach to the problem of automatically learning software entities based on Genetic Algorithms and regular expressions. These entities, also called wrappers, will be able to extract some kinds of Web data structures from examples.
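
    The chapter's wrapper-induction algorithm is not detailed in the record; the toy sketch below only illustrates the general idea of evolving regular expressions with a genetic algorithm, using an invented token pool and invented labeled examples.

```python
import random
import re

# Labeled training examples the learned wrapper should match (positives)
# or ignore (negatives). Purely illustrative data.
POSITIVES = ["price: 12 EUR", "price: 7 EUR", "price: 130 EUR"]
NEGATIVES = ["name: Alice", "price: n/a", "total EUR"]

# Candidate regexes are token lists; tokens are drawn from this pool.
TOKENS = ["price", ": ", r"\d", r"\d+", " EUR", r"\w+", " "]

def fitness(tokens):
    """F1-like score of the regex assembled from the tokens."""
    try:
        pattern = re.compile("".join(tokens))
    except re.error:
        return 0.0
    tp = sum(bool(pattern.search(s)) for s in POSITIVES)
    fp = sum(bool(pattern.search(s)) for s in NEGATIVES)
    fn = len(POSITIVES) - tp
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def mutate(tokens):
    t = tokens[:]
    t[random.randrange(len(t))] = random.choice(TOKENS)
    return t

def crossover(a, b):
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def evolve(pop_size=30, length=4, generations=40):
    pop = [[random.choice(TOKENS) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                     # keep the best half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    best = max(pop, key=fitness)
    return "".join(best), fitness(best)

random.seed(1)
print(evolve())  # prints the best evolved pattern and its score
```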

  17. Big Data in Market Research: Why More Data Does Not Automatically Mean Better Information

    Directory of Open Access Journals (Sweden)

    Bosch Volker

    2016-11-01

    Full Text Available Big data will change market research at its core in the long term because consumption of products and media can be logged electronically more and more, making it measurable on a large scale. Unfortunately, big data datasets are rarely representative, even if they are huge. Smart algorithms are needed to achieve high precision and prediction quality for digital and non-representative approaches. Also, big data can only be processed with complex and therefore error-prone software, which leads to measurement errors that need to be corrected. Another challenge is posed by missing but critical variables. The amount of data can indeed be overwhelming, but it often lacks important information. The missing observations can only be filled in by using statistical data imputation. This requires an additional data source with the additional variables, for example a panel. Linear imputation is a statistical procedure that is anything but trivial. It is an instrument to “transport information,” and the higher the observed data correlates with the data to be imputed, the better it works. It makes structures visible even if the depth of the data is limited.
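
    As a rough illustration of the "information transport" described above, the sketch below imputes a variable missing from a large data source by regressing on a small panel where both variables are observed; variable names and data are invented, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Small panel: both the shared variable (online minutes) and the
# variable missing from the big data source (TV minutes) are observed.
panel_online = rng.normal(120, 30, 500)
panel_tv = 0.5 * panel_online + rng.normal(0, 10, 500)  # correlated target

# Big data source: only online minutes are logged.
big_online = rng.normal(120, 30, 100_000)

# Fit on the panel, then "transport" the information to the big data set.
model = LinearRegression().fit(panel_online.reshape(-1, 1), panel_tv)
big_tv_imputed = model.predict(big_online.reshape(-1, 1))

# The stronger the correlation in the panel, the better the imputation works.
print("panel correlation:", np.corrcoef(panel_online, panel_tv)[0, 1].round(2))
print("imputed mean TV minutes:", big_tv_imputed.mean().round(1))
```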

  18. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.

  19. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    International Nuclear Information System (INIS)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described

  20. Information management data base for fusion target fabrication processes

    International Nuclear Information System (INIS)

    Reynolds, J.

    1983-01-01

    A computer-based data management system has been developed to handle data associated with target fabrication processes including glass microballoon characterization, gas filling, materials coating, and storage locations. The system provides automatic data storage and computation, flexible data entry procedures, fast access, automated report generation, and secure data transfer. It resides on a CDC CYBER 175 computer and is compatible with the CDC data base language Query Update, but is based on custom Fortran software interacting directly with the CYBER's file management system. The described data base maintains detailed, accurate, and readily available records of fusion target information.

  1. Automatic detection, segmentation and assessment of snoring from ambient acoustic data.

    Science.gov (United States)

    Duckitt, W D; Tuomi, S K; Niesler, T R

    2006-10-01

    Snoring is a prevalent condition with a variety of negative social effects and associated health problems. Treatments, both surgical and therapeutic, have been developed, but the objective non-invasive monitoring of their success remains problematic. We present a method which allows the automatic monitoring of snoring characteristics, such as intensity and frequency, from audio data captured via a freestanding microphone. This represents a simple and portable diagnostic alternative to polysomnography. Our system is based on methods that have proved effective in the field of speech recognition. Hidden Markov models (HMMs) were employed as basic elements with which to model different types of sound by means of spectrally based features. This allows periods of snoring to be identified, while rejecting silence, breathing and other sounds. Training and test data were gathered from six subjects, and annotated appropriately. The system was tested by requiring it to automatically classify snoring sounds in new audio recordings and then comparing the result with manually obtained annotations. We found that our system was able to correctly identify snores with 82-89% accuracy, despite the small size of the training set. We could further demonstrate how this segmentation can be used to measure the snoring intensity, snoring frequency and snoring index. We conclude that a system based on hidden Markov models and spectrally based features is effective in the automatic detection and monitoring of snoring from audio data.
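
    A minimal sketch of the classification idea, assuming the hmmlearn package and pre-extracted frame-level spectral features (e.g. MFCC-like vectors): one Gaussian HMM is trained per sound class and a new segment is assigned to the class whose model scores it highest. This illustrates the general approach rather than the authors' implementation; all data below are random stand-ins.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed to be installed

def train_class_models(features_by_class, n_states=3):
    """features_by_class: {label: list of (n_frames, n_features) arrays}."""
    models = {}
    for label, segments in features_by_class.items():
        X = np.vstack(segments)                    # concatenate all segments
        lengths = [len(seg) for seg in segments]   # segment boundaries
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=25)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, segment):
    """Return the label whose HMM gives the segment the highest likelihood."""
    return max(models, key=lambda label: models[label].score(segment))

# Toy stand-in features; real use would pass MFCC frames from the recordings.
rng = np.random.default_rng(0)
data = {
    "snore": [rng.normal(0, 1, (50, 13)) + 1.0 for _ in range(5)],
    "breath": [rng.normal(0, 1, (50, 13)) - 1.0 for _ in range(5)],
    "silence": [rng.normal(0, 0.1, (50, 13)) for _ in range(5)],
}
models = train_class_models(data)
print(classify(models, rng.normal(0, 1, (40, 13)) + 1.0))  # expected: "snore"
```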

  2. Intentional and automatic processing of numerical information in mathematical anxiety: testing the influence of emotional priming.

    Science.gov (United States)

    Ashkenazi, Sarit

    2018-02-05

    Current theoretical approaches suggest that mathematical anxiety (MA) manifests itself as a weakness in quantity manipulations. This study is the first to examine automatic versus intentional processing of numerical information using the numerical Stroop paradigm in participants with high MA. To manipulate anxiety levels, we combined the numerical Stroop task with an affective priming paradigm. We took a group of college students with high MA and compared their performance to a group of participants with low MA. Under low anxiety conditions (neutral priming), participants with high MA showed relatively intact number processing abilities. However, under high anxiety conditions (mathematical priming), participants with high MA showed (1) higher processing of the non-numerical irrelevant information, which aligns with the theoretical view regarding deficits in selective attention in anxiety and (2) an abnormal numerical distance effect. These results demonstrate that abnormal, basic numerical processing in MA is context related.

  3. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) that consists of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  4. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study.

    Science.gov (United States)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan; Shi, Jiannong

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50-130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320-450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information.

  5. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception: An Event-Related Potential Study.

    Directory of Open Access Journals (Sweden)

    Tongran Liu

    Full Text Available The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50-130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320-450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information.

  6. Data base structure and Management for Automatic Calculation of 210Pb Dating Methods Applying Different Models

    International Nuclear Information System (INIS)

    Gasco, C.; Anton, M. P.; Ampudia, J.

    2003-01-01

    The introduction of macros in the calculation sheets allows the automatic application of various dating models using unsupported 210Pb data from a data base. The calculation books that contain the models have been modified to permit the implementation of these macros. The Marine and Aquatic Radioecology Group of CIEMAT (MARG) will be involved in new European projects, thus new models have been developed. This report contains a detailed description of: a) the newly implemented macros, b) the design of a dating menu in the calculation sheet and c) the organization and structure of the data base. (Author) 4 refs

  7. Improving SAR Automatic Target Recognition Models with Transfer Learning from Simulated Data

    DEFF Research Database (Denmark)

    Malmgren-Hansen, David; Kusk, Anders; Dall, Jørgen

    2017-01-01

    Data-driven classification algorithms have proved to do well for automatic target recognition (ATR) in synthetic aperture radar (SAR) data. Collecting data sets suitable for these algorithms is a challenge in itself as it is difficult and expensive. Due to the lack of labeled data sets with real SAR images of sufficient size, simulated data play a big role in SAR ATR development, but the transferability of knowledge learned on simulated data to real data remains to be studied further. In this letter, we show the first study of Transfer Learning between a simulated data set and a set of real … These results encourage SAR ATR development to continue the improvement of simulated data sets of greater size and complex scenarios in order to build robust algorithms for real life SAR ATR applications.
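
    The letter's network and training protocol are not given in the record; the sketch below only illustrates the usual transfer-learning recipe under that caveat, using PyTorch: pre-train a small CNN on (stand-in) simulated SAR chips, freeze the convolutional layers, and fine-tune the classifier head on a much smaller set of (stand-in) real images. All shapes, class counts and data are invented.

```python
import torch
import torch.nn as nn

def make_cnn(n_classes=10):
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, n_classes),   # assumes 64x64 input chips
    )

def train(model, x, y, epochs=5, lr=1e-3):
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# 1) Pre-train on plentiful simulated data (random tensors as stand-ins).
sim_x, sim_y = torch.randn(256, 1, 64, 64), torch.randint(0, 10, (256,))
model = make_cnn()
train(model, sim_x, sim_y)

# 2) Freeze the convolutional feature extractor and fine-tune only the last
#    layer on the small set of real SAR images.
for layer in list(model.children())[:-1]:
    for p in layer.parameters():
        p.requires_grad = False
real_x, real_y = torch.randn(32, 1, 64, 64), torch.randint(0, 10, (32,))
train(model, real_x, real_y, epochs=10)
```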

  8. Oscillatory brain dynamics associated with the automatic processing of emotion in words.

    Science.gov (United States)

    Wang, Lin; Bastiaansen, Marcel

    2014-10-01

    This study examines the automaticity of processing the emotional aspects of words, and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha power suppressions were found for the emotional compared to neutral words, in the time range from 500 to 1000ms post-stimulus. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Semantic priming in young and older adults: evidence for age constancy in automatic and attentional processes.

    Science.gov (United States)

    Burke, D M; White, H; Diaz, D L

    1987-02-01

    Automatic and attentional components of semantic priming and the relation of each to episodic memory were evaluated in young and older adults. Category names served as prime words, and the relatedness of the prime to a subsequent lexical decision target was varied orthogonally with whether the target category was expected or unexpected. At a prime-target stimulus-onset asynchrony (SOA) of 410 ms, target words in the same category had faster lexical decision latencies than did different category targets. This effect was not significant at a 1,550-ms SOA and was attributed to automatic processes. Expected category targets had faster latencies than unexpected category targets at the 410-ms SOA, and the magnitude of the effect increased at the 1,550-ms SOA. This effect was attributed to attentional processes. These patterns of priming were obtained for both age groups, but in a surprise memory test older adults had poorer recall of primes and targets. We discuss the implications of these results for the hypothesis that older adults suffer deficits in selective attention and for the related hypothesis that attentional deficits impair semantic processing, which causes memory decrements in old age.

  10. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald

    2013-01-10

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization. Processing terrain data has not received the attention of other forms of surface reconstruction or of image processing. The goal of terrain data processing is to convert the point cloud into a succinct representation system that is amenable to the various application demands. The present paper presents a platform for terrain processing built on the following principles: (i) measuring distortion in the Hausdorff metric, which we argue is a good match for the application demands, (ii) a multiscale representation based on tree approximation using local polynomial fitting. The basic elements held in the nodes of the tree can be efficiently encoded, transmitted, visualized, and utilized for the various target applications. Several challenges emerge because of the variable resolution of the data, missing data, occlusions, and noise. Techniques for identifying and handling these challenges are developed. © 2013 Society for Industrial and Applied Mathematics.
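
    The paper measures distortion in the Hausdorff metric; as a small illustration (not the authors' multiscale tree code), the symmetric Hausdorff distance between a terrain point cloud and a decimated representation of it can be computed with SciPy as sketched below, using a synthetic terrain as a stand-in.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)

# Synthetic terrain: (x, y, z) samples of a smooth surface (stand-in data).
xy = rng.uniform(0, 100, (5000, 2))
z = np.sin(xy[:, 0] / 10) * np.cos(xy[:, 1] / 10) * 5
cloud = np.column_stack([xy, z])

# A crude "succinct representation": keep every 10th point.
decimated = cloud[::10]

# Symmetric Hausdorff distance = max of the two directed distances.
d_fwd, _, _ = directed_hausdorff(cloud, decimated)
d_bwd, _, _ = directed_hausdorff(decimated, cloud)
print("Hausdorff distortion:", max(d_fwd, d_bwd))
```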

  11. VISA: AN AUTOMATIC AWARE AND VISUAL AIDS MECHANISM FOR IMPROVING THE CORRECT USE OF GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    J. H. Hong

    2016-06-01

    Full Text Available With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the range of applications and reduces development cost, users nevertheless have to deal with all kinds of “differences” implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data for use and to present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting the analyzed results, namely the virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of the analyzed results into visual aids is executed automatically. It presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  12. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    Science.gov (United States)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the range of applications and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data for use and to present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting the analyzed results, namely the virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of the analyzed results into visual aids is executed automatically. It presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  13. Communicating Processes with Data for Supervisory Coordination

    Directory of Open Access Journals (Sweden)

    Jasen Markovski

    2012-08-01

    Full Text Available We employ supervisory controllers to safely coordinate high-level discrete(-event) behavior of distributed components of complex systems. Supervisory controllers observe discrete-event system behavior, make a decision on allowed activities, and communicate the control signals to the involved parties. Models of the supervisory controllers can be automatically synthesized based on formal models of the system components and a formalization of the safe coordination (control) requirements. Based on the obtained models, code generation can be used to implement the supervisory controllers in software, on a PLC, or an embedded (micro)processor. In this article, we develop a process theory with data that supports a model-based systems engineering framework for supervisory coordination. We employ communication to distinguish between the different flows of information, i.e., observation and supervision, whereas we employ data to specify the coordination requirements more compactly, and to increase the expressivity of the framework. To illustrate the framework, we remodel an industrial case study involving coordination of maintenance procedures of a printing process of a high-tech Oce printer.

  14. Fast data processing with Spark

    CERN Document Server

    Sankar, Krishna

    2015-01-01

    Fast Data Processing with Spark - Second Edition is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had problems that were too big to be dealt with on a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.

  15. Career Education via Data Processing

    Science.gov (United States)

    Wagner, Gerald E.

    1975-01-01

    A data processing instructional program should provide students with career awareness, exploration, and orientation. This can be accomplished by establishing three objectives: (1) familiarization with automation terminology; (2) understanding the influence of the cultural and social impact of computers and automation; and (3) the kinds of job…

  16. The Euclid Data Processing Challenges

    Science.gov (United States)

    Dubath, Pierre; Apostolakos, Nikolaos; Bonchi, Andrea; Belikov, Andrey; Brescia, Massimo; Cavuoti, Stefano; Capak, Peter; Coupon, Jean; Dabin, Christophe; Degaudenzi, Hubert; Desai, Shantanu; Dubath, Florian; Fontana, Adriano; Fotopoulou, Sotiria; Frailis, Marco; Galametz, Audrey; Hoar, John; Holliman, Mark; Hoyle, Ben; Hudelot, Patrick; Ilbert, Olivier; Kuemmel, Martin; Melchior, Martin; Mellier, Yannick; Mohr, Joe; Morisset, Nicolas; Paltani, Stéphane; Pello, Roser; Pilo, Stefano; Polenta, Gianluca; Poncet, Maurice; Saglia, Roberto; Salvato, Mara; Sauvage, Marc; Schefer, Marc; Serrano, Santiago; Soldati, Marco; Tramacere, Andrea; Williams, Rees; Zacchei, Andrea

    2017-06-01

    Euclid is a Europe-led cosmology space mission dedicated to a visible and near infrared survey of the entire extra-galactic sky. Its purpose is to deepen our knowledge of the dark content of our Universe. After an overview of the Euclid mission and science, this contribution describes how the community is getting organized to face the data analysis challenges, both in software development and in operational data processing matters. It ends with a more specific account of some of the main contributions of the Swiss Science Data Center (SDC-CH).

  17. Automatic rice crop height measurement using a field server and digital image processing.

    Science.gov (United States)

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-07

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.
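
    A much-simplified sketch of the four steps (band selection, filtering, thresholding, height measurement), assuming scikit-image and that the marker bar is the brightest object in the chosen band; the toy images and the 100 cm bar length below are invented stand-ins for the field-server photographs.

```python
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops

def marker_bar_height(rgb, band=0):
    """Visible marker-bar height in pixels (simplified version of the 4 steps)."""
    gray = rgb[..., band].astype(float)          # 1) band selection
    smooth = gaussian(gray, sigma=2)             # 2) filtering
    mask = smooth > threshold_otsu(smooth)       # 3) thresholding
    regions = regionprops(label(mask))
    bar = max(regions, key=lambda r: r.area)     # assume bar = largest bright blob
    top, _, bottom, _ = bar.bbox
    return bottom - top                          # 4) height measurement (pixels)

def crop_height(rgb_today, rgb_day0, bar_length_cm=100.0):
    """Crop height = how much of the bar the canopy has hidden since day 0."""
    h0 = marker_bar_height(rgb_day0)
    ht = marker_bar_height(rgb_today)
    cm_per_px = bar_length_cm / h0
    return (h0 - ht) * cm_per_px

# Toy images: a bright vertical bar that gets occluded from below over time.
day0 = np.zeros((200, 300, 3)); day0[20:180, 148:152, :] = 1.0
today = day0.copy(); today[120:180, 148:152, :] = 0.0  # canopy hides lower part
print(round(crop_height(today, day0), 1), "cm")
```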

  18. Reliability modelling of redundant safety systems without automatic diagnostics incorporating common cause failures and process demand.

    Science.gov (United States)

    Alizadeh, Siamak; Sriramula, Srinivas

    2017-11-01

    Redundant safety systems are commonly used in the process industry to respond to hazardous events. In redundant systems composed of identical units, Common Cause Failures (CCFs) can significantly influence system performance with regards to reliability and safety. However, their impact has been overlooked due to the inherent complexity of modelling common cause induced failures. This article develops a reliability model for a redundant safety system using Markov analysis approach. The proposed model incorporates process demands in conjunction with CCF for the first time and evaluates their impacts on the reliability quantification of safety systems without automatic diagnostics. The reliability of the Markov model is quantified by considering the Probability of Failure on Demand (PFD) as a measure for low demand systems. The safety performance of the model is analysed using Hazardous Event Frequency (HEF) to evaluate the frequency of entering a hazardous state that will lead to an accident if the situation is not controlled. The utilisation of Markov model for a simple case study of a pressure protection system is demonstrated and it is shown that the proposed approach gives a sufficiently accurate result for all demand rates, durations, component failure rates and corresponding repair rates for low demand mode of operation. The Markov model proposed in this paper assumes the absence of automatic diagnostics, along with multiple stage repair strategy for CCFs and restoration of the system from hazardous state to the "as good as new" state. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
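
    The paper's Markov model is not reproduced in the record; the sketch below is a deliberately simplified continuous-time Markov chain in the same spirit, for a 1oo2 system with beta-factor common cause failures and no automatic diagnostics (failures stay hidden until the proof test), solved with SciPy to estimate the average probability of failure on demand. All rates are invented.

```python
import numpy as np
from scipy.linalg import expm

lam = 1e-6      # dangerous undetected failure rate per channel [1/h] (assumed)
beta = 0.1      # common cause fraction (assumed)
T = 8760.0      # proof test interval [h]

# States: 0 = both channels OK, 1 = one channel failed, 2 = system failed (1oo2).
# Without automatic diagnostics there is no repair inside the test interval.
Q = np.array([
    [-(2 - beta) * lam,  2 * (1 - beta) * lam,  beta * lam],
    [0.0,                -lam,                   lam      ],
    [0.0,                 0.0,                   0.0      ],
])

p0 = np.array([1.0, 0.0, 0.0])
times = np.linspace(0.0, T, 1000)
p_fail = np.array([(p0 @ expm(Q * t))[2] for t in times])

pfd_avg = np.trapz(p_fail, times) / T   # time-averaged probability of failure on demand
print(f"PFDavg = {pfd_avg:.2e}")
```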

  19. Automatic Rice Crop Height Measurement Using a Field Server and Digital Image Processing

    Science.gov (United States)

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-01

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required. PMID:24451465

  20. Type-assisted automatic garbage collection for lock-free data structures

    OpenAIRE

    Yang, Albert Mingkun; Wrigstad, Tobias

    2017-01-01

    We introduce Isolde, an automatic garbage collection scheme designed specifically for managing memory in lock-free data structures, such as stacks, lists, maps and queues. Isolde exists as a plug-in memory manager, designed to sit on top of another memory manager and use its allocator and reclaimer (if one exists). Isolde treats a lock-free data structure as a logical heap, isolated from the rest of the program. This allows garbage collection outside of Isolde to take place without affecting th...

  1. A method for unsupervised change detection and automatic radiometric normalization in multispectral data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Canty, Morton John

    2011-01-01

    Based on canonical correlation analysis, the iteratively re-weighted multivariate alteration detection (MAD) method is used to successfully perform unsupervised change detection in bi-temporal Landsat ETM+ images covering an area with villages, woods, agricultural fields and open pit mines in North Rhine-Westphalia, Germany. A link to an example with ASTER data to detect change with the same method after the 2005 Kashmir earthquake is given. The method is also used to automatically normalize multitemporal, multispectral Landsat ETM+ data radiometrically. IDL/ENVI, Python and Matlab software...
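
    The iteratively re-weighted MAD algorithm itself is not reproduced here; the sketch below shows only the core, un-iterated MAD step with scikit-learn: canonical correlation analysis between the two acquisitions, change variables formed as differences of paired canonical variates, and a chi-square statistic summarizing per-pixel change. The bi-temporal images are synthetic stand-ins.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def mad_change(img_t1, img_t2, n_components=3):
    """img_t1, img_t2: (rows, cols, bands) co-registered multispectral images."""
    rows, cols, bands = img_t1.shape
    X = img_t1.reshape(-1, bands)
    Y = img_t2.reshape(-1, bands)
    cca = CCA(n_components=n_components)
    U, V = cca.fit_transform(X, Y)           # paired canonical variates
    mad = U - V                              # MAD variates, one per component
    # Chi-square statistic: sum of squared, variance-normalized MAD variates.
    chi2 = np.sum((mad / mad.std(axis=0)) ** 2, axis=1)
    return chi2.reshape(rows, cols)

# Toy bi-temporal data: second image equals the first except for a changed block.
rng = np.random.default_rng(0)
t1 = rng.normal(size=(60, 60, 3))
t2 = t1 + rng.normal(scale=0.05, size=t1.shape)
t2[20:30, 20:30, :] += 2.0                   # simulated land-cover change
chi2 = mad_change(t1, t2)
print("mean chi2 inside / outside change:",
      chi2[20:30, 20:30].mean().round(1), chi2.mean().round(1))
```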

  2. Data Quality Monitoring : Automatic MOnitoRing Environment (AMORE ) Web Administration Tool in ALICE Experiment

    CERN Document Server

    Nagi, Imre

    2013-01-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The quality of the acquired data evolves over time depending on the status of the detectors, their components and the operating environment. To get excellent performance from the detector, all detector configurations have to be set correctly so that data-taking can be done in an optimal way. This report describes a new implementation of the administration tools of ALICE's DQM framework, called AMORE (Automatic MonitoRing Environment), using web technologies.

  3. Automatable algorithms to identify nonmedical opioid use using electronic data: a systematic review.

    Science.gov (United States)

    Canan, Chelsea; Polinski, Jennifer M; Alexander, G Caleb; Kowal, Mary K; Brennan, Troyen A; Shrank, William H

    2017-11-01

    Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use. We searched PubMed and Embase for articles describing automatable algorithms that used electronic health care claims or records to identify patients or prescribers with likely nonmedical opioid use. We assessed algorithm development, validation, and performance characteristics and the settings where they were applied. Study variability precluded a meta-analysis. Of 15 included algorithms, 10 targeted patients, 2 targeted providers, 2 targeted both, and 1 identified medications with high abuse potential. Most patient-focused algorithms (67%) used prescription drug claims and/or medical claims, with diagnosis codes of substance abuse and/or dependence as the reference standard. Eleven algorithms were developed via regression modeling. Four used natural language processing, data mining, audit analysis, or factor analysis. Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Users must recognize the implications of identifying false positives and, conversely, false negatives. Few algorithms have been applied in real-world settings. Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use. Additional implementation research in real-world settings would clarify their utility. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  4. Data-driven automatic parking constrained control for four-wheeled mobile vehicles

    Directory of Open Access Journals (Sweden)

    Wenxu Yan

    2016-11-01

    Full Text Available In this article, a novel data-driven constrained control scheme is proposed for automatic parking systems. The design of the proposed scheme depends only on the steering angle and the orientation angle of the car, and it does not involve any model information of the car. Therefore, the proposed scheme-based automatic parking system is applicable to different kinds of cars. In order to further reduce the desired-trajectory coordinate tracking errors, a coordinates compensation algorithm is also proposed. In the design procedure of the controller, a novel dynamic anti-windup compensator is used to deal with the magnitude and rate saturations of the automatic parking control input. It is theoretically proven that all the signals in the closed-loop system are uniformly ultimately bounded, based on the Lyapunov stability analysis method. Finally, a simulation comparison between the proposed scheme with coordinates compensation and a proportional-integral-derivative (PID) control algorithm is given. It is shown that the proposed scheme with coordinates compensation has smaller tracking errors and more rapid responses than the PID scheme.
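
    The article's data-driven scheme and its dynamic anti-windup compensator are not specified in the record; the sketch below only illustrates the PID baseline it is compared against, with magnitude saturation of the steering command and a simple conditional-integration anti-windup. Gains, limits and the toy plant are invented.

```python
class SaturatedPID:
    """PID steering controller with magnitude saturation and anti-windup."""

    def __init__(self, kp=2.0, ki=0.5, kd=0.1, u_max=0.6, dt=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.u_max, self.dt = u_max, dt          # |steering| limit [rad], step [s]
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        u_sat = max(-self.u_max, min(self.u_max, u))
        # Anti-windup: only integrate while the actuator is not saturated,
        # or while integration would drive the command back into range.
        if u == u_sat or error * u < 0:
            self.integral += error * self.dt
        return u_sat

# Track a constant orientation-angle reference with a toy first-order plant.
pid, angle = SaturatedPID(), 0.0
for _ in range(500):
    u = pid.step(1.0 - angle)          # reference orientation = 1.0 rad
    angle += 0.05 * u                  # crude vehicle response
print(round(angle, 3))
```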

  5. GIS Data Based Automatic High-Fidelity 3D Road Network Modeling

    Science.gov (United States)

    Wang, Jie; Shen, Yuzhong

    2011-01-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models were generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially those existing in the real world. This paper presents a novel approach that can automatically produce high-fidelity 3D road network models from real 2D road GIS data that mainly contain road centerline information. The proposed method first builds parametric representations of the road centerlines through segmentation and fitting. A basic set of civil engineering rules (e.g., cross slope, superelevation, grade) for road design are then selected in order to generate realistic road surfaces in compliance with these rules. While the proposed method applies to any type of road, this paper mainly addresses the automatic generation of complex traffic interchanges and intersections, which are the most sophisticated elements in road networks.
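
    As a toy illustration of the "parametric representation through segmentation and fitting" step (the paper's exact procedure and its civil-engineering rules are not reproduced), SciPy's parametric spline routines can fit a smooth centerline to noisy 2D GIS polyline vertices; the heading derived from the tangent is the kind of quantity from which cross-sections and superelevation could later be generated. The vertices below are invented.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical GIS centerline vertices (noisy polyline of a curved road).
t = np.linspace(0, np.pi, 25)
x = 100 * np.cos(t) + np.random.default_rng(0).normal(0, 1.0, t.size)
y = 60 * np.sin(t) + np.random.default_rng(1).normal(0, 1.0, t.size)

# Fit a parametric smoothing spline to the centerline (s controls smoothness).
tck, u = splprep([x, y], s=25.0)

# Resample the centerline densely; the tangent gives the road heading.
u_fine = np.linspace(0, 1, 400)
cx, cy = splev(u_fine, tck)
dx, dy = splev(u_fine, tck, der=1)
heading = np.arctan2(dy, dx)
print(len(cx), "centerline samples, heading range:",
      np.degrees(heading.min()).round(1), "to", np.degrees(heading.max()).round(1))
```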

  6. Automatic assessment of functional health decline in older adults based on smart home data.

    Science.gov (United States)

    Aramendi, Ane Alberdi; Weakley, Alyssa; Goenaga, Asier Aztiria; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2018-03-15

    In the context of an aging population, tools to help elderly people live independently must be developed. The goal of this paper is to evaluate the possibility of using unobtrusively collected activity-aware smart home behavioral data to automatically detect one of the most common consequences of aging: functional health decline. After gathering the longitudinal smart home data of 29 older adults for an average of more than 2 years, we automatically labeled the data with corresponding activity classes and extracted time-series statistics containing 10 behavioral features. Using these data, we created regression models to predict absolute and standardized functional health scores, as well as classification models to detect reliable absolute change and positive and negative fluctuations in everyday functioning. Functional health was assessed every six months by means of the Instrumental Activities of Daily Living-Compensation (IADL-C) scale. Results show that the total IADL-C score and subscores can be predicted by means of activity-aware smart home data, as well as reliable changes in these scores. Positive and negative fluctuations in everyday functioning are harder to detect using in-home behavioral data, yet changes in social skills have been shown to be predictable. Future work must focus on improving the sensitivity of the presented models and performing an in-depth feature selection to improve overall accuracy. Copyright © 2018. Published by Elsevier Inc.

  7. Structured illumination microscopy and automatized image processing as a rapid diagnostic tool for podocyte effacement.

    Science.gov (United States)

    Siegerist, Florian; Ribback, Silvia; Dombrowski, Frank; Amann, Kerstin; Zimmermann, Uwe; Endlich, Karlhans; Endlich, Nicole

    2017-09-13

    The morphology of podocyte foot processes is obligatory for renal function. Here we describe a method for the superresolution visualization of podocyte foot processes using structured illumination microscopy of the slit diaphragm, which before has only been achieved by electron microscopy. As a proof of principle, we measured a mean foot process width of 0.249 ± 0.068 µm in healthy kidneys and a significantly higher mean foot process width of 0.675 ± 0.256 µm in minimal change disease patients, indicating effacement of foot processes. We then hypothesized that the slit length per glomerular capillary surface area (slit diaphragm density) could be used as an equivalent for the diagnosis of effacement. Using custom-made software we measured a mean value of 3.10 ± 0.27 µm⁻¹ in healthy subjects and 1.83 ± 0.49 µm⁻¹ in the minimal change disease patients. As foot process width was highly correlated with slit diaphragm density (R² = 0.91), we concluded that our approach is a valid method for the diagnosis of foot process effacement. In summary, we present a new technique to quantify podocyte damage, which combines superresolution microscopy with automatized image processing. Due to its diverse advantages, we propose that this technique be included in routine diagnostics of glomerular histopathology.

  8. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA
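
    The GELect pipeline itself is not available from the record; the fragment below is only a schematic of the first step (lane segmentation), assuming a grayscale gel image in which lanes appear as dark vertical bands, so that peaks of the inverted column-intensity profile mark lane centres. The synthetic gel is an invented stand-in for a PAGE image.

```python
import numpy as np
from scipy.signal import find_peaks

def find_lane_centres(gel, min_separation=20):
    """gel: 2D grayscale array, dark bands = DNA lanes. Returns lane centre columns."""
    profile = gel.mean(axis=0)              # average intensity per image column
    # Lanes are dark, so look for peaks of the inverted profile.
    centres, _ = find_peaks(-profile, distance=min_separation, prominence=10)
    return centres

# Synthetic gel: bright background with 5 dark lanes.
gel = np.full((400, 300), 200.0)
for c in range(30, 300, 60):
    gel[:, c - 5:c + 5] -= 120.0
gel += np.random.default_rng(0).normal(0, 5, gel.shape)
print(find_lane_centres(gel))               # expected: columns near 30, 90, ..., 270
```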

  9. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  10. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting–tedious and error-prone–also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.

  11. Automatic mesh generation for structural analysis of pressure vessels using fuzzy knowledge processing

    International Nuclear Information System (INIS)

    Kado, Kenichiro; Sato, Takuya; Yoshimura, Shinobu; Yagawa, Genki.

    1994-01-01

    This paper describes an automatic mesh generation system for 2D axisymmetric and 3D shell structures based on fuzzy knowledge processing. In this system, an analysis model, i.e. a geometric model, is first defined using a conventional method for 2D structures and a commercial CAD system, AutoCAD, for 3D shell structures. Nodes are then generated based on the fuzzy knowledge processing technique, with good control of the node density distribution over the whole analysis domain. Triangular elements are generated using the Delaunay triangulation technique and are then converted to quadrilateral elements. The fundamental performance of the system is demonstrated through its application to typical components of a pressure vessel. (author)
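
    The node-generation and fuzzy-control details go beyond what the abstract gives, but the Delaunay triangulation step it mentions can be illustrated with SciPy; here nodes with a graded density over a 2D axisymmetric section are triangulated (the conversion to quadrilateral elements is not shown, and the node distribution is invented).

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# Nodes with a graded density: finer near the inner radius of a 2D
# axisymmetric section (r in [1, 2], z in [0, 1]); purely illustrative.
n = 400
r = 1.0 + (rng.uniform(0, 1, n) ** 2)     # bias nodes toward r = 1
z = rng.uniform(0.0, 1.0, n)
nodes = np.column_stack([r, z])

tri = Delaunay(nodes)                      # Delaunay triangulation of the nodes
print("nodes:", len(nodes), "triangles:", len(tri.simplices))
```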

  12. Long-term abacus training induces automatic processing of abacus numbers in children.

    Science.gov (United States)

    Du, Fenglei; Yao, Yuan; Zhang, Qiong; Chen, Feiyan

    2014-01-01

    Abacus-based mental calculation (AMC) is a unique strategy for arithmetic that is based on the mental abacus. AMC experts can solve calculation problems with extraordinarily fast speed and high accuracy. Previous studies have demonstrated that abacus experts showed superior performance and special neural correlates during numerical tasks. However, most of those studies focused on the perception and cognition of Arabic numbers. It remains unclear how the abacus numbers were perceived. By applying a similar enumeration Stroop task, in which participants are presented with a visual display containing two abacus numbers and asked to compare the numerosity of beads that consisted of the abacus number, in the present study we investigated the automatic processing of the numerical value of abacus numbers in abacus-trained children. The results demonstrated a significant congruity effect in the numerosity comparison task for abacus-trained children, in both reaction time and error rate analysis. These results suggested that the numerical value of abacus numbers was perceived automatically by the abacus-trained children after long-term training.

  13. Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data.

    Science.gov (United States)

    Barros, Rodrigo C; Winck, Ana T; Machado, Karina S; Basgalupp, Márcio P; de Carvalho, André C P L F; Ruiz, Duncan D; de Souza, Osmar Norberto

    2012-11-21

    This paper addresses the prediction of the free energy of binding of a drug candidate with enzyme InhA associated with Mycobacterium tuberculosis. This problem is found within rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that could be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design related applications, specially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general-use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision tree accuracy, comprehensibility, and biological relevance. The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. We conclude that automatically designing a decision-tree algorithm tailored to molecular docking data is a promising alternative for the prediction of the free energy from the binding of a drug candidate with a flexible-receptor.
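
    As a point of reference for the kind of baseline the authors compare against (a conventional decision tree; their automatically designed induction algorithms are not available here), a shallow regression tree can be fit to docking descriptors to predict the free energy of binding. The feature names and data below are invented.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)

# Hypothetical docking descriptors for 500 conformations: three
# receptor-residue distances and a torsion count (names are made up).
X = rng.uniform(0, 10, (500, 4))
# Invented relationship standing in for the docking free energy (kcal/mol).
y = -8 + 0.4 * X[:, 0] - 0.6 * X[:, 1] + rng.normal(0, 0.3, 500)

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20).fit(X, y)
# A shallow tree keeps the model comprehensible for a domain specialist.
print(export_text(tree, feature_names=["d_res1", "d_res2", "d_cofactor", "n_torsions"]))
```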

  14. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    …segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from the experimentally acquired data are used as the driving forces. The framework performs the segmentation in 3D rather than on a slice-by-slice basis. It naturally supplies sub-voxel precision of segmented surfaces and allows constraints on the surface curvature to enforce a smooth surface in the segmentation. Two…
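
    The abstract outlines a 3D level-set segmentation driven by vector fields derived from the data. As a much-simplified stand-in, the sketch below runs a 2D morphological Chan-Vese evolution from scikit-image on a placeholder image; the full framework, its 3D surface evolution and its data-derived driving forces are not reproduced:

        # Simplified 2D level-set (morphological Chan-Vese) illustration only.
        import numpy as np
        from skimage import data, img_as_float
        from skimage.segmentation import checkerboard_level_set, morphological_chan_vese

        image = img_as_float(data.camera())              # placeholder for a tomography slice
        init = checkerboard_level_set(image.shape, 6)    # initial level set

        # The smoothing parameter enforces a smoother phase boundary, loosely
        # analogous to the curvature constraint mentioned in the abstract.
        segmentation = morphological_chan_vese(image, 100, init_level_set=init, smoothing=3)
        print(segmentation.shape, segmentation.dtype)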

  15. Automatic Testing of the Trigger Data Serializer ASIC for the Upgrade of the ATLAS Muon Spectrometer

    CERN Document Server

    Pinkham, Reid; Schwarz, Thomas

    The Trigger Data Serializer (TDS) is a custom Application-Specific Integrated Circuit (ASIC) designed at the University of Michigan to be used on the ATLAS New Small Wheel (NSW) detector. The TDS is a central hub of the NSW trigger system. It prepares the trigger data for both pad and strip detectors, performs pad-strip matching, and serializes the matched strip data to other circuits on the rim of the NSW. In total, 6000 TDS chips will be produced. As part of the TDS' initial production run, a test platform was developed to verify the functionality of each chip before being sent to users. The test platform consisted of multiple FPGA evaluation boards with custom-designed mezzanine boards to hold the TDS chip during testing, and control software running on a local computer. Of the initial run of 200 chips, 161 were tested with the automatic setup, of which 158 passed. A detailed description of the TDS and the automatic test fixture can be found in this thesis.

  16. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software package, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
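
    The core steps named in the abstract are peak finding followed by numerical integration of each peak. A minimal sketch of those two steps on a synthetic spectrum (not data from the instruments in the paper) might look as follows, using SciPy's peak finder and trapezoidal integration:

        # Find peaks in a synthetic TOF spectrum and integrate them to estimate
        # an isotope ratio; amplitudes, widths and noise are illustrative only.
        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.signal import find_peaks

        t = np.linspace(0, 10, 5000)                   # flight-time axis (arbitrary units)
        rng = np.random.default_rng(1)
        spectrum = (100 * np.exp(-0.5 * ((t - 4.0) / 0.01) ** 2)
                    + 1 * np.exp(-0.5 * ((t - 6.0) / 0.01) ** 2)
                    + rng.normal(scale=0.05, size=t.size))

        peaks, _ = find_peaks(spectrum, height=0.5, prominence=0.5)

        def peak_area(idx, half_width=50):
            # Numerically integrate a window around one peak (trapezoidal rule).
            lo, hi = max(idx - half_width, 0), min(idx + half_width, t.size - 1)
            return trapezoid(spectrum[lo:hi], t[lo:hi])

        areas = [peak_area(i) for i in peaks]
        if len(areas) >= 2:
            print("isotope ratio estimate:", areas[1] / areas[0])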

  17. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Full Text Available Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images as a basic building block of an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we apply a series of image processing techniques to find the fascia line correctly. We then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.
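
    Fuzzy ART, the clustering step named in the abstract, is a standard adaptive resonance algorithm (complement coding, category choice, vigilance test, learning). The sketch below is a compact, generic implementation applied to toy pixel features; the paper's preprocessing, fascia-line detection and parameter choices are not reproduced:

        # Minimal Fuzzy ART clustering sketch (fast learning by default).
        import numpy as np

        def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
            """Cluster rows of `inputs` (values scaled to [0, 1]) with Fuzzy ART."""
            coded = np.hstack([inputs, 1.0 - inputs])      # complement coding
            weights, labels = [], []
            for x in coded:
                chosen = None
                if weights:
                    W = np.array(weights)
                    match = np.minimum(x, W).sum(axis=1)
                    choice = match / (alpha + W.sum(axis=1))
                    for j in np.argsort(-choice):          # best-matching category first
                        if match[j] / x.sum() >= rho:      # vigilance test
                            weights[j] = beta * np.minimum(x, weights[j]) + (1 - beta) * weights[j]
                            chosen = j
                            break
                if chosen is None:                         # no category passed vigilance
                    weights.append(x.copy())
                    chosen = len(weights) - 1
                labels.append(chosen)
            return np.array(labels), np.array(weights)

        # Toy usage on normalized grey-level features of pixels (hypothetical).
        pixels = np.random.default_rng(0).random((500, 1))
        labels, W = fuzzy_art(pixels, rho=0.9)
        print("categories found:", len(W))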

  18. DETERMINATION OF THE UAV POSITION BY AUTOMATIC PROCESSING OF THERMAL IMAGES

    Directory of Open Access Journals (Sweden)

    W. Hartmann

    2012-07-01

    Full Text Available If images acquired from Unmanned Aerial Vehicles (UAVs) need to be accurately geo-referenced, the method of choice is classical aerotriangulation, since on-board sensors are usually not accurate enough for direct geo-referencing. For several different applications it has recently been proposed to mount thermal cameras on UAVs. Compared to optical images, thermal ones pose a number of challenges, in particular low resolution and weak local contrast. In this work we investigate the automatic orientation of thermal image blocks acquired from a UAV, using artificial ground control points. To that end we adapt the photogrammetric processing pipeline to thermal imagery. The pipeline achieves accuracies of about ±1 cm in planimetry and ±3 cm in height for the object points, respectively ±10 cm or better for the camera positions, compared to ±100 cm or worse for direct geo-referencing using on-board single-frequency GPS.

  19. Application of digital process controller for automatic pulse operation in the NSRR

    International Nuclear Information System (INIS)

    Ishijima, K.; Ueda, T.; Saigo, M.

    1992-01-01

    The NSRR at JAERI is a modified TRIGA reactor. It was built for investigating reactor fuel behavior under reactivity initiated accident (RIA) conditions. Recently, there has been a need to improve the flexibility of pulsing operations in the NSRR to cover a wide range of accident situations, including RIA events at elevated power levels and various abnormal power transients. To satisfy this need, we developed a new reactor control system which allows us to perform 'Shaped Pulse Operation: SP' and 'Combined Pulse Operation: CP'. Quick, accurate and complicated manipulation of control rods was required to realize these operations. Therefore we installed a new reactor control system, which we call an automatic pulse control system. This control system is composed of digital processing controllers and other digital equipment, and is fully automated and highly accurate. (author)

  20. Stereotype threat strengthens automatic recall and undermines controlled processes in older adults.

    Science.gov (United States)

    Mazerolle, Marie; Régner, Isabelle; Morisset, Pauline; Rigalleau, François; Huguet, Pascal

    2012-07-01

    The threat of being judged stereotypically (stereotype threat) may impair memory performance in older adults, thereby producing inflated age differences in memory tasks. However, the underlying mechanisms of stereotype threat in older adults or other stigmatized groups remain poorly understood. Here, we offer evidence that stereotype threat consumes working memory resources in older adults. More important, using a process-dissociation procedure, we found, for the first time, that stereotype threat undermines the controlled use of memory and simultaneously intensifies automatic response tendencies. These findings indicate that competing models of stereotype threat are actually compatible and offer further reasons for researchers and practitioners to pay special attention to age-related stereotypes during standardized neuropsychological testing.
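
    The process-dissociation procedure mentioned here separates controlled and automatic memory influences with two standard equations (Jacoby's procedure): the controlled estimate is the inclusion rate minus the exclusion rate, and the automatic estimate is the exclusion rate divided by one minus the controlled estimate. A minimal sketch, with illustrative proportions rather than the study's data:

        # Process-dissociation estimates of controlled (C) and automatic (A)
        # memory influences; the inclusion/exclusion rates below are made up.
        def process_dissociation(p_inclusion, p_exclusion):
            controlled = p_inclusion - p_exclusion
            automatic = p_exclusion / (1.0 - controlled) if controlled < 1.0 else float("nan")
            return controlled, automatic

        c, a = process_dissociation(p_inclusion=0.70, p_exclusion=0.30)
        print(f"controlled estimate C = {c:.2f}, automatic estimate A = {a:.2f}")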

  1. Neural dynamics of morphological processing in spoken word comprehension: Laterality and automaticity

    Directory of Open Access Journals (Sweden)

    Caroline M. Whiting

    2013-11-01

    Full Text Available Rapid and automatic processing of grammatical complexity is argued to take place during speech comprehension, engaging a left-lateralised fronto-temporal language network. Here we address how neural activity in these regions is modulated by the grammatical properties of spoken words. We used combined magneto- and electroencephalography (MEG, EEG) to delineate the spatiotemporal patterns of activity that support the recognition of morphologically complex words in English with inflectional (-s) and derivational (-er) affixes (e.g. bakes, baker). The mismatch negativity (MMN), an index of linguistic memory traces elicited in a passive listening paradigm, was used to examine the neural dynamics elicited by morphologically complex words. Results revealed an initial peak 130-180 ms after the deviation point with a major source in left superior temporal cortex. The localisation of this early activation showed a sensitivity to two grammatical properties of the stimuli: (1) the presence of morphological complexity, with affixed words showing increased left-laterality compared to non-affixed words; and (2) the grammatical category, with affixed verbs showing greater left-lateralisation in inferior frontal gyrus compared to affixed nouns (bakes vs. beaks). This automatic brain response was additionally sensitive to semantic coherence (the meaning of the stem vs. the meaning of the whole form) in fronto-temporal regions. These results demonstrate that the spatiotemporal pattern of neural activity in spoken word processing is modulated by the presence of morphological structure, predominantly engaging the left-hemisphere’s fronto-temporal language network, and does not require focused attention on the linguistic input.

  2. New data for nucleosynthesis processes

    International Nuclear Information System (INIS)

    Trache, L.; Tribble, R.E.; Mukhamedzhanov

    1996-01-01

    The question of obtaining data needed in nuclear astrophysics using alternative methods to the direct measurement of the cross section for the reactions and energies involved in stellar nuclear reactions is discussed for a few specific examples. The stellar reaction rate for the radiative proton capture on 56Ni, important in the rp-process, is obtained from the study of excited states in 57Cu with MARS. The use of Coulomb dissociation and alpha transfer reactions to extract data for the radiative alpha capture on 12C and 14C, and finally the use of the Asymptotic Normalization Coefficient method and the few proposed experiments, including one with a 7Be radioactive beam at the K500 superconducting cyclotron at TAMU, are briefly discussed. (authors)

  3. Study of Burn Scar Extraction Automatically Based on Level Set Method using Remote Sensing Data

    Science.gov (United States)

    Liu, Yang; Dai, Qin; Liu, JianBo; Liu, ShiBin; Yang, Jin

    2014-01-01

    Burn scar extraction using remote sensing data is an efficient way to precisely evaluate burn area and measure vegetation recovery. Traditional burn scar extraction methodologies do not perform well on burn scar images with blurred and irregular edges. To address these issues, this paper proposes an automatic method to extract burn scars based on the Level Set Method (LSM). This method utilizes the advantages of the different features in remote sensing images, while also considering the practical need to extract the burn scar rapidly and automatically. The approach integrates Change Vector Analysis (CVA), the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR) to obtain a difference image, and modifies the conventional Chan-Vese (C-V) level set model with a new initial curve derived from a binary image obtained by applying the K-means method to the fitting errors of two near-infrared band images. Landsat 5 TM and Landsat 8 OLI data sets are used to validate the proposed method. Comparisons with the conventional C-V model, the Otsu algorithm and the Fuzzy C-means (FCM) algorithm show that the proposed approach can extract the outline curve of the fire burn scar effectively and accurately. The method has higher extraction accuracy and lower algorithmic complexity than the conventional C-V model. PMID:24503563
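
    The difference image in this approach combines changes in NDVI and NBR between pre- and post-fire acquisitions. A hedged sketch of those index computations and a simple change-magnitude seed mask, using random arrays in place of Landsat reflectance bands, is given below; the Chan-Vese level-set step itself is not reproduced:

        # NDVI/NBR differencing to seed burn-scar candidates (illustrative data).
        import numpy as np

        def ndvi(nir, red):
            return (nir - red) / (nir + red + 1e-9)

        def nbr(nir, swir):
            return (nir - swir) / (nir + swir + 1e-9)

        rng = np.random.default_rng(0)
        pre = {b: rng.random((100, 100)) for b in ("red", "nir", "swir")}
        post = {b: rng.random((100, 100)) for b in ("red", "nir", "swir")}

        dnbr = nbr(pre["nir"], pre["swir"]) - nbr(post["nir"], post["swir"])
        dndvi = ndvi(pre["nir"], pre["red"]) - ndvi(post["nir"], post["red"])

        # Change magnitude (in the spirit of Change Vector Analysis) used here
        # only to pick seed pixels for an initial curve.
        change_magnitude = np.sqrt(dnbr ** 2 + dndvi ** 2)
        seed_mask = change_magnitude > np.percentile(change_magnitude, 90)
        print("candidate burn pixels:", int(seed_mask.sum()))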

  4. Automatic data-acquisition and communications computer network for fusion experiments

    International Nuclear Information System (INIS)

    Kemper, C.O.

    1981-01-01

    A network of more than twenty computers serves the data acquisition, archiving, and analysis requirements of the ISX, EBT, and beam-line test facilities at the Fusion Division of Oak Ridge National Laboratory. The network includes PDP-8, PDP-12, PDP-11, PDP-10, and Interdata 8-32 processors, and is unified by a variety of high-speed serial and parallel communications channels. While some processors are dedicated to experimental data acquisition, and others are dedicated to later analysis and theoretical work, many processors perform a combination of acquisition, real-time analysis and display, and archiving and communications functions. A network software system has been developed which runs in each processor and automatically transports data files from the point of acquisition to the point or points of analysis, display, and storage, providing conversion and formatting functions as required

  5. The Effect of Orthographic Depth on Letter String Processing: The Case of Visual Attention Span and Rapid Automatized Naming

    Science.gov (United States)

    Antzaka, Alexia; Martin, Clara; Caffarra, Sendy; Schlöffel, Sophie; Carreiras, Manuel; Lallier, Marie

    2018-01-01

    The present study investigated whether orthographic depth can increase the bias towards multi-letter processing in two reading-related skills: visual attention span (VAS) and rapid automatized naming (RAN). VAS (i.e., the number of visual elements that can be processed at once in a multi-element array) was tested with a visual 1-back task and RAN…

  6. Automatic vehicle detection based on automatic histogram-based fuzzy C-means algorithm and perceptual grouping using very high-resolution aerial imagery and road vector data

    Science.gov (United States)

    Ghaffarian, Saman; Gökaşar, Ilgın

    2016-01-01

    This study presents an approach for the automatic detection of vehicles using very high-resolution images and road vector data. Initially, road vector data and aerial images are integrated to extract road regions. Then, the extracted road/street region is clustered using an automatic histogram-based fuzzy C-means algorithm, and edge pixels are detected using the Canny edge detector. In order to detect vehicles automatically, we developed a local perceptual grouping approach based on the fusion of the edge detection and clustering outputs. To provide locality, an ellipse is generated from the characteristics of each candidate cluster individually. Then, the ratio of edge pixels to non-edge pixels in the corresponding ellipse is computed to distinguish the vehicles. Finally, a point-merging rule is applied to merge points that satisfy a predefined threshold and are supposed to denote the same vehicle. The experimental validation of the proposed method was carried out on six very high-resolution aerial images covering two highways, two shadowed roads, a crowded narrow street, and a street in a dense urban area with crowded parked vehicles. The evaluation of the results shows that the proposed method achieved 86% overall correctness and 83% completeness.
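
    One concrete cue in this pipeline is the ratio of edge pixels to non-edge pixels inside an ellipse generated around a candidate cluster. A minimal sketch of that test on a placeholder image, with an invented ellipse position and size (the road-vector integration and fuzzy C-means steps are omitted):

        # Edge/non-edge ratio inside a candidate ellipse (illustrative values).
        import numpy as np
        from skimage import color, data, draw, feature

        image = color.rgb2gray(data.astronaut())        # placeholder for an aerial image
        edges = feature.canny(image, sigma=2.0)

        # Hypothetical candidate cluster: centre (r, c) and half-axes in pixels.
        rr, cc = draw.ellipse(r=120, c=250, r_radius=15, c_radius=30, shape=image.shape)
        inside = edges[rr, cc]
        ratio = inside.sum() / max((~inside).sum(), 1)
        print("edge/non-edge ratio inside ellipse:", round(float(ratio), 3))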

  7. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted five major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R^2 = 0.22). This regression model predicted "would-be obese" participants (BMI >= 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
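
    The analysis described combines principal-component dietary patterns, a linear model for BMI, and leave-one-out validation of a binary "would-be obese" cut-off. A hedged sketch with entirely synthetic data (the real purchase records and the 68.8% figure are not reproduced):

        # PCA dietary patterns + linear regression + leave-one-out validation.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(0)
        meals = rng.random((200, 30))                    # purchase frequencies per menu item
        bmi = 22 + 4 * meals[:, 0] + rng.normal(scale=1.0, size=200)

        patterns = PCA(n_components=5).fit_transform(meals)   # dietary patterns

        correct = 0
        for train, test in LeaveOneOut().split(patterns):
            model = LinearRegression().fit(patterns[train], bmi[train])
            pred = model.predict(patterns[test])[0]
            correct += int((pred >= 23) == (bmi[test][0] >= 23))
        print("leave-one-out accuracy for BMI >= 23:", correct / len(bmi))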

  8. AN AUTOMATIC PROCEDURE FOR COMBINING DIGITAL IMAGES AND LASER SCANNER DATA

    Directory of Open Access Journals (Sweden)

    W. Moussa

    2012-07-01

    Full Text Available Besides improving both the geometry and the visual quality of the model, the integration of close-range photogrammetry and terrestrial laser scanning techniques aims at filling gaps in laser scanner point clouds to avoid modeling errors, reconstructing more details in higher resolution and recovering simple structures with less geometric detail. Thus, within this paper a flexible approach for the automatic combination of digital images and laser scanner data is presented. Our approach comprises two methods for data fusion. The first method starts with a marker-free registration of digital images based on a point-based environment model (PEM) of a scene which stores the 3D laser scanner point clouds associated with intensity and RGB values. The PEM allows the extraction of accurate control information for the direct computation of absolute camera orientations with redundant information by means of accurate space resection methods. In order to use the computed relations between the digital images and the laser scanner data, an extended Helmert (seven-parameter) transformation is introduced and its parameters are estimated. Prior to that, in the second method, the local relative orientation parameters of the camera images are calculated by means of an optimized Structure and Motion (SaM) reconstruction method. Then, using the determined transformation parameters results in absolutely oriented images in relation to the laser scanner data. With the resulting absolute orientations we have employed robust dense image reconstruction algorithms to create oriented dense image point clouds, which are automatically combined with the laser scanner data to form a complete detailed representation of a scene. Examples of different data sets are shown and experimental results demonstrate the effectiveness of the presented procedures.
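
    A seven-parameter Helmert transformation relates two 3D coordinate frames by three translations, three rotations and a scale factor. A minimal sketch of applying such a transformation to points, with illustrative parameter values (the paper estimates these parameters; the estimation step is not shown here):

        # Apply X' = T + (1 + scale) * R(rx, ry, rz) @ X to an (N, 3) point array.
        import numpy as np

        def helmert_transform(points, tx, ty, tz, rx, ry, rz, scale):
            cx, sx = np.cos(rx), np.sin(rx)
            cy, sy = np.cos(ry), np.sin(ry)
            cz, sz = np.cos(rz), np.sin(rz)
            Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
            Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
            Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
            R = Rz @ Ry @ Rx
            return np.array([tx, ty, tz]) + (1.0 + scale) * points @ R.T

        pts = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
        print(helmert_transform(pts, 0.1, -0.2, 0.05, 0.001, 0.002, -0.001, 1e-5))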

  9. Automatic detection of referral patients due to retinal pathologies through data mining.

    Science.gov (United States)

    Quellec, Gwenolé; Lamard, Mathieu; Erginay, Ali; Chabouis, Agnès; Massin, Pascale; Cochener, Béatrice; Cazuguel, Guy

    2016-04-01

    With the increased prevalence of retinal pathologies, automating the detection of these pathologies is becoming more and more relevant. In the past few years, many algorithms have been developed for the automated detection of a specific pathology, typically diabetic retinopathy, using eye fundus photography. No matter how good these algorithms are, we believe many clinicians would not use automatic detection tools focusing on a single pathology and ignoring any other pathology present in the patient's retinas. To solve this issue, an algorithm for characterizing the appearance of abnormal retinas, as well as the appearance of the normal ones, is presented. This algorithm does not focus on individual images: it considers examination records consisting of multiple photographs of each retina, together with contextual information about the patient. Specifically, it relies on data mining in order to learn diagnosis rules from characterizations of fundus examination records. The main novelty is that the content of examination records (images and context) is characterized at multiple levels of spatial and lexical granularity: 1) spatial flexibility is ensured by an adaptive decomposition of composite retinal images into a cascade of regions, 2) lexical granularity is ensured by an adaptive decomposition of the feature space into a cascade of visual words. This multigranular representation allows for great flexibility in automatically characterizing normality and abnormality: it is possible to generate diagnosis rules whose precision and generalization ability can be traded off depending on data availability. A variation on usual data mining algorithms, originally designed to mine static data, is proposed so that contextual and visual data at adaptive granularity levels can be mined. This framework was evaluated in e-ophtha, a dataset of 25,702 examination records from the OPHDIAT screening network, as well as in the publicly-available Messidor dataset. It was successfully

  10. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA1 to Infor EAM2, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: consulting the meter physically, importing this information into Infor EAM by hand, and detecting and correcting the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  11. Racial bias in pain perception and response: experimental examination of automatic and deliberate processes

    Science.gov (United States)

    Mathur, Vani A.; Richeson, Jennifer A.; Paice, Judith A.; Muzyka, Michael; Chiao, Joan Y.

    2014-01-01

    Racial disparities in pain treatment pose a significant public health and scientific problem. Prior studies demonstrate that clinicians and non-clinicians are less perceptive of, and suggest less treatment for, the pain of African Americans relative to European Americans. Here we investigate the effects of explicit/implicit patient race presentation, patient race, and perceiver race on pain perception and response. African American and European American participants rated pain perception, empathy, helping motivation, and treatment suggestion in response to vignettes about patients’ pain. Vignettes were accompanied by a rapid (implicit), or static (explicit), presentation of an African or European American patient’s face. Participants perceived and responded more to European American patients in the implicit prime condition, when the effect of patient race was below the level of conscious regulation. This effect was reversed when patient race was presented explicitly. Additionally, female participants perceived and responded more to the pain of all patients, relative to male participants, and in the implicit prime condition, African American participants were more perceptive and responsive than European Americans to the pain of all patients. Taken together, these results suggest that known disparities in pain treatment may be largely due to automatic (below the level of conscious regulation), rather than deliberate (subject to conscious regulation), biases. These biases were not associated with traditional implicit measures of racial attitudes, suggesting that biases in pain perception and response may be independent of general prejudice. Perspective: Results suggest racial biases in pain perception and treatment are at least partially due to automatic processes. When the relevance of patient race is made explicit, however, biases are attenuated and even reversed. We also find preliminary evidence that African Americans may be more sensitive to the pain of others than

  12. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    Science.gov (United States)

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuities of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is prohibitively time-consuming given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
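
    The abstract does not list the individual filters, so the following is only a stand-in for one typical stage of such a pipeline: a Hessian-based vesselness (Frangi) filter from scikit-image applied to a placeholder 2D slice. The published pipeline, its parameter automation and its 3D processing are not reproduced:

        # Vesselness enhancement of a placeholder slice with the Frangi filter.
        import numpy as np
        from skimage.filters import frangi

        slice_2d = np.random.default_rng(0).random((128, 128))   # placeholder angiography slice
        vesselness = frangi(slice_2d, sigmas=range(1, 6), black_ridges=False)
        print("vesselness range:", float(vesselness.min()), float(vesselness.max()))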

  13. Racial bias in pain perception and response: experimental examination of automatic and deliberate processes.

    Science.gov (United States)

    Mathur, Vani A; Richeson, Jennifer A; Paice, Judith A; Muzyka, Michael; Chiao, Joan Y

    2014-05-01

    Racial disparities in pain treatment pose a significant public health and scientific problem. Prior studies have demonstrated that clinicians and nonclinicians are less perceptive of, and suggest less treatment for, the pain of African Americans relative to European Americans. Here we investigate the effects of explicit/implicit patient race presentation, patient race, and perceiver race on pain perception and response. African American and European American participants rated pain perception, empathy, helping motivation, and treatment suggestion in response to vignettes about patients' pain. Vignettes were accompanied by a rapid (implicit) or static (explicit) presentation of an African or European American patient's face. Participants perceived and responded more to European American patients in the implicit prime condition, when the effect of patient race was below the level of conscious regulation. This effect was reversed when patient race was presented explicitly. Additionally, female participants perceived and responded more to the pain of all patients, relative to male participants, and in the implicit prime condition, African American participants were more perceptive and responsive than European Americans to the pain of all patients. Taken together, these results suggest that known disparities in pain treatment may be largely due to automatic (below the level of conscious regulation) rather than deliberate (subject to conscious regulation) biases. These biases were not associated with traditional implicit measures of racial attitudes, suggesting that biases in pain perception and response may be independent of general prejudice. Results suggest that racial biases in pain perception and treatment are at least partially due to automatic processes. When the relevance of patient race is made explicit, however, biases are attenuated and even reversed. We also find preliminary evidence that African Americans may be more sensitive to the pain of others than are

  14. The role of automaticity and attention in neural processes underlying empathy for happiness, sadness, and anxiety

    Directory of Open Access Journals (Sweden)

    Sylvia A. Morelli

    2013-05-01

    Full Text Available Although many studies have examined the neural basis of experiencing empathy, relatively little is known about how empathic processes are affected by different attentional conditions. Thus, we examined whether instructions to empathize might amplify responses in empathy-related regions and whether cognitive load would diminish the involvement of these regions. Thirty-two participants completed a functional magnetic resonance imaging session assessing empathic responses to individuals experiencing happy, sad, and anxious events. Stimuli were presented under three conditions: watching naturally, while instructed to empathize, and under cognitive load. Across analyses, we found evidence for a core set of neural regions that support empathic processes (dorsomedial prefrontal cortex, DMPFC; medial prefrontal cortex, MPFC; temporoparietal junction, TPJ; amygdala; ventral anterior insula, AI; septal area, SA). Two key regions – the ventral AI and SA – were consistently active across all attentional conditions, suggesting that they are automatically engaged during empathy. In addition, watching versus empathizing with targets was not markedly different and instead led to similar subjective and neural responses to others’ emotional experiences. In contrast, cognitive load reduced the subjective experience of empathy and diminished neural responses in several regions related to empathy (DMPFC, MPFC, TPJ, amygdala) and social cognition. The current results reveal how attention impacts empathic processes and provide insight into how empathy may unfold in everyday interactions.

  15. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David; Arntz, Arnoud; van Breukelen, Gerard; Slaats, Mariëtte

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing, on ToM performance in psychopathy. ToM abilities (as assessed with the Reading the Mind in the Eyes Test; RMET; Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001) were compared between 39 PCL-R diagnosed psychopathic offenders, 37 non-psychopathic offenders, and 26 nonoffender controls. Contrary to our hypothesis, psychopathic individuals presented with intact overall RMET performance when restrictions were imposed on how long task stimuli could be processed. In addition, psychopaths did not over-ascribe hostility to task stimuli (i.e., lack of a hostility bias). However, there was a significant three-way interaction between hostility, processing speed, and psychopathy: when there was no time limit on stimulus presentation, psychopathic offenders made fewer errors in identifying more hostile eye stimuli compared to nonoffender controls, who seemed to be less accurate in detecting hostility. Psychopaths' more realistic appraisal of others' malevolent mental states is discussed in the light of theories that stress its potential adaptive function. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Automatic screening and classification of diabetic retinopathy and maculopathy using fuzzy image processing.

    Science.gov (United States)

    Rahim, Sarni Suhaila; Palade, Vasile; Shuttleworth, James; Jayne, Chrisina

    2016-12-01

    Digital retinal imaging is a challenging screening method for which effective, robust and cost-effective approaches are still to be developed. Regular screening for diabetic retinopathy and diabetic maculopathy is necessary in order to identify the group at risk of visual impairment. This paper presents a novel automatic detection method for diabetic retinopathy and maculopathy in eye fundus images, employing fuzzy image processing techniques. The paper first introduces the existing systems for diabetic retinopathy screening, with an emphasis on maculopathy detection methods. The proposed medical decision support system consists of four parts, namely: image acquisition, image preprocessing including the localisation of four retinal structures, feature extraction, and the classification of diabetic retinopathy and maculopathy. A combination of fuzzy image processing techniques, the Circular Hough Transform and several feature extraction methods are implemented in the proposed system. The paper also presents a novel technique for localising the macula region in order to detect maculopathy. In addition to the proposed detection system, the paper highlights a novel online dataset and presents the dataset collection, the expert diagnosis process and the advantages of our online database compared to other public eye fundus image databases for diabetic retinopathy purposes.

  17. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    Science.gov (United States)

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows fast, parallel calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant

  18. OConGraX - Automatically Generating Data-Flow Test Cases for Fault-Tolerant Systems

    Science.gov (United States)

    Nunes, Paulo R. F.; Hanazumi, Simone; de Melo, Ana C. V.

    As systems become more complex to develop and manage, software design faults increase, making fault-tolerant systems highly required. To ensure their quality, the normal and exceptional behaviors must be tested and/or verified. Software testing is still a difficult and costly software development task, and a reasonable amount of effort has been employed to develop techniques for testing programs’ normal behaviors. For the exceptional behavior, however, there is a lack of techniques and tools to test it effectively. To help in testing and analyzing fault-tolerant systems, we present in this paper a tool that provides automatic generation of data-flow test cases for the objects and exception-handling mechanisms of Java programs, as well as data/control-flow graphs for program analysis.

  19. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality studies and weather condition monitoring. In such infrastructures, many sensor nodes are distributed over a specific area, and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time-series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
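
    The published algorithm models spatial-temporal relationships among neighbouring sensor nodes; as a much simpler stand-in for the idea of ranking anomalous time windows, the sketch below scores non-overlapping windows of a single stream by a rolling z-score. Window length, the injected anomaly and the data are all illustrative assumptions:

        # Rank anomalous windows of one sensor stream by mean rolling |z-score|.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        series = pd.Series(rng.normal(size=1000))
        series.iloc[600:620] += 6.0                   # injected anomaly for illustration

        window = 50
        zscore = ((series - series.rolling(window).mean())
                  / series.rolling(window).std()).abs()

        scores = zscore.groupby(series.index // window).mean().sort_values(ascending=False)
        print("most anomalous windows (start index, score):")
        print([(int(i * window), round(float(s), 2)) for i, s in scores.head(3).items()])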

  20. Development of a multiplexer for an automatic data acquisition system for the control and monitoring of microbiological cultures

    Energy Technology Data Exchange (ETDEWEB)

    Morales Rondon, A.; Paredes Puente, J.; Arana Alonso, S.

    2016-07-01

    An automatic data acquisition system has been developed for the control and monitoring of microbiological cultures, turning an otherwise time-consuming process into a smooth one by allowing the researcher to set the parameters at the beginning of the experiment and move on to the next task. The development of the hardware and software is key to achieving this system. The multiplexer is custom-made with 22 channels and is lightweight, and therefore easy to move around the lab. Furthermore, the software allows the researcher to check the measurements in real time. It is based on virtual instrumentation software, so new features can be added easily; thus, the multiplexer is capable of adapting to the scientist's necessities. (Author)

  1. Improving hole quality by automatic control of the drilling process: theoretical and field studies

    Energy Technology Data Exchange (ETDEWEB)

    Sinkala, T. (Luleaa University of Technology, Luleaa (Sweden). Division of Mining Equipment Engineering)

    1991-01-01

    Some results from studies on hole deviations are discussed. A system which automatically controls drilling parameters during percussion drilling was developed. A procedure for determining the operating magnitudes of drilling parameters for presetting on a drilling machine is demonstrated. Field experiments show that the automatic control system gives smaller deviations than ordinary drilling systems. 4 refs., 10 figs., 3 tabs.

  2. Automatic 3D high-fidelity traffic interchange modeling using 2D road GIS data

    Science.gov (United States)

    Wang, Jie; Shen, Yuzhong

    2011-03-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models have been generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially for those existing in the real world. A real road network contains various elements such as road segments, road intersections and traffic interchanges. Among them, traffic interchanges present the greatest modeling challenges due to their complexity and the lack of height information (vertical position) for traffic interchanges in existing road GIS data. This paper proposes a novel approach that can automatically produce 3D high-fidelity road network models, including traffic interchange models, from real 2D road GIS data that mainly contain road centerline information. The proposed method consists of several steps. The raw road GIS data are first preprocessed to extract road network topology, merge redundant links, and classify road types. Then overlapped points in the interchanges are detected and their elevations are determined based on a set of level estimation rules. Parametric representations of the road centerlines are then generated through link segmentation and fitting; they have the advantage of supporting arbitrary levels of detail with reduced memory usage. Finally, a set of civil engineering rules for road design (e.g., cross slope, superelevation) are selected and used to generate realistic road surfaces. In addition to traffic interchange modeling, the proposed method also applies to other, more general road elements. Preliminary results show that the proposed method is highly effective and useful in many applications.

  3. Automatic crack detection method for loaded coal in vibration failure process.

    Directory of Open Access Journals (Sweden)

    Chengwu Li

    Full Text Available In the coal mining process, the destabilization of a loaded coal mass is a prerequisite for coal and rock dynamic disasters, and surface cracks of the coal and rock mass are important indicators, reflecting the current state of the coal body. The detection of surface cracks in the coal body plays an important role in coal mine safety monitoring. In this paper, a method for detecting the surface cracks of loaded coal during a vibration-induced failure process is proposed, based on the characteristics of the surface cracks of coal and a support vector machine (SVM). A large number of crack images were obtained by establishing a vibration-induced failure test system with an industrial camera. Histogram equalization and a hysteresis threshold algorithm were used to reduce the noise and emphasize the cracks; then, 600 images and regions, including cracks and non-cracks, were manually labelled. In the crack feature extraction stage, eight features of the cracks are extracted to distinguish cracks from other objects. Finally, a crack identification model with an accuracy over 95% was trained by inputting the labelled sample images into the SVM classifier. The experimental results show that the proposed algorithm has a higher accuracy than the conventional algorithm and can effectively identify cracks on the surface of the coal and rock mass automatically.
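
    A hedged sketch of the two preprocessing steps named here (histogram equalization and hysteresis thresholding) followed by a placeholder SVM classifier; the eight crack features from the paper are replaced by two toy features and the image is random data:

        # Histogram equalization + hysteresis threshold + toy SVM classification.
        import numpy as np
        from skimage import exposure, filters
        from sklearn.svm import SVC

        image = np.random.default_rng(0).random((64, 64))   # placeholder coal-surface image
        equalized = exposure.equalize_hist(image)
        crack_mask = filters.apply_hysteresis_threshold(equalized, low=0.6, high=0.9)

        # Toy region features (area fraction, mean intensity) with labels: 1 = crack.
        X = np.array([[0.02, 0.8], [0.30, 0.5], [0.01, 0.9], [0.25, 0.4]])
        y = np.array([1, 0, 1, 0])
        clf = SVC(kernel="rbf").fit(X, y)

        sample = [[float(crack_mask.mean()), float(equalized.mean())]]
        print("predicted label:", int(clf.predict(sample)[0]))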

  4. Automatic and directed search processes in solving simple semantic-memory problems.

    Science.gov (United States)

    Ben-Zur, H

    1989-09-01

    The cognitive processes involved in simple semantic-memory problems were investigated in four experiments. On each trial of Experiments 1 and 2, two stimulus words were presented, with the instructions to find a third word (i.e., the solution) that, when coupled with each of the stimuli, would yield two word pairs used in everyday language (e.g., surprise and birthday, for which the solution is party). The results of the two experiments indicated that informing the subject whether the solution constituted the first or the second element in the word pairs facilitated both likelihood and speed of solution attainment. In addition, solution attainment was relatively high for items based on frequently used word pairs (Experiment 1) and for items in which the stimuli appear, in everyday language, in a small number of word pairs (Experiment 2). In Experiment 3, the subjects were required to produce word pairs containing one of the two stimulus words from the items used in Experiment 2. Solution production was facilitated by rehearsing the second stimulus word of the specific item. The conclusion, supported by a post hoc analysis of the results of Experiments 2 and 3 (Experiment 4), was that indirect priming from one stimulus word may facilitate solution production from a searched word. These results are interpreted in terms of automatic and controlled processes, and their relevance to two different models for retrieval from semantic memory is discussed.

  5. BiobankUniverse: automatic matchmaking between datasets for biobank data discovery and integration.

    Science.gov (United States)

    Pang, Chao; Kelpin, Fleur; van Enckevort, David; Eklund, Niina; Silander, Kaisa; Hendriksen, Dennis; de Haan, Mark; Jetten, Jonathan; de Boer, Tommy; Charbon, Bart; Holub, Petr; Hillege, Hans; Swertz, Morris A

    2017-11-15

    Biobanks are indispensable for large-scale genetic/epidemiological studies, yet it remains difficult for researchers to determine which biobanks contain data matching their research questions. To overcome this, we developed a new matching algorithm that identifies pairs of related data elements between biobanks and research variables with high precision and recall. It integrates lexical comparison, Unified Medical Language System ontology tagging and semantic query expansion. The result is BiobankUniverse, a fast matchmaking service for biobanks and researchers. Biobankers upload their data elements and researchers their desired study variables; BiobankUniverse then automatically shortlists matching attributes between them. Users can quickly explore matching potential and search for biobanks/data elements matching their research. They can also curate matches and define personalized data-universes. BiobankUniverse is available at http://biobankuniverse.com or can be downloaded as part of the open source MOLGENIS suite at http://github.com/molgenis/molgenis. m.a.swertz@rug.nl. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  6. On the Automatic Generation of Plans for Life Cycle Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-01-01

    Designing products for easy assembly and disassembly during their entire life cycles, for purposes including product assembly, product upgrade, product servicing and repair, and product disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and its intended life cycle. Different goals and manufacturing plan selection criteria, as compared to initial assembly, require re-visiting significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or to applied studies of life cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes. The study of assembly planning is at the very heart of manufacturing research facilities and academic engineering institutions; and, in recent years a number of significant advances in the field of assembly planning have been made. These advances have ranged from the development of automated assembly planning systems, such as Sandia's Automated Assembly Analysis System Archimedes 3.0©, to the startling revolution in microprocessors and computer-controlled production tools such as computer-aided design (CAD), computer-aided manufacturing (CAM), flexible manufacturing systems (FMS), and computer-integrated manufacturing (CIM). These results have kindled considerable interest in the study of algorithms for life cycle related assembly processes, which has blossomed into a field of intense activity. The intent of this manuscript is to bring together the fundamental results in this area, so that the unifying principles and underlying concepts of algorithm design may more easily be implemented in practice.

  7. Influence of vinasse on water movement in soil, using automatic acquisition and handling data system

    International Nuclear Information System (INIS)

    Nascimento Filho, V.F. do; Barros Ferraz, E.S. de

    1986-01-01

    Vinasse, a by-product of the ethyl alcohol industry obtained from the yeast fermentation of sugar cane juice or molasses, has been incorporated into the soil as a fertilizer due to its high content of organic matter (2-6%), potassium and sulphate (0.1-0.5%), and other nutrients. By employing the monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi), the influence of vinasse on water movement in the soil was studied. For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode and coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author) [pt
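
    The measurement principle behind such a system is Beer-Lambert attenuation of the gamma beam by the soil matrix and its water. A hedged sketch of how volumetric water content follows from the attenuated count rate, with illustrative attenuation coefficients, bulk density and counts (not the study's values):

        # theta = (ln(I0/I)/x - mu_soil*rho_soil) / mu_water   [cm3/cm3]
        import math

        def water_content(I, I0, x, mu_soil, rho_soil, mu_water):
            return (math.log(I0 / I) / x - mu_soil * rho_soil) / mu_water

        theta = water_content(I=6800, I0=50000, x=5.0,      # counts, counts, cm
                              mu_soil=0.25, rho_soil=1.4,   # cm2/g, g/cm3
                              mu_water=0.20)                # cm2/g
        print(f"volumetric water content ~ {theta:.3f} cm3/cm3")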

  8. An Investigation of the Stroop Effect among Deaf Signers in English and Japanese: Automatic Processing or Memory Retrieval?

    Science.gov (United States)

    Flaherty, Mary; Moran, Aidan

    2007-01-01

    Most studies on the Stroop effect (unintentional automatic word processing) have been restricted to English speakers using vocal responses. Little is known about this effect with deaf signers. The study compared Stroop task responses among four different samples: deaf participants from a Japanese-language environment and from an English-language…

  9. Application of an automatic yarn dismantler to track changes in cotton fibre properties during processing on a miniature spinning line

    CSIR Research Space (South Africa)

    Fassihi, A

    2014-11-01

    Full Text Available This paper reports on the application of a newly developed automatic yarn dismantler for dismantling short staple ring-spun yarns, to track changes in cotton fibre properties from lint to yarn, during processing on a miniature spinning line...

  10. AN EFFICIENT METHOD FOR AUTOMATIC ROAD EXTRACTION BASED ON MULTIPLE FEATURES FROM LiDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Li

    2016-06-01

    Full Text Available Road extraction in urban areas is a difficult task due to complicated patterns and many contextual objects. LiDAR data directly provide three-dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data have some disadvantages that are not beneficial to object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform) proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark data set provided by ISPRS for the “Urban Classification and 3D Building Reconstruction” project, was selected. The experimental results show that our method achieves the same performance in less time for road extraction from LiDAR data.

  11. Data taking and processing system for nuclear experimental physics study

    International Nuclear Information System (INIS)

    Nagashima, Y.; Kimura, H.; Katori, K.; Kuriyama, K.

    1979-01-01

    A multi-input, multi-mode, multi-user data taking and processing system was developed. This system has the following special features. 1) It is a multi-computer system constituted of two special processors and two minicomputers. 2) Pseudo devices are introduced to make operating procedures simple and easy. In particular, the selection or modification of a 1-8 coincidence mode can be done very easily and quickly. 3) A 16 Kch spectrum storage has 8 partitions. All partitions, which have floating sizes, are handled automatically by the data taking software SHINE. 4) On-line real-time data processing can be done. Using the FORTRAN language, users may prepare the processing software separately from the data taking software. Under the RSX-11D system software, this software runs concurrently with the data taking software in multi-programming mode. 5) Data communication between arbitrary external devices and this system is possible. With these communication procedures, not only data transfer between computers but also control of the experimental devices is realized. Like the real-time processing software, this software can be prepared by users and run concurrently with other software. 6) For data monitoring, two different graphic displays are used complementarily: a refresh-type high-speed display and a storage-type large-screen display. Raw data are displayed on the former; processed data or multi-parametric large-volume data are displayed on the latter. (author)

  12. Segmentation of Multi-Isotope Imaging Mass Spectrometry Data for Semi-Automatic Detection of Regions of Interest

    Science.gov (United States)

    Poczatek, J. Collin; Turck, Christoph W.; Lechene, Claude

    2012-01-01

    Multi-isotope imaging mass spectrometry (MIMS) associates secondary ion mass spectrometry (SIMS) with detection of several atomic masses, the use of stable isotopes as labels, and affiliated quantitative image-analysis software. By associating image and measurement, MIMS allows one to obtain quantitative information about biological processes in sub-cellular domains. MIMS can be applied to a wide range of biomedical problems, in particular metabolism and cell fate [1], [2], [3]. In order to obtain morphologically pertinent data from MIMS images, we have to define regions of interest (ROIs). ROIs are drawn by hand, a tedious and time-consuming process. We have developed and successfully applied a support vector machine (SVM) for segmentation of MIMS images that allows fast, semi-automatic boundary detection of regions of interest. Using the SVM, high-quality ROIs (as compared to an expert's manual delineation) were obtained for 2 types of images derived from unrelated data sets. This automation simplifies, accelerates and improves the post-processing analysis of MIMS images. This approach has been integrated into “Open MIMS,” an ImageJ plugin for comprehensive analysis of MIMS images that is available online at http://www.nrims.hms.harvard.edu/NRIMS_ImageJ.php. PMID:22347386

  13. Runtime Modifications of Spark Data Processing Pipelines

    NARCIS (Netherlands)

    Lazovik, E.; Medema, M.; Albers, T.; Langius, E.A.F.; Lazovik, A.

    2017-01-01

    Distributed data processing systems are the standard means for large-scale data analysis in the Big Data field. These systems are based on processing pipelines where the processing is done via a composition of multiple elements or steps. In current distributed data processing systems, the code and

  14. Automatic processing of wh- and NP-movement in agrammatic aphasia: Evidence from eyetracking

    Science.gov (United States)

    Dickey, Michael Walsh; Thompson, Cynthia K.

    2009-01-01

    Individuals with agrammatic Broca’s aphasia show deficits in comprehension of non-canonical wh-movement and NP-movement sentences. Previous work using eyetracking has found that agrammatic and unimpaired listeners show very similar patterns of automatic processing for wh-movement sentences. The current study attempts to replicate this finding for sentences with wh-movement (in object relatives in the current study) and to extend it to sentences with NP movement (passives). For wh-movement sentences, aphasic and control participants’ eye-movements differed most dramatically in late regions of the sentence and post-offset, with aphasic participants exhibiting lingering attention to a salient but grammatically impermissible competitor. The eye-movement differences between correct and incorrect trials for wh-movement sentences were similar, with incorrect trials also exhibiting competition from an impermissible interpretation late in the sentence. Furthermore, the two groups exhibited similar eye-movement patterns in response to passive NP-movement sentences, but showed little evidence of gap-filling for passives. The results suggest that aphasic and unimpaired individuals may generate similar representations during comprehension, but that aphasics are highly vulnerable to interference from alternative interpretations (Ferreira, 2003). PMID:20161014

  15. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences.
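
    A toy sketch of the data-driven "fire when inputs exist" idea described above, assuming each processing node declares its input and output names; it is not the MFTF-B implementation, and the node names are made up.

        # Each node fires once all of its declared inputs are available,
        # and its outputs may in turn trigger downstream nodes (UNIX-pipe style).
        nodes = [
            {"name": "calibrate", "inputs": {"raw_shot"},   "outputs": {"calibrated"}},
            {"name": "spectrum",  "inputs": {"calibrated"}, "outputs": {"spectrum"}},
            {"name": "summary",   "inputs": {"spectrum", "calibrated"}, "outputs": {"report"}},
        ]

        def run(available):
            fired = set()
            progress = True
            while progress:
                progress = False
                for node in nodes:
                    if node["name"] not in fired and node["inputs"] <= available:
                        print("firing", node["name"])
                        available |= node["outputs"]   # results become new inputs
                        fired.add(node["name"])
                        progress = True
            return fired

        # Jobs are scheduled before the shot; they fire as data arrives.
        run({"raw_shot"})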

  16. Data triggered data processing at the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1986-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory the authors schedule jobs to process experimental data to be collected during a five minute shot cycle. The data driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on the networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. The authors report here on details of diagnostic data processing and their experiences.

  17. AntDAS: Automatic Data Analysis Strategy for UPLC-QTOF-Based Nontargeted Metabolic Profiling Analysis.

    Science.gov (United States)

    Fu, Hai-Yan; Guo, Xiao-Ming; Zhang, Yue-Ming; Song, Jing-Jing; Zheng, Qing-Xia; Liu, Ping-Ping; Lu, Peng; Chen, Qian-Si; Yu, Yong-Jie; She, Yuanbin

    2017-10-17

    High-quality data analysis methodology remains a bottleneck for metabolic profiling analysis based on ultraperformance liquid chromatography-quadrupole time-of-flight mass spectrometry. The present work aims to address this problem by proposing a novel data analysis strategy wherein (1) chromatographic peaks in the UPLC-QTOF data set are automatically extracted by using an advanced multiscale Gaussian smoothing-based peak extraction strategy; (2) a peak annotation stage is used to cluster fragment ions that belong to the same compound. With the aid of the high-resolution mass spectrometer, (3) a time-shift correction across the samples is efficiently performed by a new peak alignment method; (4) components are registered by using a newly developed adaptive network searching algorithm; (5) statistical methods, such as analysis of variance and hierarchical cluster analysis, are then used to identify the underlying marker compounds; finally, (6) compound identification is performed by matching the extracted peak information, involving high-precision m/z and retention time, against our compound library containing more than 500 plant metabolites. A manually designed mixture of 18 compounds is used to evaluate the performance of the method, and all compounds are detected at various concentration levels. The developed method is comprehensively evaluated on an extremely complex plant data set containing more than 2000 components. Results indicate that the performance of the developed method is comparable with XCMS. The MATLAB GUI code is available from http://software.tobaccodb.org/software/antdas.

  18. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O’Donovan, Peter; O’Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation, as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages of the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine’s sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy and the labelled data fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.

  19. An Overview of Automaticity and Implications For Training the Thinking Process

    National Research Council Canada - National Science Library

    Holt, Brian

    2002-01-01

    ...., visual search to battlefield thinking). The results of this examination suggest that automaticity can be developed using consistent rules and extensive practice that vary depending on the type of task...

  20. THE AUTOMATIZATION OF SPEAKING SKILLS IN THE PROCESS OF BUILDING AN ACADEMIC ENGLISH PROFESSIONAL COMPETENCE IN SPEAKING OF PROSPECTIVE MARKETERS

    Directory of Open Access Journals (Sweden)

    Аndriana Onufriv

    2016-12-01

    Full Text Available The issue of the automatization of speaking skills in the process of building an academic English professional competence in speaking of prospective marketers is highlighted. The essence of an academic English professional competence in speaking of prospective marketers has been analyzed. The structure of an academic English professional competence in speaking of prospective marketers (abilities, skills, knowledge, communicative aptitudes) has been studied. The building of speaking skills is based on the acquisition of declarative and procedural knowledge. The presentation is suggested as a leading tool for building an academic English professional competence in speaking of prospective marketers. The ways of automatizing speaking operations in the process of building an academic English professional competence in speaking of prospective marketers have been substantiated. It has been established that the automatization of speaking operations in the process of building an academic English professional competence in speaking of prospective marketers occurs by means of developing phonetic, grammar and lexical speaking skills. The automatization of speaking skills is achieved by performing certain exercises and tasks. These assignments are receptive and reproductive ones (by the criterion of the leading kind of speaking); warming, stereotype-situational and variant-situational ones (by the criterion of the stages of developing skills); and simulative and communicative ones (by the criterion of communication). The most widespread exercises and tasks for developing speaking skills, defined by the criterion of communication, are non-communicative, simulative and communicative ones.

  1. Towards a data processing plane: An automata-based distributed dynamic data processing model

    NARCIS (Netherlands)

    Cushing, R.; Belloum, A.; Bubak, M.; de Laat, C.

    Data processing complexity, partitionability, locality and provenance play a crucial role in the effectiveness of distributed data processing. Dynamics in data processing necessitates effective modeling which allows the understanding and reasoning of the fluidity of data processing. Through

  2. An algorithm for generating data accessibility recommendations for flight deck Automatic Dependent Surveillance-Broadcast (ADS-B) applications

    Science.gov (United States)

    2014-09-09

    Automatic Dependent Surveillance-Broadcast (ADS-B) In technology supports the display of traffic data on Cockpit Displays of Traffic Information (CDTIs). The data are used by flightcrews to perform defined self-separation procedures, such as the in-t...

  3. Composable Data Processing in Environmental Science - A Process View

    NARCIS (Netherlands)

    Wombacher, Andreas

    Data processing in environmental science is essential for doing science. The heterogeneity of data sources, data processing operations and infrastructures results in a lot of manual data and process integration work done by each scientist individually. This is very inefficient and time consuming.

  4. Development of Web Tools for the automatic Upload of Calibration Data into the CMS Condition Data

    Science.gov (United States)

    di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2010-04-01

    This article explains the recent evolution of the Condition Database Application Service. The Condition Database Application Service is part of the condition database system of the CMS experiment, and it is used for handling and monitoring the CMS detector condition data and the corresponding computing resources, such as Oracle databases, storage services and network devices. We deployed a service, the offline Dropbox service, that will be used by the Alignment and Calibration Group in order to upload from the offline network (GPN) the calibration constants produced by running offline analysis.

  5. Deep Learning-Based Data Forgery Detection in Automatic Generation Control

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fengli [Univ. of Arkansas, Fayetteville, AR (United States); Li, Qinghua [Univ. of Arkansas, Fayetteville, AR (United States)

    2017-10-09

    Automatic Generation Control (AGC) is a key control system in the power grid. It is used to calculate the Area Control Error (ACE) based on frequency and tie-line power flow between balancing areas, and then adjust power generation to maintain the power system frequency in an acceptable range. However, attackers might inject malicious frequency or tie-line power flow measurements to mislead AGC into false generation corrections, which would harm power grid operation. Such attacks are hard to detect since they do not violate physical power system models. In this work, we propose algorithms based on neural networks and the Fourier transform to detect data forgery attacks in AGC. Different from the few previous works that rely on accurate load prediction to detect data forgery, our solution only uses the ACE data already available in existing AGC systems. In particular, our solution learns the normal patterns of the ACE time series and detects abnormal patterns caused by artificial attacks. Evaluations on a real ACE dataset show that our methods have high detection accuracy.

  6. Automatic cardiac gating of small-animal PET from list-mode data

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L.; Udias, J.M. [Universidad Complutense de Madrid Univ. (Spain). Grupo de Fisica Nuclear; Vaquero, J.J.; Desco, M. [Universidad Carlos III de Madrid (Spain). Dept. de Bioingenieria e Ingenieria Aeroespacial; Cusso, L. [Hospital General Universitario Gregorio Maranon, Madrid (Spain). Unidad de Medicina y Cirugia Experimental

    2011-07-01

    This work presents a method to obtain the cardiac gating signal automatically in a PET study of rats, by employing the variation with time of the counts in the cardiac region, which can be extracted from list-mode data. In an initial step, the cardiac region is identified in the image space by backward-projecting a small fraction of the acquired data and studying the variation with time of the counts in each voxel inside said region, with frequencies within 2 and 8 Hz. The region obtained corresponds accurately to the left ventricle of the heart of the rat. In a second step, the lines-of-response (LORs) connected with this region are found by forward-projecting this region. The time variation of the number of counts in these LORs contains the cardiac motion information that we want to extract. This variation of counts with time is band-pass filtered to reduce noise, and the time signal so obtained is used to create the gating signal. The result was compared with a cardiac gating signal obtained from an ECG acquired simultaneously with the PET study. Reconstructed gated images obtained from both gating signals are similar. The proposed method demonstrates that valid cardiac gating signals can be obtained for rats from PET list-mode data. (orig.)
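
    A small sketch of the band-pass filtering step described above: the count rate in the cardiac LORs is filtered to the 2-8 Hz band and its upward zero crossings are taken as gate triggers. The sampling rate and the count-rate trace are synthetic placeholders.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 100.0                      # counts histogrammed into 10 ms bins (assumed)
        t = np.arange(0, 30, 1 / fs)
        # Synthetic count-rate trace: ~5 Hz cardiac modulation plus noise
        counts = 1000 + 50 * np.sin(2 * np.pi * 5.0 * t) + np.random.normal(0, 20, t.size)

        # Band-pass 2-8 Hz to isolate the cardiac component
        b, a = butter(4, [2 / (fs / 2), 8 / (fs / 2)], btype="band")
        cardiac = filtfilt(b, a, counts)

        # Gate triggers at upward zero crossings of the filtered signal
        triggers = np.where((cardiac[:-1] < 0) & (cardiac[1:] >= 0))[0] / fs
        print("estimated heart rate (bpm):",
              60 * (len(triggers) - 1) / (triggers[-1] - triggers[0]))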

  7. Chemical name extraction based on automatic training data generation and rich feature set.

    Science.gov (United States)

    Yan, Su; Spangler, W Scott; Chen, Ying

    2013-01-01

    The automation of extracting chemical names from text has significant value to biomedical and life science research. A major barrier in this task is the difficulty of getting sizable, good-quality data to train a reliable entity extraction model. Another difficulty is the selection of informative features of chemical names, since comprehensive domain knowledge on chemistry nomenclature is required. Leveraging random text generation techniques, we explore the idea of automatically creating training sets for the task of chemical name extraction. Assuming the availability of an incomplete list of chemical names, called a dictionary, we are able to generate well-controlled, random, yet realistic chemical-like training documents. We statistically analyze the construction of chemical names based on the incomplete dictionary, and propose a series of new features, without relying on any domain knowledge. Compared to state-of-the-art models learned from manually labeled data and domain knowledge, our solution shows better or comparable results in annotating real-world data with less human effort. Moreover, we report an interesting observation about the language of chemical names: both the structural and semantic components of chemical names follow a Zipfian distribution, which resembles many natural languages.

  8. SU-D-BRD-07: Automatic Patient Data Audit and Plan Quality Check to Support ARIA and Eclipse

    Energy Technology Data Exchange (ETDEWEB)

    Li, X; Li, H; Wu, Y; Mutic, S; Yang, D [Washington University School of Medicine, St. Louis, MO (United States)

    2014-06-01

    Purpose: To ensure patient safety and treatment quality in RT departments that use Varian ARIA and Eclipse, we developed a computer software system and interface functions that allow previously developed electronic chart checking (EcCk) methodologies to support these Varian systems. Methods: ARIA and Eclipse store most patient information in an MSSQL database. We studied the contents of the hundreds of database tables and identified the data elements used for patient treatment management and treatment planning. Interface functions were developed in both C# and MATLAB to support data access from ARIA and Eclipse servers using SQL queries. These functions and additional data processing functions allowed the existing rules and logic from EcCk to support ARIA and Eclipse. Dose and structure information is important for plan quality checks; however, it is not stored in the MSSQL database but as files in Varian private formats, and cannot be processed by external programs. We have therefore implemented a service program, which uses the DB Daemon and File Daemon services on the ARIA server to automatically and seamlessly retrieve dose and structure data as DICOM files. This service was designed to 1) continuously monitor data access requests from EcCk programs, 2) translate the requests for the ARIA daemon services to obtain dose and structure DICOM files, and 3) monitor the process and return the obtained DICOM files to EcCk programs for plan quality check purposes. Results: EcCk, which was previously designed to support only MOSAIQ TMS and Pinnacle TPS, can now support Varian ARIA and Eclipse. The new EcCk software has been tested and worked well in physics new-start plan checks, IMRT plan integrity checks and plan quality checks. Conclusion: Methods and computer programs have been implemented to allow EcCk to support Varian ARIA and Eclipse systems. This project was supported by a research grant from Varian Medical Systems.

  9. Automatic processing of semantic relations in fMRI: neural activation during semantic priming of taxonomic and thematic categories.

    Science.gov (United States)

    Sachs, Olga; Weis, Susanne; Zellagui, Nadia; Huber, Walter; Zvyagintsev, Mikhail; Mathiak, Klaus; Kircher, Tilo

    2008-07-07

    Most current models of knowledge organization are based on hierarchical or taxonomic categories (animals, tools). Another important organizational pattern is thematic categorization, i.e. categories held together by external relations, a unifying scene or event (car and garage). The goal of this study was to compare the neural correlates of these categories under automatic processing conditions that minimize strategic influences. We used fMRI to examine neural correlates of semantic priming for category members with a short stimulus onset asynchrony (SOA) of 200 ms as subjects performed a lexical decision task. Four experimental conditions were compared: thematically related words (car-garage); taxonomically related (car-bus); unrelated (car-spoon); non-word trials (car-derf). We found faster reaction times for related than for unrelated prime-target pairs for both thematic and taxonomic categories. However, the size of the thematic priming effect was greater than that of the taxonomic. The imaging data showed signal changes for the taxonomic priming effects in the right precuneus, postcentral gyrus, middle frontal and superior frontal gyri and thematic priming effects in the right middle frontal gyrus and anterior cingulate. The contrast of neural priming effects showed larger signal changes in the right precuneus associated with the taxonomic but not with thematic priming response. We suggest that the greater involvement of precuneus in the processing of taxonomic relations indicates their reduced salience in the knowledge structure compared to more prominent thematic relations.

  10. 13C-detected NMR experiments for automatic resonance assignment of IDPs and multiple-fixing SMFT processing

    International Nuclear Information System (INIS)

    Dziekański, Paweł; Grudziąż, Katarzyna; Jarvoll, Patrik; Koźmiński, Wiktor; Zawadzka-Kazimierczuk, Anna

    2015-01-01

    Intrinsically disordered proteins (IDPs) have recently attracted much interest due to their role in many biological processes, including signaling and regulation mechanisms. High-dimensional 13C direct-detected NMR experiments have proven exceptionally useful in the case of IDPs, providing spectra with superior peak dispersion. Here, two such novel experiments recorded with non-uniform sampling are introduced: 5D HabCabCO(CA)NCO and 5D HNCO(CA)NCO. Together with the 4D (HACA)CON(CA)NCO, an extension of the previously published 3D experiments (Pantoja-Uceda and Santoro in J Biomol NMR 59:43–50, 2014. doi: 10.1007/s10858-014-9827-1), they form a set allowing complete and reliable resonance assignment of difficult IDPs. The processing is performed with sparse multidimensional Fourier transform based on the concept of restricting (fixing) some of the spectral dimensions to a priori known resonance frequencies. In our study, a multiple-fixing method was developed that allows easy access to spectral data. The experiments were tested on a resolution-demanding alpha-synuclein sample. Due to the superior peak dispersion in the high-dimensional spectra and the availability of sequential connectivities between four consecutive residues, the overwhelming majority of resonances could be assigned automatically using the TSAR program.

  11. Processing multidimensional nuclear physics data

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Modern Ge detector arrays for gamma-ray spectroscopy are producing data sets unprecedented in size and event multiplicity. Gammasphere, the DOE-sponsored array, has the following characteristics: (1) high granularity (110 detectors); (2) high efficiency (10%); and (3) precision energy measurements (ΔE/E = 0.2%). Characteristics of the detector line shape and the data set, and the standard practice in the nuclear physics community for extracting nuclear gamma-ray cascades from the 4096 × 4096 × 4096 data cube, will be discussed.

  12. Automatic calibration of a global flow routing model in the Amazon basin using virtual SWOT data

    Science.gov (United States)

    Rogel, P. Y.; Mouffe, M.; Getirana, A.; Ricci, S. M.; Lion, C.; Mognard, N. M.; Biancamaria, S.; Boone, A.

    2012-12-01

    The Surface Water and Ocean Topography (SWOT) wide swath altimetry mission will provide global coverage of surface water elevation, which will be used to help correct water height and discharge predictions from hydrological models. Here, the aim is to investigate the use of virtually generated SWOT data to improve water height and discharge simulation through calibration of model parameters (such as river width, river depth and roughness coefficient). In this work, we use the HyMAP model to estimate water height and discharge on the Amazon catchment area. Before reaching the river network, surface and subsurface runoff are delayed by a set of linear and independent reservoirs. The flow routing is performed by the kinematic wave equation. Since the SWOT mission has not yet been launched, virtual SWOT data are generated with a set of true parameters for HyMAP as well as measurement errors from a SWOT data simulator (i.e. a twin experiment approach is implemented). These virtual observations are used to calibrate key parameters of HyMAP through the minimization of a cost function defining the difference between the simulated and observed water heights over a one-year simulation period. The automatic calibration procedure is achieved using the MOCOM-UA multicriteria global optimization algorithm as well as the local optimization algorithm BC-DFO, which is considered a computationally cheaper alternative. First, to reduce the computational cost of the calibration procedure, each spatially distributed parameter (Manning coefficient, river width and river depth) is corrupted through multiplication by a spatially uniform factor, which is the only factor optimized. In this case, it is shown that, when the measurement errors are small, the true water heights and discharges are easily retrieved. Because of equifinality, the true parameters are not always identified. A spatial correction of the model parameters is then investigated and the domain is divided into 4 regions
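
    A highly simplified twin-experiment sketch of the calibration idea: a uniform multiplicative factor on a model parameter is recovered by minimizing a squared-difference cost between "observed" and simulated water heights. The toy model and the scipy optimizer stand in for HyMAP and MOCOM-UA/BC-DFO, which are not reproduced here.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def toy_model(roughness_factor, t):
            """Stand-in for the hydrological model: water height versus time
            for a given multiplicative correction on the Manning coefficient."""
            return 5.0 * np.exp(-0.1 * roughness_factor * t) + 2.0

        t = np.linspace(0, 365, 60)                 # sparse revisits over one year
        true_factor = 1.3
        obs = toy_model(true_factor, t) + np.random.normal(0, 0.05, t.size)  # virtual SWOT data

        cost = lambda f: np.sum((toy_model(f, t) - obs) ** 2)
        result = minimize_scalar(cost, bounds=(0.5, 2.0), method="bounded")
        print("recovered factor:", round(result.x, 3), "(true:", true_factor, ")")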

  13. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on a new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which checks the correct operation of the connection line as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to the fiducial marks; plotting track fragments in the chambers from the sparks. The BESM-6 computer has been chosen for the filtration. The complex of filtration programs is shaped as a RAM-file; the required version of the program is assembled by the PATCHY program. The subprograms performing the greater part of the calculations are written in the autocode MADLEN, the rest of the subprograms in FORTRAN and ALGOL. The filtration time for one image is 1.2-2 s of computation. The BESM-6 computer processes up to 12 thousand images a day.

  14. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    Science.gov (United States)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

    The quality of Remote Sensing data is an important parameter that defines the extent of its usability in various applications. The data from Remote Sensing satellites are received as raw data frames at the ground station. These data may be corrupted by losses due to interference during data transmission, data acquisition and sensor anomalies. Thus it is important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective action and minimization of product rejection. Manual screening of raw images is time consuming and not very accurate. In this paper, an automated process for the identification and quantification of losses in raw data, such as pixel dropout, line loss and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
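
    A minimal sketch of how pixel dropouts and line losses might be quantified in a raw frame; the zero-value convention for missing samples and the quality score are assumptions for illustration, not the actual IRS format or metric.

        import numpy as np

        def assess_frame(frame, dropout_value=0):
            """Count fully lost scan lines and isolated pixel dropouts in a raw frame."""
            missing = frame == dropout_value
            lost_lines = int(np.sum(missing.all(axis=1)))            # whole scan line missing
            pixel_dropouts = int(missing.sum() - lost_lines * frame.shape[1])
            quality = 100.0 * (1 - missing.sum() / frame.size)        # % of valid pixels
            return {"lost_lines": lost_lines, "pixel_dropouts": pixel_dropouts,
                    "quality_percent": round(quality, 2)}

        # Synthetic 512x512 frame with one lost line and some scattered dropouts
        frame = np.random.randint(1, 1024, size=(512, 512))
        frame[100, :] = 0
        frame[np.random.randint(0, 512, 50), np.random.randint(0, 512, 50)] = 0
        print(assess_frame(frame))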

  15. Simple Approaches to Improve the Automatic Inventory of ZEBRA Crossing from Mls Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    The city management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling turns of great relevance when the city has to be managed making use of geospatial databases or Geographic Information Systems. On the other hand, laser scanning technology has experienced a significant growth in the last years, and particularly, terrestrial mobile laser scanning platforms are being more and more used with inventory purposes in both cities and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, this data is not directly exploitable by management systems constraining the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossing. The algorithm is divided in three main steps: road segmentation (based on a PCA analysis of the points contained in each cycle of collected by a mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough Transform and logical constrains for line classification). After evaluating different datasets collected in three cities located in Northwest Spain (comprising 25 strips with 30 visible zebra crossings) a completeness of 83% was achieved.

  16. Automatic extraction of property norm-like data from large text corpora.

    Science.gov (United States)

    Kelly, Colin; Devereux, Barry; Korhonen, Anna

    2014-01-01

    Traditional methods for deriving property-based representations of concepts from text have focused on either extracting only a subset of possible relation types, such as hyponymy/hypernymy (e.g., car is-a vehicle) or meronymy/metonymy (e.g., car has wheels), or unspecified relations (e.g., car--petrol). We propose a system for the challenging task of automatic, large-scale acquisition of unconstrained, human-like property norms from large text corpora, and discuss the theoretical implications of such a system. We employ syntactic, semantic, and encyclopedic information to guide our extraction, yielding concept-relation-feature triples (e.g., car be fast, car require petrol, car cause pollution), which approximate property-based conceptual representations. Our novel method extracts candidate triples from parsed corpora (Wikipedia and the British National Corpus) using syntactically and grammatically motivated rules, then reweights triples with a linear combination of their frequency and four statistical metrics. We assess our system output in three ways: lexical comparison with norms derived from human-generated property norm data, direct evaluation by four human judges, and a semantic distance comparison with both WordNet similarity data and human-judged concept similarity ratings. Our system offers a viable and performant method of plausible triple extraction: Our lexical comparison shows comparable performance to the current state-of-the-art, while subsequent evaluations exhibit the human-like character of our generated properties.

  17. Estimating Train Choices of Rail Transit Passengers with Real Timetable and Automatic Fare Collection Data

    Directory of Open Access Journals (Sweden)

    Wei Zhu

    2017-01-01

    Full Text Available An urban rail transit (URT) system is operated according to a relatively punctual schedule, which is one of the most important constraints on a URT passenger’s travel. Thus, the key is to estimate passengers’ train choices, from which passenger route choices as well as the flow distribution on the URT network can be deduced. In this paper we propose a methodology that can estimate an individual passenger’s train choices with real timetable and automatic fare collection (AFC) data. First, we formulate the addressed problem using Manski’s paradigm on modelling choice. Then, an integrated framework for estimating individual passengers’ train choices is developed through a data-driven approach. The approach links each passenger trip to the most feasible train itinerary. An initial case study on the Shanghai metro shows that the proposed approach works well and can be further used for deducing other important operational indicators, such as route choices, passenger flows per section, train load factors, and so forth.

  18. An Extensible Processing Framework for Eddy-covariance Data

    Science.gov (United States)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data collecting networks has led to an increase not only in the available information, but also in the complexity of analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, Ameriflux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 Gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing, and to be used by community members alike. In order to control the processing environment, provide a proficient parallel processing structure, and certify that dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the Github distributed version control system. The capability to process large data sets relies upon efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON-standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
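
    A small illustration of writing flux data into a hierarchical, self-describing HDF5 layout of the kind described above; the group names, attributes and units are made up for the example and are not the NEON standard.

        import numpy as np
        import h5py

        # Hypothetical half-hourly flux results for one site and one day
        time = np.arange(48) * 1800.0
        fc = np.random.normal(-5.0, 2.0, 48)        # CO2 flux values

        with h5py.File("site_fluxes.h5", "w") as f:
            grp = f.create_group("SITE01/dp04/data/fluxCo2")    # directory-like hierarchy
            grp.create_dataset("time", data=time, compression="gzip")
            ds = grp.create_dataset("flux", data=fc, compression="gzip")
            ds.attrs["units"] = "umolCo2 m-2 s-1"                # metadata travels with the data
            f["SITE01"].attrs["tower_height_m"] = 8.5

        with h5py.File("site_fluxes.h5", "r") as f:
            print(list(f["SITE01/dp04/data/fluxCo2"].keys()))
            print(f["SITE01/dp04/data/fluxCo2/flux"].attrs["units"])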

  19. Digital curation: a proposal of a semi-automatic digital object selection-based model for digital curation in Big Data environments

    Directory of Open Access Journals (Sweden)

    Moisés Lima Dutra

    2016-08-01

    Full Text Available Introduction: This work presents a new approach to Digital Curation from a Big Data perspective. Objective: The objective is to propose techniques for digital curation for selecting and evaluating digital objects that take into account the volume, velocity, variety, veracity, and value of the data collected from multiple knowledge domains. Methodology: This is exploratory research of an applied nature, which addresses the research problem in a qualitative way. Heuristics allow this semi-automatic process to be done either by human curators or by software agents. Results: As a result, a model was proposed for searching, processing, evaluating and selecting digital objects to be processed by digital curation. Conclusions: It is possible to use Big Data environments as a source of information resources for Digital Curation; besides, Big Data techniques and tools can support the search and selection process of information resources by Digital Curation.

  20. Parallel processing of genomics data

    Science.gov (United States)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, thus the analysis of this enormous flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face those issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze the data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data, able to cope with high-dimensional data and achieving good response times. The proposed system is able to find statistically significant biological markers that discriminate classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
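
    A bare-bones sketch of parallel per-sample preprocessing followed by a simple per-marker statistical test, using Python's multiprocessing as a stand-in for the parallel framework described in the paper; the data, normalization and test are synthetic placeholders.

        import numpy as np
        from multiprocessing import Pool
        from scipy.stats import ttest_ind

        def preprocess(sample):
            """Per-sample preprocessing (e.g. normalization) - runs in parallel."""
            return (sample - sample.mean()) / (sample.std() + 1e-9)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            responders = [rng.normal(0.0, 1.0, 1000) for _ in range(20)]
            non_responders = [rng.normal(0.2, 1.0, 1000) for _ in range(20)]

            with Pool() as pool:                       # parallel preprocessing of samples
                resp = np.array(pool.map(preprocess, responders))
                nonresp = np.array(pool.map(preprocess, non_responders))

            # Per-marker test: which of the 1000 markers separates the two classes?
            _, pvals = ttest_ind(resp, nonresp, axis=0)
            print("markers with p < 0.01:", int(np.sum(pvals < 0.01)))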

  1. Towards a Modernization Process for Secure Data Warehouses

    Science.gov (United States)

    Blanco, Carlos; Pérez-Castillo, Ricardo; Hernández, Arnulfo; Fernández-Medina, Eduardo; Trujillo, Juan

    Data Warehouses (DW) manage crucial enterprise information used for the decision-making process, which has to be protected from unauthorized access. However, security constraints are not properly integrated into the complete DW development process, being traditionally considered only in the final stages. Furthermore, legacy systems need a reverse engineering process in order to accomplish re-documentation for detecting new security requirements, as well as recovery of the system’s design to enable migration and reuse. Thus, we have proposed a model driven architecture (MDA) for secure DWs which takes into account security issues from the early stages of development and provides automatic transformations between models. This paper completes this architecture by providing an architecture-driven modernization (ADM) process focused on obtaining conceptual security models from legacy OLAP systems.

  2. Automatic detection and agronomic characterization of olive groves using high-resolution imagery and LIDAR data

    Science.gov (United States)

    Caruso, T.; Rühl, J.; Sciortino, R.; Marra, F. P.; La Scalia, G.

    2014-10-01

    The Common Agricultural Policy of the European Union grants subsidies for olive production. Areas of intensified olive farming will be of major importance for the increasing demand for oil production in the coming decades, and countries with a high ratio of intensively and super-intensively managed olive groves will be more competitive than others, since they are able to reduce production costs. It can be estimated that about 25-40% of Sicilian oliviculture must be defined as "marginal". Modern olive cultivation systems, which permit the mechanization of pruning and harvest operations, are limited. Agronomists, landscape planners, policy decision-makers and other professionals have a growing need for accurate and cost-effective information on land use in general and on agronomic parameters in particular. The availability of high spatial resolution imagery has enabled researchers to propose analysis tools at the agricultural parcel and tree level. In our study, we test the performance of WorldView-2 imagery for the detection of olive groves and the delineation of olive tree crowns, using an object-oriented approach to image classification combined with LIDAR data. We selected two sites, which differ in their environmental conditions and in the agronomic parameters of olive grove cultivation. The main advantage of the proposed methodology is the small quantity of input data required and its automatability. However, it should be applied in other study areas to test whether the good accuracy assessment results can be confirmed. Data extracted by the proposed methodology can be used as input data for decision-making support systems for olive grove management.

  3. X-ray data processing

    OpenAIRE

    Powell, Harold R.

    2017-01-01

    The method of molecular structure determination by X-ray crystallography is a little over a century old. The history is described briefly, along with developments in X-ray sources and detectors. The fundamental processes involved in measuring diffraction patterns on area detectors, i.e. autoindexing, refining crystal and detector parameters, integrating the reflections themselves and putting the resultant measurements on to a common scale are discussed, with particular reference to the most c...

  4. Intelligent earthquake data processing for global adjoint tomography

    Science.gov (United States)

    Chen, Y.; Hill, J.; Li, T.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Due to the increased computational capability afforded by modern and future computing architectures, the seismology community is demanding a more comprehensive understanding of the full waveform information from recorded earthquake seismograms. Global waveform tomography is a complex workflow that matches observed seismic data with synthesized seismograms by iteratively updating the earth model parameters based on the adjoint state method. This methodology allows us to compute a very accurate model of the earth's interior. The synthetic data are simulated by solving the wave equation in the entire globe using a spectral-element method. In order to ensure inversion accuracy and stability, both the synthesized and observed seismograms must be carefully pre-processed. Because the scale of the inversion problem is extremely large and there is a very large volume of data to be both read and written, an efficient and reliable pre-processing workflow must be developed. We are investigating intelligent algorithms based on a machine-learning (ML) framework that will automatically tune parameters for the data processing chain. One straightforward application of ML in data processing is to classify all possible misfit calculation windows into usable and unusable ones, based on intelligent ML models such as neural networks, support vector machines or principal component analysis. The intelligent earthquake data processing framework will enable the seismology community to compute global waveform tomography using seismic data from an arbitrarily large number of earthquake events in the fastest, most efficient way.

  5. Processing data base information having nonwhite noise

    Science.gov (United States)

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
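
    A schematic numeric sketch of the pipeline described above: form a difference signal between a sensor and a reference, fit its dominant Fourier modes to build a composite, subtract to obtain a nearly white residual, and run a simple sequential probability ratio test on it. The thresholds, the number of retained modes and the reference signal are illustrative, not those of the patented method.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 1024
        t = np.arange(n)
        sensor = 10 + 0.5 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 0.1, n)
        reference = 10 + rng.normal(0, 0.1, n)           # ARMA-style surrogate reference

        diff = sensor - reference                         # difference function data set

        # Composite function: keep the strongest Fourier modes of the difference
        spec = np.fft.rfft(diff)
        keep = np.argsort(np.abs(spec))[-5:]              # 5 dominant modes (illustrative)
        filt = np.zeros_like(spec); filt[keep] = spec[keep]
        composite = np.fft.irfft(filt, n)

        residual = diff - composite                       # should be close to white noise

        # Minimal SPRT on the residual mean (H0: mean 0 vs H1: mean m1)
        sigma2, m1, hi = residual.var(), 0.2, np.log(100.0)
        llr = np.cumsum((m1 / sigma2) * (residual - m1 / 2))
        alarm = int(np.argmax(llr > hi)) if np.any(llr > hi) else None
        print("SPRT alarm index:", alarm)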

  6. X-ray data processing.

    Science.gov (United States)

    Powell, Harold R

    2017-10-31

    The method of molecular structure determination by X-ray crystallography is a little over a century old. The history is described briefly, along with developments in X-ray sources and detectors. The fundamental processes involved in measuring diffraction patterns on area detectors, i.e. autoindexing, refining crystal and detector parameters, integrating the reflections themselves and putting the resultant measurements on to a common scale are discussed, with particular reference to the most commonly used software in the field. © 2017 The Author(s).

  7. Automatic registration of panoramic image sequence and mobile laser scanning data using semantic features

    Science.gov (United States)

    Li, Jianping; Yang, Bisheng; Chen, Chi; Huang, Ronggang; Dong, Zhen; Xiao, Wen

    2018-02-01

    Inaccurate exterior orientation parameters (EoPs) between sensors obtained by pre-calibration lead to failure of the registration between a panoramic image sequence and mobile laser scanning data. To address this challenge, this paper proposes an automatic registration method based on semantic features extracted from panoramic images and point clouds. Firstly, accurate rotation parameters between the panoramic camera and the laser scanner are estimated using GPS- and IMU-aided structure from motion (SfM). The initial EoPs of the panoramic images are obtained at the same time. Secondly, vehicles in the panoramic images are extracted by Faster-RCNN as candidate primitives to be matched with potential corresponding primitives in the point clouds according to the initial EoPs. Finally, the translation between the panoramic camera and the laser scanner is refined by maximizing the overlapping area of corresponding primitive pairs based on Particle Swarm Optimization (PSO), resulting in a finer registration between the panoramic image sequence and the point clouds. Two challenging urban scenes were used to assess the proposed method, and the final registration errors for both scenes were less than three pixels, which demonstrates a high level of automation, robustness and accuracy.

  8. AUTOMATIC THICKNESS AND VOLUME ESTIMATION OF SPRAYED CONCRETE ON ANCHORED RETAINING WALLS FROM TERRESTRIAL LIDAR DATA

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2016-06-01

    Full Text Available When ground conditions are weak, particularly in free-formed tunnel linings or retaining walls, sprayed concrete can be applied on the exposed surfaces immediately after excavation for shotcreting rock outcrops. In these situations, shotcrete is normally applied together with rock bolts and mesh, thereby supporting the loose material that causes many of the small ground falls. On the other hand, contractors want to determine the thickness and volume of sprayed concrete for both technical and economic reasons: to guarantee its structural strength, but also to avoid delivering excess material that they will not be paid for. In this paper, we first introduce a terrestrial-LiDAR-based method for the automatic detection of rock bolts, as typically used in anchored retaining walls. These ground support elements are segmented based on their geometry and serve as control points for the co-registration of two successive scans, before and after shotcreting. We then compare both point clouds to estimate the sprayed concrete thickness and the expended volume on the wall. This novel methodology is demonstrated on repeated scan data from a retaining wall in the city of Vigo (Spain), resulting in a rock bolt detection rate of 91%, which permits obtaining detailed thickness information and calculating a total volume of 3597 litres of concrete. These results verify the effectiveness of the developed approach, increasing productivity and improving on previous empirical proposals for real-time thickness estimation.

  9. The effect of a low-speed automatic brake system estimated from real life data.

    Science.gov (United States)

    Isaksson-Hellman, Irene; Lindman, Magdalena

    2012-01-01

    A substantial part of all traffic accidents involving passenger cars are rear-end collisions and most of them occur at low speed. Auto Brake is a feature that has been launched in several passenger car models during the last few years. City Safety is a technology designed to help the driver mitigate, and in certain situations avoid, rear-end collisions at low speed by automatically braking the vehicle.Studies have been presented that predict promising benefits from these kinds of systems, but few attempts have been made to show the actual effect of Auto Brake. In this study, the effect of City Safety, a standard feature on the Volvo XC60 model, is calculated based on insurance claims data from cars in real traffic crashes in Sweden. The estimated claim frequency of rear-end frontal collisions measured in claims per 1,000 insured vehicle years was 23% lower for the City Safety equipped XC60 model than for other Volvo models without the system.

  10. Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    2016-07-01

    Full Text Available Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense the surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as a multiple target situation and a clutter edge environment, can dramatically affect the radar signal detection performance. In order to improve the radar signal detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell averaging CFAR detector. The proposed CFAR detector does not require any prior information about the background environment and uses the hypothesis test of the first-order difference (FOD result of ordered data to reject the unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, thus getting better radar signal detection performance. The simulation results show that the FOD-CFAR detector provides low loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, the measured results of a low-flying helicopter validate the basic performance of the proposed method.
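
    A compact sketch of the censoring idea described above: the reference cells are sorted, the first-order difference of the ordered samples flags an abrupt jump (interfering targets or a clutter edge), cells beyond the jump are censored, and the remaining cells give the background estimate that scales the detection threshold. The jump test and scaling factor here are simple placeholders, not the detector derived in the paper.

        import numpy as np

        def fod_censored_cfar(reference_cells, cut, alpha=8.0, jump_factor=3.0):
            """Automatic censoring CFAR: censor ordered cells after a large
            first-order difference, then threshold the cell under test (CUT)."""
            ordered = np.sort(reference_cells)
            fod = np.diff(ordered)
            # Flag the first difference much larger than the typical spacing
            jumps = np.where(fod > jump_factor * np.median(fod) + 1e-12)[0]
            n_keep = (jumps[0] + 1) if jumps.size else ordered.size
            noise_est = ordered[:n_keep].mean()          # background power estimate
            return cut > alpha * noise_est, noise_est

        # Homogeneous noise plus two interfering targets in the reference window
        rng = np.random.default_rng(3)
        ref = rng.exponential(1.0, 32)
        ref[[5, 20]] = 40.0                              # interferers that would bias a plain mean
        detected, est = fod_censored_cfar(ref, cut=12.0)
        print("detection:", detected, " noise estimate:", round(est, 2))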

  11. Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety.

    Science.gov (United States)

    Jiang, Wen; Huang, Yulin; Yang, Jianyu

    2016-07-08

    Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense the surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as a multiple target situation and a clutter edge environment, can dramatically affect the radar signal detection performance. In order to improve the radar signal detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell averaging CFAR detector. The proposed CFAR detector does not require any prior information about the background environment and uses the hypothesis test of the first-order difference (FOD) result of ordered data to reject the unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, thus getting better radar signal detection performance. The simulation results show that the FOD-CFAR detector provides low loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, the measured results of a low-flying helicopter validate the basic performance of the proposed method.

  12. Semi-Automatic Detection of Swimming Pools from Aerial High-Resolution Images and LIDAR Data

    Directory of Open Access Journals (Sweden)

    Borja Rodríguez-Cuenca

    2014-03-01

    Full Text Available Bodies of water, particularly swimming pools, are land covers of high interest. Their maintenance involves energy costs that authorities must take into consideration. In addition, swimming pools are important water sources for firefighting. However, they also provide a habitat for mosquitoes to breed, potentially posing a serious health threat of mosquito-borne disease. This paper presents a novel semi-automatic method of detecting swimming pools in urban environments from aerial images and LIDAR data. A new index for detecting swimming pools is presented (the Normalized Difference Swimming Pools Index), which is combined with three other decision indices using the Dempster–Shafer theory to determine the locations of swimming pools. The proposed method was tested in an urban area of the city of Alcalá de Henares in Madrid, Spain. The method detected all existing swimming pools in the studied area with an overall accuracy of 99.86%, similar to the results obtained by support vector machine (SVM) supervised classification.

  13. The Effect of a Low-Speed Automatic Brake System Estimated From Real Life Data

    Science.gov (United States)

    Isaksson-Hellman, Irene; Lindman, Magdalena

    2012-01-01

    A substantial part of all traffic accidents involving passenger cars are rear-end collisions and most of them occur at low speed. Auto Brake is a feature that has been launched in several passenger car models during the last few years. City Safety is a technology designed to help the driver mitigate, and in certain situations avoid, rear-end collisions at low speed by automatically braking the vehicle. Studies have been presented that predict promising benefits from these kinds of systems, but few attempts have been made to show the actual effect of Auto Brake. In this study, the effect of City Safety, a standard feature on the Volvo XC60 model, is calculated based on insurance claims data from cars in real traffic crashes in Sweden. The estimated claim frequency of rear-end frontal collisions measured in claims per 1,000 insured vehicle years was 23% lower for the City Safety equipped XC60 model than for other Volvo models without the system. PMID:23169133

  14. Front-end data processing the SLD data acquisition system

    International Nuclear Information System (INIS)

    Nielsen, B.S.

    1986-07-01

    The data acquisition system for the SLD detector will make extensive use of parallelism at the front-end level. Fastbus acquisition modules are being built with powerful processing capabilities for calibration, data reduction and further pre-processing of the large amount of analog data handled by each module. This paper describes the read-out electronics chain and the data pre-processing system adopted for most of the detector channels, exemplified by the central drift chamber waveform digitization and processing system.

  15. Review of Automatic Feature Extraction from High-Resolution Optical Sensor Data for UAV-Based Cadastral Mapping

    OpenAIRE

    Sophie Crommelinck; Rohan Bennett; Markus Gerke; Francesco Nex; Michael Ying Yang; George Vosselman

    2016-01-01

    Unmanned Aerial Vehicles (UAVs) have emerged as a rapid, low-cost and flexible acquisition system that appears feasible for application in cadastral mapping: high-resolution imagery, acquired using UAVs, enables a new approach for defining property boundaries. However, UAV-derived data are arguably not exploited to its full potential: based on UAV data, cadastral boundaries are visually detected and manually digitized. A workflow that automatically extracts boundary features from UAV data cou...

  16. Optical Data Processing in Europe,

    Science.gov (United States)

    1981-12-31

    conjugate reflection optics and Faraday isolator cells were fabricated and used to separate the different multiple fringe patterns present. Recording of ellip...statistical optical pattern recognition. Duvernoy has used K-L transforms and chromatic filtering with a hyperspace analysis to do data clustering...polarized input light, and the small (40 Hz) frequency bandpass (these seem to be the major problems). The device tested had 80 µm cells with 10 pm

  17. Instrumental development and data processing

    International Nuclear Information System (INIS)

    Franzen, J.

    1978-01-01

    A review of recent developments in mass spectrometry instrumentation is presented under the following headings: introduction (scope of mass spectrometry compared with neighbouring fields); ion sources and ionization techniques; spectrometers (instrumental developments); measuring procedures; coupling techniques; data systems; conclusions (that mass spectrometry should have a broader basis and that there would be mutual profit from a better penetration of mass spectrometry into fields of routine application). (U.K.)

  18. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  19. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.

    2015-01-01

    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving

  20. Development of automatic radiographic inspection system using digital image processing and artificial intelligence

    International Nuclear Information System (INIS)

    Itoga, Kouyu; Sugimoto, Koji; Michiba, Koji; Kato, Yuhei; Sugita, Yuji; Onda, Katsuhiro.

    1991-01-01

    The application of computers to welding inspection is expanding rapidly. The applications can be classified into the collection, analysis and processing of data; the graphic display of results; the distinction of the kinds of defects; and the evaluation of the harmfulness of defects with the judgement of acceptance or rejection. The application of computer techniques to the automation of data collection was realized at a relatively early stage. Data processing and the graphic display of results are the techniques in progress now, and the application of artificial intelligence to the distinction of the kinds of defects and the evaluation of harmfulness is expected to expand rapidly. In order to computerize radiographic inspection, the abilities of image processing technology and knowledge engineering must be given to computers. The object of this system is butt joints made by arc welding of steel materials of up to 30 mm thickness. The system digitizes the radiographs, assesses transmissivity and gradation by image processing, and, only for those radiographs whose picture quality satisfies the standard, carries out the extraction of defect images, their display, the distinction of the kinds of defects and the final judgement. The techniques of image processing, the knowledge for distinguishing the kinds of defects and the concept of the practical system are reported. (K.I.)

  1. Automatic Sky View Factor Estimation from Street View Photographs—A Big Data Approach

    Directory of Open Access Journals (Sweden)

    Jianming Liang

    2017-04-01

    Full Text Available Hemispherical (fisheye) photography is a well-established approach for estimating the sky view factor (SVF). High-resolution urban models from LiDAR and oblique airborne photogrammetry can provide continuous SVF estimates over a large urban area, but such data are not always available and are difficult to acquire. Street view panoramas have become widely available in urban areas worldwide: Google Street View (GSV) maintains a global network of panoramas excluding China and several other countries; Baidu Street View (BSV) and Tencent Street View (TSV) focus their panorama acquisition efforts within China, and have covered hundreds of cities therein. In this paper, we approach this issue from a big data perspective by presenting and validating a method for automatic estimation of SVF from massive amounts of street view photographs. Comparisons were made with SVF estimates derived from two independent sources: a LiDAR-based Digital Surface Model (DSM) and an oblique airborne photogrammetry-based 3D city model (OAP3D), resulting in a correlation coefficient of 0.863 and 0.987, respectively. The comparisons demonstrated the capacity of the proposed method to provide reliable SVF estimates. Additionally, we present an application of the proposed method with about 12,000 GSV panoramas to characterize the spatial distribution of SVF over Manhattan Island in New York City. Although this is a proof-of-concept study, it has shown the potential of the proposed approach to assist urban climate and urban planning research. However, further development is needed before this approach can be finally delivered to the urban climate and urban planning communities for practical applications.
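
    The abstract does not spell out the SVF formula used; the sketch below shows one common way to turn a binary sky mask in an (assumed) equiangular fisheye projection into an SVF value by weighting zenith-angle rings by their projected solid angle. The synthetic mask stands in for a sky mask segmented from a street view panorama.

```python
# Minimal sketch of sky-view-factor (SVF) estimation from a binary sky mask in an
# equiangular fisheye projection (radius proportional to zenith angle). The
# projection model and the synthetic mask below are assumptions for illustration;
# the paper derives its sky masks from street view panoramas.
import numpy as np

def sky_view_factor(sky_mask, n_rings=36):
    """sky_mask: square boolean array, True where the pixel shows sky."""
    h, w = sky_mask.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    radius = min(cx, cy)
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx) / radius          # 0 at zenith, 1 at horizon
    zenith = r * (np.pi / 2.0)                        # equiangular assumption
    svf = 0.0
    edges = np.linspace(0.0, np.pi / 2.0, n_rings + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_ring = (zenith >= lo) & (zenith < hi)
        if not in_ring.any():
            continue
        sky_fraction = sky_mask[in_ring].mean()
        # Exact cosine-weighted solid-angle weight of the ring:
        svf += sky_fraction * (np.sin(hi) ** 2 - np.sin(lo) ** 2)
    return svf

# Synthetic example: open sky except a band of "buildings" near the horizon.
mask = np.ones((512, 512), dtype=bool)
yy, xx = np.mgrid[0:512, 0:512]
r = np.hypot(yy - 255.5, xx - 255.5) / 255.5
mask[r > 0.7] = False
print(f"SVF = {sky_view_factor(mask):.3f}")
```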

  2. Image processing applied to automatic detection of defects during ultrasonic examination; Imagerie numerique ultrasonore pour la detection automatique de defauts en controle non destructif

    Energy Technology Data Exchange (ETDEWEB)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic BSCAN images obtained in the field of non-destructive testing of welds. The goal is to define what image processing techniques can contribute to improving the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of the data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed. It is based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new complete and automatic method has been devised to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been carried out with standard test pieces.
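
    As a rough illustration of the co-occurrence idea, the sketch below builds a grey-level co-occurrence matrix for a synthetic BSCAN-like image and derives an amplitude threshold from it. The thresholding rule used here is invented for demonstration and is not the criterion developed in the thesis.

```python
# Minimal sketch: a grey-level co-occurrence matrix for an amplitude image and a
# simple, illustrative threshold rule separating rare high-amplitude echo pairs
# from speckle noise. Not the thesis' actual criterion.
import numpy as np

def cooccurrence_matrix(image, levels=16):
    """Normalised co-occurrence counts for horizontally adjacent pixel pairs."""
    q = np.floor(image.astype(float) / (image.max() + 1e-9) * (levels - 1)).astype(int)
    mat = np.zeros((levels, levels), dtype=float)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    np.add.at(mat, (left, right), 1.0)
    return mat / mat.sum()

def threshold_from_matrix(mat, tail_mass=0.01):
    """Illustrative rule: lowest level t such that pairs with both pixels at or
    above t account for less than `tail_mass` of all pairs."""
    for t in range(mat.shape[0]):
        if mat[t:, t:].sum() < tail_mass:
            return t
    return mat.shape[0] - 1

rng = np.random.default_rng(0)
bscan = rng.rayleigh(scale=10.0, size=(128, 256))      # speckle-like noise
bscan[60:68, 100:140] += 80.0                           # synthetic echo
mat = cooccurrence_matrix(bscan)
t = threshold_from_matrix(mat)
print("illustrative threshold level:", t, "of", mat.shape[0] - 1)
```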

  3. Cart'Eaux: an automatic mapping procedure for wastewater networks using machine learning and data mining

    Science.gov (United States)

    Bailly, J. S.; Delenne, C.; Chahinian, N.; Bringay, S.; Commandré, B.; Chaumont, M.; Derras, M.; Deruelle, L.; Roche, M.; Rodriguez, F.; Subsol, G.; Teisseire, M.

    2017-12-01

    In France, local government institutions must establish a detailed description of wastewater networks. The information should be available, but it remains fragmented (different formats held by different stakeholders) and incomplete. In the "Cart'Eaux" project, a multidisciplinary team, including an industrial partner, develops a global methodology using Machine Learning and Data Mining approaches applied to various types of large data to recover information with the aim of mapping urban sewage systems for hydraulic modelling. Deep learning is first applied using a Convolutional Neural Network to localize manhole covers on 5 cm resolution aerial RGB images. The detected manhole covers are then automatically connected using a tree-shaped graph constrained by industry rules. Based on a Delaunay triangulation, connections are chosen to minimize a cost function depending on pipe length, slope and possible intersection with roads or buildings. A stochastic version of this algorithm is currently being developed to account for positional uncertainty and detection errors, and generate sets of probable networks. As more information is required for hydraulic modeling (slopes, diameters, materials, etc.), text data mining is used to extract network characteristics from data posted on the Web or available through governmental or specific databases. Using an appropriate list of keywords, the web is scoured for documents which are saved in text format. The thematic entities are identified and linked to the surrounding spatial and temporal entities. The methodology is developed and tested on two towns in southern France. The primary results are encouraging: 54% of manhole covers are detected with few false detections, enabling the reconstruction of probable networks. The data mining results are still being investigated. It is clear at this stage that getting numerical values on specific pipes will be challenging. Thus, when no information is found, decision rules will be used to
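
    The deterministic connection step can be pictured as a Delaunay-constrained minimum spanning tree over the detected manhole covers. The sketch below uses pipe length as the only cost term; the project's actual cost function also accounts for slope and for intersections with roads or buildings, and its stochastic variant is not shown.

```python
# Minimal sketch of the graph-reconstruction step: connect detected manhole
# covers with a tree whose candidate edges come from a Delaunay triangulation
# and are chosen to minimise total length.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def reconstruct_network(points):
    """points: (n, 2) array of manhole positions; returns list of (i, j) pipe edges."""
    tri = Delaunay(points)
    n = len(points)
    cost = lil_matrix((n, n))
    for simplex in tri.simplices:
        for a, b in [(0, 1), (1, 2), (0, 2)]:
            i, j = sorted((simplex[a], simplex[b]))
            cost[i, j] = np.linalg.norm(points[i] - points[j])   # edge cost = pipe length
    mst = minimum_spanning_tree(cost).tocoo()
    return list(zip(mst.row, mst.col))

rng = np.random.default_rng(1)
manholes = rng.uniform(0, 500, size=(20, 2))   # fictitious detections, in metres
for i, j in reconstruct_network(manholes):
    print(f"pipe between manhole {i} and manhole {j}")
```

    Replacing the length-only cost with a weighted sum over length, slope and penalties for crossing buildings or leaving the road network would bring the toy closer to the industry-rule constraint described above.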

  4. A prototype for JDEM science data processing

    International Nuclear Information System (INIS)

    Gottschalk, Erik E

    2011-01-01

    Fermilab is developing a prototype science data processing and data quality monitoring system for dark energy science. The purpose of the prototype is to demonstrate distributed data processing capabilities for astrophysics applications, and to evaluate candidate technologies for trade-off studies. We present the architecture and technical aspects of the prototype, including an open source scientific execution and application development framework, distributed data processing, and publish/subscribe message passing for quality control.

  5. Algebraic Meta-Theory of Processes with Data

    Directory of Open Access Journals (Sweden)

    Daniel Gebler

    2013-07-01

    Full Text Available There exists a rich literature of rule formats guaranteeing different algebraic properties for formalisms with a Structural Operational Semantics. Moreover, there exist a few approaches for automatically deriving axiomatizations characterizing strong bisimilarity of processes. To our knowledge, this literature has never been extended to the setting with data (e.g. to model storage and memory). We show how the rule formats for algebraic properties can be exploited in a generic manner in the setting with data. Moreover, we introduce a new approach for deriving sound and ground-complete axiom schemata for a notion of bisimilarity with data, called stateless bisimilarity, based on intuitive auxiliary function symbols for handling the store component. We do restrict, however, the axiomatization to the setting where the store component is only given in terms of constants.

  6. Reproducibility of wrist home blood pressure measurement with position sensor and automatic data storage

    Directory of Open Access Journals (Sweden)

    Nickenig Georg

    2009-05-01

    Full Text Available Abstract Background Wrist blood pressure (BP) devices have physiological limits with regards to accuracy, therefore they were not preferred for home BP monitoring. However some wrist devices have been successfully validated using established validation protocols. Therefore this study assessed the reproducibility of wrist home BP measurement with position sensor and automatic data storage. Methods To compare the reproducibility of three different BP measurement methods: 1) office BP, 2) home BP (Omron wrist device HEM-637 IT with position sensor), 3) 24-hour ambulatory BP (24-h ABPM) (ABPM-04, Meditech, Hun), conventional sphygmomanometric office BP was measured on study days 1 and 7, 24-h ABPM on study days 7 and 14 and home BP between study days 1 and 7 and between study days 8 and 14 in 69 hypertensive and 28 normotensive subjects. The correlation coefficient of each BP measurement method with echocardiographic left ventricular mass index was analyzed. The schedule of home readings was performed according to recently published European Society of Hypertension (ESH) guidelines. Results The reproducibility of home BP measurement analyzed by the standard deviation as well as the squared differences of mean individual differences between the respective BP measurements was significantly higher than the reproducibility of office BP (p Conclusion The short-term reproducibility of home BP measurement with the Omron HEM-637 IT wrist device was superior to the reproducibility of office BP and 24-h ABPM measurement. Furthermore, home BP with the wrist device showed similar correlations to target organ damage as recently reported for upper arm devices. Although wrist devices have to be used cautiously and with defined limitations, the use of validated devices with position sensor according to recently recommended measurement schedules might have the potential to be used for therapy monitoring.

  7. Automatic UAV Image Geo-Registration by Matching UAV Images to Georeferenced Image Data

    Directory of Open Access Journals (Sweden)

    Xiangyu Zhuo

    2017-04-01

    Full Text Available Recent years have witnessed the fast development of UAVs (unmanned aerial vehicles). As an alternative to traditional image acquisition methods, UAVs bridge the gap between terrestrial and airborne photogrammetry and enable flexible acquisition of high resolution images. However, the georeferencing accuracy of UAVs is still limited by the low-performance on-board GNSS and INS. This paper investigates automatic geo-registration of an individual UAV image or UAV image blocks by matching the UAV image(s) with a previously taken georeferenced image, such as an individual aerial or satellite image with a height map attached or an aerial orthophoto with a DSM (digital surface model) attached. As the biggest challenge for matching UAV and aerial images is in the large differences in scale and rotation, we propose a novel feature matching method for nadir or slightly tilted images. The method is comprised of a dense feature detection scheme, a one-to-many matching strategy and a global geometric verification scheme. The proposed method is able to find thousands of valid matches in cases where SIFT and ASIFT fail. Those matches can be used to geo-register the whole UAV image block towards the reference image data. When the reference images offer high georeferencing accuracy, the UAV images can also be geolocalized in a global coordinate system. A series of experiments involving different scenarios was conducted to validate the proposed method. The results demonstrate that our approach achieves not only decimeter-level registration accuracy, but also comparable global accuracy as the reference images.
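
    For orientation, the sketch below shows a conventional feature-matching and RANSAC-homography pipeline with OpenCV's ORB detector. It is only a stand-in: the paper's contribution is precisely a dense detection scheme, one-to-many matching and global geometric verification for the large scale and rotation differences where SIFT and ASIFT fail. File names are hypothetical.

```python
# Baseline sketch of image-to-reference matching with OpenCV (ORB features,
# a Lowe ratio test and RANSAC homography estimation).
import cv2
import numpy as np

def match_to_reference(uav_path, reference_path):
    uav = cv2.imread(uav_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=5000)
    kp1, des1 = orb.detectAndCompute(uav, None)
    kp2, des2 = orb.detectAndCompute(ref, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    raw = matcher.knnMatch(des1, des2, k=2)
    # Keep matches whose best distance clearly beats the second best.
    good = [p[0] for p in raw if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, int(inliers.sum())

# H maps UAV pixel coordinates into the georeferenced image; combined with the
# reference image's geotransform it yields an approximate geolocation of the frame.
# H, n_inliers = match_to_reference("uav_frame.jpg", "ortho_tile.tif")  # hypothetical files
```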

  8. Advisory processes and their descriptive data

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2005-01-01

    Full Text Available Processes are regarded as representative of all firm activities, which also holds for Web-based Advisory Systems. The interpretation of processes by managers and by informaticians is naturally different, which follows from their scientific backgrounds and objectives. Managers connect all firm processes with the firm's prosperity and its ability to compete. They therefore pursue the understanding, modeling and regular improvement of all processes, which should stimulate and evoke the use of process revisions (reengineering). The main role in such process understanding is thus committed to the firm management. Professional computer implementations of processes are the dominant objective of informaticians. In this conception, all processes are understood as real sequences of partial transactions (elementary firm activities) and the data processed by them, regardless of whether a structural or an object-oriented process modeling approach is used. The process and transaction models submitted by informaticians are oriented toward process content, and this content has to be programmed. The firm management represents the main source of the process knowledge used by informaticians. In addition to these two process conceptions there is a different approach based on a description of processes by descriptive data. The descriptive data are oriented not to the process content but to its theoretical conception and real implementation. Processing the descriptive data with special algebra operations can bring many important and easily economically interpreted results.

  9. Automatic Estimation of Live Coffee Leaf Infection based on Image Processing Techniques

    OpenAIRE

    Hitimana, Eric; Gwun, Oubong

    2014-01-01

    Image segmentation is the most challenging issue in computer vision applications. And most difficulties for crops management in agriculture are the lack of appropriate methods for detecting the leaf damage for pests’ treatment. In this paper we proposed an automatic method for leaf damage detection and severity estimation of coffee leaf by avoiding defoliation. After enhancing the contrast of the original image using ...

  10. Sentinel-1 automatic processing chain for volcanic and seismic areas monitoring within the Geohazards Exploitation Platform (GEP)

    Science.gov (United States)

    De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco

    2016-04-01

    these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at putting together data, processing tools and results to make them accessible to the EO scientific community, with particular emphasis to the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of a well-known DInSAR algorithm referred to as Small BAseline Subset (P-SBAS) within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, aimed at automatically processing S1A data for the generation of SBAS displacement time-series. Main characteristics as well as a number of experimental results obtained by using the implemented web tool will be also shown. This work is partially supported by: the RITMARE project of Italian MIUR, the DPC-CNR agreement and the ESA GEP project.

  11. Designing a Method for AN Automatic Earthquake Intensities Calculation System Based on Data Mining and On-Line Polls

    Science.gov (United States)

    Liendo Sanchez, A. K.; Rojas, R.

    2013-05-01

    Seismic intensities can be calculated using the Modified Mercalli Intensity (MMI) scale or the European Macroseismic Scale (EMS-98), among others, which are based on a series of qualitative aspects related to a group of subjective factors that describe human perception, effects on nature or objects and structural damage due to the occurrence of an earthquake. On-line polls allow experts to get an overview of the consequences of an earthquake, without going to the locations affected. However, this could be hard work if the polls are not properly automated. Taking into account that the answers given to these polls are subjective and there is a number of them that have already been classified for some past earthquakes, it is possible to use data mining techniques in order to automate this process and to obtain preliminary results based on the on-line polls. In order to achieve this goal, a predictive model has been used, with a classifier based on supervised learning techniques such as the decision tree algorithm and a group of polls based on the MMI and EMS-98 scales. It summarizes the most important questions of the poll and recursively divides the instance space corresponding to each question (nodes), while each node splits the space depending on the possible answers. Its implementation was done with Weka, a collection of machine learning algorithms for data mining tasks, using the J48 algorithm which is an implementation of the C4.5 algorithm for decision tree models. By doing this, it was possible to obtain a preliminary model able to identify up to 4 different seismic intensities with 73% correctly classified polls. The error obtained is rather high, therefore, we will update the on-line poll in order to improve the results, based on just one scale, for instance the MMI. Besides, the integration of the automatic seismic intensities methodology with a low error probability and a basic georeferencing system will make it possible to generate preliminary isoseismal maps
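
    The classification step can be sketched with scikit-learn's CART decision tree standing in for the Weka J48 (C4.5) classifier named above. The encoded poll answers and intensity labels below are invented; the real training set consists of polls already classified for past earthquakes.

```python
# Sketch of the supervised classification step with a CART decision tree as a
# stand-in for Weka's J48 (C4.5). All data below are invented.
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Each row encodes answers to the most important poll questions
# (e.g. 0 = "not felt" ... 3 = "strong shaking"; 0 = "no damage" ... 2 = "severe").
answers = [
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [2, 1, 0],
    [2, 1, 1], [2, 2, 1], [3, 2, 1], [3, 2, 2],
]
intensity = ["II", "II", "III", "III", "IV", "IV", "V", "V"]   # MMI labels

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(clf, answers, intensity, cv=2)
print("cross-validated accuracy:", scores.mean())

clf.fit(answers, intensity)
print("predicted intensity for a new poll:", clf.predict([[2, 2, 0]])[0])
```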

  12. Automatic classification of atypical lymphoid B cells using digital blood image processing.

    Science.gov (United States)

    Alférez, S; Merino, A; Mujica, L E; Ruiz, M; Bigorra, L; Rodellar, J

    2014-08-01

    There are automated systems for digital peripheral blood (PB) cell analysis, but they operate most effectively in nonpathological blood samples. The objective of this work was to design a methodology to improve the automatic classification of abnormal lymphoid cells. We analyzed 340 digital images of individual lymphoid cells from PB films obtained in the CellaVision DM96: 150 chronic lymphocytic leukemia (CLL) cells, 100 hairy cell leukemia (HCL) cells, and 90 normal lymphocytes (N). We implemented the Watershed Transformation to segment the nucleus, the cytoplasm, and the peripheral cell region. We extracted 44 features and then the clustering Fuzzy C-Means (FCM) was applied in two steps for the lymphocyte classification. The images were automatically clustered in three groups, one of them with 98% of the HCL cells. The set of the remaining cells was clustered again using FCM and texture features. The two new groups contained 83.3% of the N cells and 71.3% of the CLL cells, respectively. The approach has been able to automatically classify with high precision three types of lymphoid cells. The addition of more descriptors and other classification techniques will allow extending the classification to other classes of atypical lymphoid cells. © 2013 John Wiley & Sons Ltd.
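
    A minimal NumPy implementation of Fuzzy C-Means, the clustering algorithm applied in two steps above, is sketched below. The random feature vectors are stand-ins for the 44 morphological and texture descriptors extracted from the cell images.

```python
# Minimal NumPy sketch of Fuzzy C-Means clustering (fuzziness exponent m = 2).
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                    # memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]     # weighted cluster centres
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (dist ** (2.0 / (m - 1.0)))            # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

rng = np.random.default_rng(1)
features = np.vstack([rng.normal(0, 1, (50, 5)),         # e.g. "N-like" cells
                      rng.normal(4, 1, (50, 5)),         # e.g. "CLL-like" cells
                      rng.normal(8, 1, (50, 5))])        # e.g. "HCL-like" cells
centers, memberships = fuzzy_c_means(features, n_clusters=3)
print("hard labels of first 5 cells:", memberships[:5].argmax(axis=1))
```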

  13. ALGORITHM FOR THE AUTOMATIC ESTIMATION OF AGRICULTURAL TREE GEOMETRIC PARAMETERS USING AIRBORNE LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    E. Hadaś

    2016-06-01

    Full Text Available The estimation of dendrometric parameters has become an important issue for the agricultural planning and management. Since the classical field measurements are time consuming and inefficient, Airborne Laser Scanning (ALS) data can be used for this purpose. Point clouds acquired for orchard areas allow to determine orchard structures and geometric parameters of individual trees. In this research we propose an automatic method that allows to determine geometric parameters of individual olive trees using ALS data. The method is based on the α-shape algorithm applied for normalized point clouds. The algorithm returns polygons representing crown shapes. For points located inside each polygon, we select the maximum height and the minimum height and then we estimate the tree height and the crown base height. We use the first two components of the Principal Component Analysis (PCA) as the estimators for crown diameters. The α-shape algorithm requires defining the radius parameter R. In this study we investigated how sensitive the results are to the radius size, by comparing the results obtained with various settings of the R with reference values of estimated parameters from field measurements. Our study area was the olive orchard located in the Castellon Province, Spain. We used a set of ALS data with an average density of 4 points m−2. We noticed, that there was a narrow range of the R parameter, from 0.48 m to 0.80 m, for which all trees were detected and for which we obtained a high correlation coefficient (> 0.9) between estimated and measured values. We compared our estimates with field measurements. The RMSE of differences was 0.8 m for the tree height, 0.5 m for the crown base height, 0.6 m and 0.4 m for the longest and shorter crown diameter, respectively. The accuracy obtained with the method is thus sufficient for agricultural applications.

  14. MarsSI: Martian surface data processing information system

    Science.gov (United States)

    Quantin-Nataf, C.; Lozac'h, L.; Thollot, P.; Loizeau, D.; Bultel, B.; Fernando, J.; Allemand, P.; Dubuffet, F.; Poulet, F.; Ody, A.; Clenet, H.; Leyrat, C.; Harrisson, S.

    2018-01-01

    MarsSI (acronym for Mars System of Information, https://emars.univ-lyon1.fr/MarsSI/) is a web Geographic Information System application which helps manage and process martian orbital data. The MarsSI facility is part of the web portal called PSUP (Planetary SUrface Portal) developed by the Observatories of Paris Sud (OSUPS) and Lyon (OSUL) to provide users with efficient and easy access to data products dedicated to the martian surface. The portal proposes 1) the management and processing of data thanks to MarsSI and 2) the visualization and merging of high level (imagery, spectral, and topographic) products and catalogs via a web-based user interface (MarsVisu). The portal PSUP as well as the facility MarsVisu are detailed in a companion paper (Poulet et al., 2018). The purpose of this paper is to describe the facility MarsSI. From this application, users are able to easily and rapidly select observations, process raw data via automatic pipelines, and get back final products which can be visualized under Geographic Information Systems. Moreover, MarsSI also contains an automatic stereo-restitution pipeline in order to produce Digital Terrain Models (DTM) on demand from HiRISE (High Resolution Imaging Science Experiment) or CTX (Context Camera) image pairs. This application is funded by the European Union's Seventh Framework Programme (FP7/2007-2013) (ERC project eMars, No. 280168) and has been developed in the scope of Mars, but the design is applicable to any other planetary body of the solar system.

  15. Automatic Detection and Classification of Pole-Like Objects for Urban Cartography Using Mobile Laser Scanning Data.

    Science.gov (United States)

    Ordóñez, Celestino; Cabo, Carlos; Sanz-Ablanedo, Enoc

    2017-06-22

    Mobile laser scanning (MLS) is a modern and powerful technology capable of obtaining massive point clouds of objects in a short period of time. Although this technology is nowadays being widely applied in urban cartography and 3D city modelling, it has some drawbacks that need to be avoided in order to strengthen it. One of the most important shortcomings of MLS data is concerned with the fact that it provides an unstructured dataset whose processing is very time-consuming. Consequently, there is a growing interest in developing algorithms for the automatic extraction of useful information from MLS point clouds. This work is focused on establishing a methodology and developing an algorithm to detect pole-like objects and classify them into several categories using MLS datasets. The developed procedure starts with the discretization of the point cloud by means of a voxelization, in order to simplify and reduce the processing time in the segmentation process. In turn, a heuristic segmentation algorithm was developed to detect pole-like objects in the MLS point cloud. Finally, two supervised classification algorithms, linear discriminant analysis and support vector machines, were used to distinguish between the different types of poles in the point cloud. The predictors are the principal component eigenvalues obtained from the Cartesian coordinates of the laser points, the range of the Z coordinate, and some shape-related indexes. The performance of the method was tested in an urban area with 123 poles of different categories. Very encouraging results were obtained, since the accuracy rate was over 90%.
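
    The feature and classification stage can be illustrated as follows: eigenvalues of the covariance of a candidate object's points, the height range and a simple shape index are fed to a support vector machine. The synthetic "poles" and labels below are invented stand-ins for the 123 labelled poles used in the paper.

```python
# Sketch of per-object features (covariance eigenvalues, Z range, linearity)
# classified with an SVM. Training data are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC

def pole_features(points):
    """points: (n, 3) array of laser returns belonging to one candidate object."""
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(points.T)))[::-1]   # λ1 ≥ λ2 ≥ λ3
    z_range = points[:, 2].max() - points[:, 2].min()
    linearity = (eigvals[0] - eigvals[1]) / eigvals[0]
    return [*eigvals, z_range, linearity]

rng = np.random.default_rng(2)

def fake_pole(height):            # thin vertical cluster of points
    return np.column_stack([rng.normal(0, 0.05, 200),
                            rng.normal(0, 0.05, 200),
                            rng.uniform(0, height, 200)])

heights = rng.uniform(3, 10, 30)
X = [pole_features(fake_pole(h)) for h in heights]
y = ["lamp post" if h > 6 else "traffic sign" for h in heights]

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([pole_features(fake_pole(8.0))]))   # likely "lamp post"
```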

  16. Automatic Detection and Classification of Pole-Like Objects for Urban Cartography Using Mobile Laser Scanning Data

    Directory of Open Access Journals (Sweden)

    Celestino Ordóñez

    2017-06-01

    Full Text Available Mobile laser scanning (MLS) is a modern and powerful technology capable of obtaining massive point clouds of objects in a short period of time. Although this technology is nowadays being widely applied in urban cartography and 3D city modelling, it has some drawbacks that need to be avoided in order to strengthen it. One of the most important shortcomings of MLS data is concerned with the fact that it provides an unstructured dataset whose processing is very time-consuming. Consequently, there is a growing interest in developing algorithms for the automatic extraction of useful information from MLS point clouds. This work is focused on establishing a methodology and developing an algorithm to detect pole-like objects and classify them into several categories using MLS datasets. The developed procedure starts with the discretization of the point cloud by means of a voxelization, in order to simplify and reduce the processing time in the segmentation process. In turn, a heuristic segmentation algorithm was developed to detect pole-like objects in the MLS point cloud. Finally, two supervised classification algorithms, linear discriminant analysis and support vector machines, were used to distinguish between the different types of poles in the point cloud. The predictors are the principal component eigenvalues obtained from the Cartesian coordinates of the laser points, the range of the Z coordinate, and some shape-related indexes. The performance of the method was tested in an urban area with 123 poles of different categories. Very encouraging results were obtained, since the accuracy rate was over 90%.

  17. Seismic processing in the inverse data space

    NARCIS (Netherlands)

    Berkhout, A.J.

    2006-01-01

    Until now, seismic processing has been carried out by applying inverse filters in the forward data space. Because the acquired data of a seismic survey is always discrete, seismic measurements in the forward data space can be arranged conveniently in a data matrix (P). Each column in the data matrix

  18. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    Simple/Quick metered data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. Intended for large data sets when analyst has little information about the buildings.

  19. Automatic generation of medium-detailed 3D models of buildings based on CAD data

    NARCIS (Netherlands)

    Dominguez-Martin, B.; Van Oosterom, P.; Feito-Higueruela, F.R.; Garcia-Fernandez, A.L.; Ogayar-Anguita, C.J.

    2015-01-01

    We present the preliminary results of a work in progress which aims to obtain a software system able to automatically generate a set of diverse 3D building models with a medium level of detail, that is, more detailed than a mere parallelepiped, but not as detailed as a complete geometric

  20. Machine Beats Experts: Automatic Discovery of Skill Models for Data-Driven Online Course Refinement

    Science.gov (United States)

    Matsuda, Noboru; Furukawa, Tadanobu; Bier, Norman; Faloutsos, Christos

    2015-01-01

    How can we automatically determine which skills must be mastered for the successful completion of an online course? Large-scale online courses (e.g., MOOCs) often contain a broad range of contents frequently intended to be a semester's worth of materials; this breadth often makes it difficult to articulate an accurate set of skills and knowledge…

  1. Comparative analysis of automatic approaches to building detection from multi-source aerial data

    NARCIS (Netherlands)

    Frontoni, E.; Khoshelham, K.; Nardinocchi, C.; Nedkov, S.; Zingaretti, P.

    2008-01-01

    Automatic building detection has been a hot topic since the early 1990’s. Early approaches were based on a single aerial image. Detecting buildings is a difficult task so it can be more effective when multiple sources of information are obtained and fused. The objective of this paper is to provide a

  2. Data processing framework for decision making

    DEFF Research Database (Denmark)

    Larsen, Jan

    The aim of the talk is (1) to provide insight into some of the issues in data processing and detection systems and (2) to hint at possible solutions using statistical signal processing and machine learning methodologies...

  3. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
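
    A short example of basic Gaussian process regression, the building block the book extends to functional responses and mixed covariates, using scikit-learn with an RBF kernel on a synthetic curve:

```python
# Basic Gaussian process regression on a noisy synthetic curve.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 30)[:, None]                       # input grid
y = np.sin(2 * np.pi * t).ravel() + rng.normal(0, 0.1, 30)

kernel = 1.0 * RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)

t_new = np.linspace(0, 1, 200)[:, None]
mean, std = gpr.predict(t_new, return_std=True)          # posterior mean and uncertainty
print(mean[:3], std[:3])
```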

  4. Data near processing support for climate data analysis

    Science.gov (United States)

    Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils

    2016-04-01

    Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "data download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC WPS-based web processing services. This approach is organized in an open source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chain of climate data analysis packages as well as Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). The DKRZ on the one hand hosts a multi-petabyte climate archive which is integrated, e.g., into the European ENES and worldwide ESGF data infrastructure, and on the other hand hosts an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for bird-house based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted.

  5. Optimal planning and management of underground mines by the interpretation of geological data and their automatic processing during mining. Planification et conduite optimisees des exploitations fondees sur l'interpretation des donnees geologiques et leur traitement automatique en cours de l'exploitation; Rapport final

    Energy Technology Data Exchange (ETDEWEB)

    Dreesen, R.; Vaesen, W.; Maquet, J.M.; Lorenzi, G.; Triplot, G.; Halleux, I. (Institut National des Industries Extractives, Liege (Belgium))

    1991-01-01

    This research project focused on predicting the composition of several coal seams currently being worked in different coal faces, within two neighbouring collieries of the Campine coal field (N.V. Kempense Steenkolenmijnen). This prediction is based on the interpretation of computer-processed geological data collected from the colliery archives or obtained during mining operations. Existing software packages were adapted to process the geological data. The different computer-generated graphic documents have been evaluated for their use as predictive tools in coal mining. It is noted that the reliability of subsurface geological observations is an absolute requirement for optimal underground coal mining operations. 27 refs., 54 figs.

  6. Processing Solutions for Big Data in Astronomy

    Science.gov (United States)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
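
    The MapReduce schema mentioned above can be illustrated in plain Python: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Real frameworks such as Hadoop, Spark or Flink distribute exactly these phases across a cluster; the word-count example below is only a single-machine illustration.

```python
# Single-machine illustration of the MapReduce processing schema (word count).
from collections import defaultdict
from functools import reduce

documents = ["big data needs big clusters", "spark improves on mapreduce"]

def map_phase(doc):
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: reduce(lambda a, b: a + b, values) for key, values in groups.items()}

mapped = [pair for doc in documents for pair in map_phase(doc)]
print(reduce_phase(shuffle(mapped)))   # word counts, e.g. {'big': 2, ...}
```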

  7. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote controls connected to a ground control system through a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV communication module system with which the UAV carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image capturing device for the drone, for areas that need image capturing, and software for loading and managing the smart camera. This system is composed of automatic shooting using the sensor of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the Smart Camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  8. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with an unmanned aerial vehicle (UAV) system has used the UAV and remote controls connected to a ground control system through a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, as mentioned earlier, the existing method of using an RF modem has limitations in long-distance communication. The smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi were used to implement a UAV communication module system with which the UAV carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image capturing device for the drone, for areas that need image capturing, and software for loading and managing the smart camera. This system is composed of automatic shooting using the sensor of the smart camera and shooting catalog management, which manages the captured images and their information. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the Smart Camera as the payload for a photogrammetric UAV system. The open source tools used were Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  9. ProMoIJ: A new tool for automatic three-dimensional analysis of microglial process motility.

    Science.gov (United States)

    Paris, Iñaki; Savage, Julie C; Escobar, Laura; Abiega, Oihane; Gagnon, Steven; Hui, Chin-Wai; Tremblay, Marie-Ève; Sierra, Amanda; Valero, Jorge

    2018-04-01

    Microglia, the immune cells of the central nervous system, continuously survey the brain to detect alterations and maintain tissue homeostasis. The motility of microglial processes is indicative of their surveying capacity in normal and pathological conditions. The gold standard technique to study motility involves the use of two-photon microscopy to obtain time-lapse images from brain slices or the cortex of living animals. This technique generates four dimensionally-coded images which are analyzed manually using time-consuming, non-standardized protocols. Microglial process motility analysis is frequently performed using Z-stack projections with the consequent loss of three-dimensional (3D) information. To overcome these limitations, we developed ProMoIJ, a pack of ImageJ macros that perform automatic motility analysis of cellular processes in 3D. The main core of ProMoIJ is formed by two macros that assist the selection of processes, automatically reconstruct their 3D skeleton, and analyze their motility (process and tip velocity). Our results show that ProMoIJ presents several key advantages compared with conventional manual analysis: (1) reduces the time required for analysis, (2) is less sensitive to experimenter bias, and (3) is more robust to varying numbers of processes analyzed. In addition, we used ProMoIJ to demonstrate that commonly performed 2D analysis underestimates microglial process motility, to reveal that only cells adjacent to a laser injured area extend their processes toward the lesion site, and to demonstrate that systemic inflammation reduces microglial process motility. ProMoIJ is a novel, open-source, freely-available tool which standardizes and accelerates the time-consuming labor of 3D analysis of microglial process motility. © 2017 Wiley Periodicals, Inc.

  10. A Review of Automatic Methods Based on Image Processing Techniques for Tuberculosis Detection from Microscopic Sputum Smear Images.

    Science.gov (United States)

    Panicker, Rani Oomman; Soman, Biju; Saini, Gagan; Rajan, Jeny

    2016-01-01

    Tuberculosis (TB) is an infectious disease caused by the bacteria Mycobacterium tuberculosis. It primarily affects the lungs, but it can also affect other parts of the body. TB remains one of the leading causes of death in developing countries, and its recent resurgences in both developed and developing countries warrant global attention. The number of deaths due to TB is very high (as per the WHO report, 1.5 million died in 2013), although most are preventable if diagnosed early and treated. There are many tools for TB detection, but the most widely used one is sputum smear microscopy. It is done manually and is often time consuming; a laboratory technician is expected to spend at least 15 min per slide, limiting the number of slides that can be screened. Many countries, including India, have a dearth of properly trained technicians, and they often fail to detect TB cases due to the stress of a heavy workload. Automatic methods are generally considered as a solution to this problem. Attempts have been made to develop automatic approaches to identify TB bacteria from microscopic sputum smear images. In this paper, we provide a review of automatic methods based on image processing techniques published between 1998 and 2014. The review shows that the accuracy of algorithms for the automatic detection of TB increased significantly over the years and gladly acknowledges that commercial products based on published works also started appearing in the market. This review could be useful to researchers and practitioners working in the field of TB automation, providing a comprehensive and accessible overview of methods of this field of research.

  11. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    Science.gov (United States)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
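
    The core of the VIP reduction, PCA modelling and subtraction of the stellar point spread function, can be sketched in a few lines of NumPy. The random cube below is a stand-in for real NIRC2 frames; VIP itself adds derotation, frame selection and many refinements around this step.

```python
# Bare-bones sketch of PCA-based PSF subtraction: build a low-rank model of the
# stellar halo from a stack of frames and subtract it, leaving residuals in
# which a faint companion could survive.
import numpy as np

def pca_psf_subtract(cube, n_components=5):
    """cube: (n_frames, ny, nx) stack of coronagraphic images."""
    n, ny, nx = cube.shape
    flat = cube.reshape(n, -1)
    mean = flat.mean(axis=0)
    centered = flat - mean
    # Principal components of the frame-to-frame stellar halo variations.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    model = centered @ basis.T @ basis + mean
    return (flat - model).reshape(n, ny, nx)

cube = np.random.default_rng(0).normal(size=(40, 64, 64))   # stand-in data cube
residuals = pca_psf_subtract(cube, n_components=5)
print(residuals.shape)
```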

  12. Apache Flink: Distributed Stream Data Processing

    CERN Document Server

    Jacobs, Kevin; CERN. Geneva. IT Department

    2016-01-01

    The amount of data is growing significantly over the past few years. Therefore, the need for distributed data processing frameworks is growing. Currently, there are two well-known data processing frameworks with an API for data batches and an API for data streams which are named Apache Flink and Apache Spark. Both Apache Spark and Apache Flink are improving upon the MapReduce implementation of the Apache Hadoop framework. MapReduce is the first programming model for distributed processing on large scale that is available in Apache Hadoop. This report compares the Stream API and the Batch API for both frameworks.

  13. Automatized image processing of bovine blastocysts produced in vitro for quantitative variable determination

    Science.gov (United States)

    Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia

    2017-12-01

    There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.

  14. Reproducibility of wrist home blood pressure measurement with position sensor and automatic data storage

    Science.gov (United States)

    Uen, Sakir; Fimmers, Rolf; Brieger, Miriam; Nickenig, Georg; Mengden, Thomas

    2009-01-01

    Background Wrist blood pressure (BP) devices have physiological limits with regards to accuracy, therefore they were not preferred for home BP monitoring. However some wrist devices have been successfully validated using established validation protocols. Therefore this study assessed the reproducibility of wrist home BP measurement with position sensor and automatic data storage. Methods To compare the reproducibility of three different BP measurement methods: 1) office BP, 2) home BP (Omron wrist device HEM-637 IT with position sensor), 3) 24-hour ambulatory BP (24-h ABPM) (ABPM-04, Meditech, Hun), conventional sphygmomanometric office BP was measured on study days 1 and 7, 24-h ABPM on study days 7 and 14 and home BP between study days 1 and 7 and between study days 8 and 14 in 69 hypertensive and 28 normotensive subjects. The correlation coefficient of each BP measurement method with echocardiographic left ventricular mass index was analyzed. The schedule of home readings was performed according to recently published European Society of Hypertension (ESH) guidelines. Results The reproducibility of home BP measurement analyzed by the standard deviation as well as the squared differences of mean individual differences between the respective BP measurements was significantly higher than the reproducibility of office BP (p ABPM (p ABPM was not significantly different (p = 0.80 systolic BP, p = 0.1 diastolic BP). The correlation coefficient of 24-h ABPM (r = 0.52) with left ventricular mass index was significantly higher than with office BP (r = 0.31). The difference between 24-h ABPM and home BP (r = 0.46) was not significant. Conclusion The short-term reproducibility of home BP measurement with the Omron HEM-637 IT wrist device was superior to the reproducibility of office BP and 24-h ABPM measurement. Furthermore, home BP with the wrist device showed similar correlations to target organ damage as recently reported for upper arm devices. Although wrist devices have

  15. Age effects shrink when motor learning is predominantly supported by nondeclarative, automatic memory processes: evidence from golf putting.

    Science.gov (United States)

    Chauvel, Guillaume; Maquestiaux, François; Hartley, Alan A; Joubert, Sven; Didierjean, André; Masters, Rich S W

    2012-01-01

    Can motor learning be equivalent in younger and older adults? To address this question, 48 younger (M = 23.5 years) and 48 older (M = 65.0 years) participants learned to perform a golf-putting task in two different motor learning situations: one that resulted in infrequent errors or one that resulted in frequent errors. The results demonstrated that infrequent-error learning predominantly relied on nondeclarative, automatic memory processes whereas frequent-error learning predominantly relied on declarative, effortful memory processes: After learning, infrequent-error learners verbalized fewer strategies than frequent-error learners; at transfer, a concurrent, attention-demanding secondary task (tone counting) left motor performance of infrequent-error learners unaffected but impaired that of frequent-error learners. The results showed age-equivalent motor performance in infrequent-error learning but age deficits in frequent-error learning. Motor performance of frequent-error learners required more attention with age, as evidenced by an age deficit on the attention-demanding secondary task. The disappearance of age effects when nondeclarative, automatic memory processes predominated suggests that these processes are preserved with age and are available even early in motor learning.

  16. ACRF Data Collection and Processing Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, M; Egan, D

    2004-12-01

    We present a description of the data flow from measurement to long-term archive. We also discuss data communications infrastructure. The data handling processes presented include collection, transfer, ingest, quality control, creation of Value-Added Products (VAP), and data archiving.

  17. 12 CFR 7.5006 - Data processing.

    Science.gov (United States)

    2010-01-01

    ... Electronic Activities § 7.5006 Data processing. (a) Eligible activities. It is part of the business of... services, facilities (including equipment, technology, and personnel), data bases, advice and access to such services, facilities, data bases and advice, for itself and for others, where the data is banking...

  18. A dynamically reconfigurable data stream processing system

    Energy Technology Data Exchange (ETDEWEB)

    Nogiec, J.M.; Trombly-Freytag, K.; /Fermilab

    2004-11-01

    This paper describes a component-based framework for data stream processing that allows for configuration, tailoring, and runtime system reconfiguration. The system's architecture is based on a pipes and filters pattern, where data is passed through routes between components. A network of pipes and filters can be dynamically reconfigured in response to a preplanned sequence of processing steps, operator intervention, or a change in one or more data streams. This framework provides several mechanisms supporting dynamic reconfiguration and can be used to build static data stream processing applications such as monitoring or data acquisition systems, as well as self-adjusting systems that can adapt their processing algorithm, presentation layer, or data persistency layer in response to changes in input data streams.
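
    A toy pipes-and-filters pipeline that can be rerouted at runtime conveys the idea of the architecture described above. The classes below are invented for illustration and are not the framework's actual components.

```python
# Toy pipes-and-filters pipeline with runtime reconfiguration by rerouting.
class Filter:
    def __init__(self):
        self.outputs = []          # "pipes" to downstream filters
    def connect(self, other):
        self.outputs.append(other)
    def disconnect(self, other):
        self.outputs.remove(other)
    def emit(self, item):
        for out in self.outputs:
            out.process(item)
    def process(self, item):       # override in concrete filters
        self.emit(item)

class Scale(Filter):
    def __init__(self, factor):
        super().__init__()
        self.factor = factor
    def process(self, item):
        self.emit(item * self.factor)

class Printer(Filter):
    def process(self, item):
        print("->", item)

source, scale, printer = Filter(), Scale(2.0), Printer()
source.connect(scale)
scale.connect(printer)
source.process(21)                 # -> 42.0

# Runtime reconfiguration: bypass the scaling stage without stopping the stream.
source.disconnect(scale)
source.connect(printer)
source.process(21)                 # -> 21
```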

  19. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing technique is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in Windows XP/7/8/8.1/10 environment on any personal computer. This complex has been written in the C++ language using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin was compiled as a Qt plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various in-built functions for statistical and mathematical analysis, signal processing functions like a direct smoothing function for the moving average, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method of refinement of spectral albedo parameters using Libradtran and an analytical least squares method. The main advantages of these methods are the high rate of processing (several minutes for 1 GB of data) and the low relative error in albedo retrieval (less than 15%). Also, the software supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic hypercube spectrum comparison by similarity criterion with similar ones from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages
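
    One of the listed signal processing functions, Savitzky-Golay smoothing of a single pixel spectrum, can be reproduced with SciPy as a rough stand-in for the authors' C++/Qt implementation; the noisy spectrum below is synthetic.

```python
# Savitzky-Golay smoothing of a synthetic pixel spectrum, compared with a
# simple moving average of the same window length.
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.linspace(400, 1000, 301)                       # nm
spectrum = np.exp(-((wavelengths - 680) / 60.0) ** 2)           # synthetic albedo feature
noisy = spectrum + np.random.default_rng(0).normal(0, 0.03, spectrum.size)

smoothed = savgol_filter(noisy, window_length=15, polyorder=3)
moving_avg = np.convolve(noisy, np.ones(15) / 15, mode="same")  # plain moving average
print(f"residual RMS, Savitzky-Golay: {np.sqrt(np.mean((smoothed - spectrum)**2)):.4f}")
print(f"residual RMS, moving average: {np.sqrt(np.mean((moving_avg - spectrum)**2)):.4f}")
```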

  20. Using Probe Vehicle Data for Automatic Extraction of Road Traffic Parameters

    Directory of Open Access Journals (Sweden)

    Roman Popescu Maria Alexandra

    2016-12-01

    Full Text Available Through this paper the author aims to study and find solutions for the automatic detection of traffic light positions and for the automatic calculation of the waiting time at traffic lights. The first objective mainly serves the road transportation field, because it removes the need for collaboration with local authorities to establish a national network of traffic lights. The second objective is important not only for companies providing navigation solutions, but especially for authorities, institutions and companies operating road traffic management systems. Real-time dynamic determination of traffic queue length and of waiting time at traffic lights allows the creation of dynamic, intelligent and flexible systems, adapted to actual traffic conditions rather than to generic, theoretical models. Thus, cities can approach the Smart City concept by making road transport faster, more efficient and greener, as promoted in Europe through the Horizon 2020 Smart Cities and Urban Mobility initiative.

  1. Visualization of a newborn's hip joint using 3D ultrasound and automatic image processing

    Science.gov (United States)

    Overhoff, Heinrich M.; Lazovic, Djordje; von Jan, Ute

    1999-05-01

    Graf's method is a successful procedure for the diagnostic screening of developmental dysplasia of the hip. In a defined 2-D ultrasound (US) scan, which virtually cuts the hip joint, landmarks are interactively identified to derive congruence indicators. Because the indicators do not reflect the spatial joint structure, and the femoral head is not clearly visible in the US scan, 3-D US is used here to gain insight into the hip joint in its spatial form. Hip joints of newborns were free-hand scanned using a conventional ultrasound transducer and a localizer system fixed to the scanhead. To overcome examiner-dependent findings, the landmarks were detected by automatic segmentation of the image volume. The landmark image volumes and an automatically determined virtual sphere approximating the femoral head were visualized color-coded on a computer screen. The visualization was found to be intuitive and to simplify diagnosis substantially. By visualizing the 3-D relations between the acetabulum and the femoral head, the reliability of diagnosis is improved because the entire joint geometry is captured.
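    The abstract mentions an automatically determined virtual sphere approximating the femoral head; a common way to obtain such a sphere is a linear least-squares fit to segmented surface points. The sketch below is only an illustrative, assumed formulation, not the authors' implementation:

```python
# Illustrative sketch (not the authors' implementation): fitting a sphere to
# 3-D surface points by linear least squares, one way to approximate a
# femoral head from segmented ultrasound landmarks.

import numpy as np

def fit_sphere(points: np.ndarray):
    """Return (center, radius) of the least-squares sphere through Nx3 points."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic test: noisy points on a sphere of radius 12 mm centred at (5, -3, 20).
rng = np.random.default_rng(0)
directions = rng.normal(size=(500, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
pts = np.array([5.0, -3.0, 20.0]) + 12.0 * directions + rng.normal(scale=0.2, size=(500, 3))
print(fit_sphere(pts))   # centre close to (5, -3, 20), radius close to 12
```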

  2. Processing Contexts for Experimental HEP Data

    Energy Technology Data Exchange (ETDEWEB)

    Paterno, Marc [Fermilab; Green, Chris [Fermilab

    2017-02-06

    This document provides, for those not closely associated with the experimental High Energy Physics (HEP) community, an introduction to data input and output requirements for a variety of data processing tasks. Examples in it are drawn from the art event processing framework, and from experiments and projects using art, most notably the LArSoft and NuTools projects.

  3. Cardiorespiratory concerns shape brain responses during automatic panic-related scene processing in patients with panic disorder

    Science.gov (United States)

    Feldker, Katharina; Heitmann, Carina Yvonne; Neumeister, Paula; Brinkmann, Leonie; Bruchmann, Maximillan; Zwitserlood, Pienie; Straube, Thomas

    2018-01-01

    Background Increased automatic processing of threat-related stimuli has been proposed as a key element in panic disorder. Little is known about the neural basis of automatic processing, in particular of task-irrelevant, panic-related, ecologically valid stimuli, or about the association between brain activation and symptomatology in patients with panic disorder. Methods The present event-related functional MRI (fMRI) study compared brain responses to task-irrelevant, panic-related and neutral visual stimuli in medication-free patients with panic disorder and healthy controls. Panic-related and neutral scenes were presented while participants performed a spatially non-overlapping bar orientation task. Correlation analyses investigated the association between brain responses and panic-related aspects of symptomatology, measured using the Anxiety Sensitivity Index (ASI). Results We included 26 patients with panic disorder and 26 healthy controls in our analysis. Compared with controls, patients with panic disorder showed elevated activation in the amygdala, brainstem, thalamus, insula, anterior cingulate cortex and midcingulate cortex in response to panic-related versus neutral task-irrelevant stimuli. Furthermore, fear of cardiovascular symptoms (a subcomponent of the ASI) was associated with insula activation, whereas fear of respiratory symptoms was associated with brainstem hyperactivation in patients with panic disorder. Limitations The additional implementation of measures of autonomic activation, such as pupil diameter, heart rate, or electrodermal activity, would have been informative during the fMRI scan as well as during the rating procedure. Conclusion Results reveal a neural network involved in the processing of panic-related distractor stimuli in patients with panic disorder and suggest an automatic weighting of panic-related information depending on the magnitude of cardiovascular and respiratory symptoms. Insula and brainstem activations show function

  4. Cardiorespiratory concerns shape brain responses during automatic panic-related scene processing in patients with panic disorder.

    Science.gov (United States)

    Feldker, Katharina; Heitmann, Carina Yvonne; Neumeister, Paula; Brinkmann, Leonie; Bruchmann, Maximillan; Zwitserlood, Pienie; Straube, Thomas

    2018-01-01

    Increased automatic processing of threat-related stimuli has been proposed as a key element in panic disorder. Little is known about the neural basis of automatic processing, in particular of task-irrelevant, panic-related, ecologically valid stimuli, or about the association between brain activation and symptomatology in patients with panic disorder. The present event-related functional MRI (fMRI) study compared brain responses to task-irrelevant, panic-related and neutral visual stimuli in medication-free patients with panic disorder and healthy controls. Panic-related and neutral scenes were presented while participants performed a spatially nonoverlapping bar orientation task. Correlation analyses investigated the association between brain responses and panic-related aspects of symptomatology, measured using the Anxiety Sensitivity Index (ASI). We included 26 patients with panic disorder and 26 healthy controls in our analysis. Compared with controls, patients with panic disorder showed elevated activation in the amygdala, brainstem, thalamus, insula, anterior cingulate cortex and midcingulate cortex in response to panic-related versus neutral task-irrelevant stimuli. Furthermore, fear of cardiovascular symptoms (a subcomponent of the ASI) was associated with insula activation, whereas fear of respiratory symptoms was associated with brainstem hyperactivation in patients with panic disorder. The additional implementation of measures of autonomic activation, such as pupil diameter, heart rate, or electrodermal activity, would have been informative during the fMRI scan as well as during the rating procedure. Results reveal a neural network involved in the processing of panic-related distractor stimuli in patients with panic disorder and suggest an automatic weighting of panic-related information depending on the magnitude of cardiovascular and respiratory symptoms. Insula and brainstem activations show function-related associations with specific components of

  5. Process mining data science in action

    CERN Document Server

    van der Aalst, Wil

    2016-01-01

    The first to cover this missing link between data mining and process modeling, this book provides real-world techniques for monitoring and analyzing processes in real time. It is a powerful new tool destined to play a key role in business process management.

  6. Automatic multi-modal intelligent seizure acquisition (MISA) system for detection of motor seizures from electromyographic data and motion data

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sándor; Wolf, Peter

    2012-01-01

    The objective is to develop a non-invasive automatic method for detection of epileptic seizures with motor manifestations. Ten healthy subjects who simulated seizures and one patient participated in the study. Surface electromyography (sEMG) and motion sensor features were extracted as energy...

  7. Integration of knowledge to support automatic object reconstruction from images and 3D data

    International Nuclear Information System (INIS)

    Boochs, F.; Truong, H; Marbs, A.; Karmacharya, A.; Cruz, C.; Habed, A.; Nicolle, C.; Voisin, Y.

    2011-01-01

    Object reconstruction is an important task in many fields of application, as it allows digital representations of our physical world to be generated and used as a basis for analysis, planning, construction, visualization or other aims. A reconstruction is normally based on reliable data (images or 3D point clouds, for example) capturing the object in its full extent. This data then has to be compiled and analyzed in order to extract all the necessary geometrical elements, which represent the object and form a digital copy of it. Traditional strategies are largely based on manual interaction and interpretation, because with increasing object complexity human understanding is indispensable for achieving acceptable and reliable results. But human interaction is time consuming and expensive, which is why much research has already been invested in integrating algorithmic support, allowing the process to be sped up and the manual workload reduced. Presently, most such algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. By means of these models, which normally represent geometrical features (flatness or roughness, for example) or physical features (color, texture), the data is classified and analyzed. This is successful for objects of limited complexity, but reaches its limits as object complexity increases; purely numerical strategies are then not able to model reality sufficiently. Therefore, the intention of our approach is to take human cognitive strategy as an example and to simulate extraction processes based on available knowledge about the objects of interest. Such processes introduce a semantic structure for the objects and guide the algorithms used to detect and recognize objects, yielding higher effectiveness. Hence, our research proposes an approach that uses knowledge to guide the algorithms in 3D point cloud and image processing.

  8. Organization of film data processing in the PPI-SA automated system

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Perekatov, V.G.

    1984-01-01

    The organization of processing nuclear interaction images on PUOS-type standard devices using the PPI-SA automated system is considered. The system is made in the form of a complete module comprising two scanning measuring projectors and a scanning automatic device, which operate in real time on line with the BESM-4 computer. The system comprises: a subsystem for photographic film scanning, selection of events for measurement and preliminary encoding; a subsystem for the formation and generation of libraries with the data required for monitoring the scanning automatic device; and a subsystem for precision measurement of separate coordinates on photographic images of nuclear particle tracks and of ionization losses. The system software comprises monitoring programs for the projectors and the scanning automatic device, as well as test and functional control programs and the operating system. The programs are organized on a modular basis. By changing the set of modules, the system can be modified and adapted for image processing in different fields of science and technology.

  9. Controlling the COD removal of an A-stage pilot study with instrumentation and automatic process control.

    Science.gov (United States)

    Miller, Mark W; Elliott, Matt; DeArmond, Jon; Kinyua, Maureen; Wett, Bernhard; Murthy, Sudhir; Bott, Charles B

    2017-06-01

    The pursuit of fully autotrophic nitrogen removal via the anaerobic ammonium oxidation (anammox) pathway has led to an increased interest in carbon removal technologies, particularly the A-stage of the adsorption/bio-oxidation (A/B) process. The high-rate operation of the A-stage and the lack of automatic process control often result in wide variations of chemical oxygen demand (COD) removal that can ultimately impact nitrogen removal in the downstream B-stage process. This study evaluated the use of dissolved oxygen (DO)- and mixed liquor suspended solids (MLSS)-based automatic control strategies through in situ on-line sensors in the A-stage of an A/B pilot study. The objective of using these control strategies was to reduce the variability of COD removal by the A-stage and thus the variability of the effluent C/N. The use of cascade DO control in the A-stage did not impact COD removal at the conditions tested in this study, likely because the bulk DO concentration (>0.5 mg/L) was maintained above the half-saturation coefficient of heterotrophic organisms for DO. MLSS-based solids retention time (SRT) control, where MLSS was used as a surrogate for SRT, did not significantly reduce the effluent C/N variability, but it was able to reduce the variation in COD removal in the A-stage by 90%.

  10. Data processing device for computed tomography system

    International Nuclear Information System (INIS)

    Nakayama, N.; Ito, Y.; Iwata, K.; Nishihara, E.; Shibayama, S.

    1984-01-01

    A data processing device applied to a computed tomography system which examines a living body using X-ray radiation is disclosed. The X-rays that have penetrated the living body are converted into electric signals in a detecting section. The electric signals are acquired and converted from analog to digital form in a data acquisition section, and then supplied to a matrix data-generating section included in the data processing device. This matrix data-generating section generates matrix data corresponding to a plurality of projection data. These matrix data are supplied to a partial sum-producing section, where the partial sums corresponding to groups of the matrix data are calculated and then supplied to an accumulation section. In the accumulation section, the final value corresponding to the total sum of the matrix data is calculated, whereby the calculation for image reconstruction is performed.

  11. Rapid, semi-automatic fracture and contact mapping for point clouds, images and geophysical data

    Science.gov (United States)

    Thiele, Samuel T.; Grose, Lachlan; Samsu, Anindita; Micklethwaite, Steven; Vollgger, Stefan A.; Cruden, Alexander R.

    2017-12-01

    The advent of large digital datasets from unmanned aerial vehicle (UAV) and satellite platforms now challenges our ability to extract information across multiple scales in a timely manner, often meaning that the full value of the data is not realised. Here we adapt a least-cost-path solver and specially tailored cost functions to rapidly interpolate structural features between manually defined control points in point cloud and raster datasets. We implement the method in the geographic information system QGIS and the point cloud and mesh processing software CloudCompare. Using these implementations, the method can be applied to a variety of three-dimensional (3-D) and two-dimensional (2-D) datasets, including high-resolution aerial imagery, digital outcrop models, digital elevation models (DEMs) and geophysical grids. We demonstrate the algorithm with four diverse applications in which we extract (1) joint and contact patterns in high-resolution orthophotographs, (2) fracture patterns in a dense 3-D point cloud, (3) earthquake surface ruptures of the Greendale Fault associated with the Mw7.1 Darfield earthquake (New Zealand) from high-resolution light detection and ranging (lidar) data, and (4) oceanic fracture zones from bathymetric data of the North Atlantic. The approach improves the consistency of the interpretation process while retaining expert guidance and achieves significant improvements (35-65 %) in digitisation time compared to traditional methods. Furthermore, it opens up new possibilities for data synthesis and can quantify the agreement between datasets and an interpretation.
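    The core of the method is a least-cost-path search over a cost raster between user-picked control points. As a hedged illustration (not the QGIS/CloudCompare plugins themselves, and with a toy cost function), a plain Dijkstra search over an 8-connected grid might look like this:

```python
# Illustrative sketch (not the QGIS/CloudCompare plugins described above):
# interpolating a structural trace between two user-picked control points by
# a least-cost path over a cost raster, using a plain Dijkstra search.

import heapq
import numpy as np

def least_cost_path(cost, start, end):
    """Return the list of (row, col) cells of the cheapest 8-connected path."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + cost[nr, nc]
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy cost raster: cheap (low-cost) pixels mark a lineament the path should follow.
cost = np.ones((50, 50))
cost[np.arange(50), np.arange(50)] = 0.05          # cheap diagonal "fracture"
trace = least_cost_path(cost, (0, 0), (49, 49))
print(len(trace), trace[:3], trace[-3:])
```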

  12. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    Science.gov (United States)

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be
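    Because PathVisioRPC exposes its functionality over XML-RPC, it can in principle be reached from Python with the standard xmlrpc.client module. The endpoint, port and method name below are hypothetical placeholders chosen for illustration; the actual method names and arguments are defined by the PathVisioRPC documentation:

```python
# Illustrative sketch only: calling an XML-RPC service from Python, in the
# spirit of the PathVisioRPC interface described above. The server URL, port
# and method name used here are hypothetical placeholders, not the documented
# PathVisioRPC API.

import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://localhost:7777")   # hypothetical endpoint

# Hypothetical call: visualise an expression table on a pathway file and
# export the result as an image. Real method names are defined by PathVisioRPC.
result = server.visualiseData("data.txt", "pathway.gpml", "output.png")
print(result)
```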

  13. Statistical processing of technological and radiochemical data

    International Nuclear Information System (INIS)

    Lahodova, Zdena; Vonkova, Kateřina

    2011-01-01

    The project described in this article had two goals. The main goal was to compare technological and radiochemical data from two units of nuclear power plant. The other goal was to check the collection, organization and interpretation of routinely measured data. Monitoring of analytical and radiochemical data is a very valuable source of knowledge for some processes in the primary circuit. Exploratory analysis of one-dimensional data was performed to estimate location and variability and to find extreme values, data trends, distribution, autocorrelation etc. This process allowed for the cleaning and completion of raw data. Then multiple analyses such as multiple comparisons, multiple correlation, variance analysis, and so on were performed. Measured data was organized into a data matrix. The results and graphs such as Box plots, Mahalanobis distance, Biplot, Correlation, and Trend graphs are presented in this article as statistical analysis tools. Tables of data were replaced with graphs because graphs condense large amounts of information into easy-to-understand formats. The significant conclusion of this work is that the collection and comprehension of data is a very substantial part of statistical processing. With well-prepared and well-understood data, its accurate evaluation is possible. Cooperation between the technicians who collect data and the statistician who processes it is also very important. (author)
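    One of the exploratory tools named above, the Mahalanobis distance, can be used to screen multivariate process measurements for outliers. The following is a generic illustration on synthetic data, not the plant's actual analysis:

```python
# Illustrative sketch (not the plant's actual analysis): screening multivariate
# process measurements for outliers with the Mahalanobis distance, one of the
# exploratory tools named in the abstract.

import numpy as np

def mahalanobis_distances(X: np.ndarray) -> np.ndarray:
    """Distance of each row of X from the sample mean, scaled by the covariance."""
    mean = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mean
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

rng = np.random.default_rng(1)
data = rng.normal(size=(200, 3))       # e.g. three correlated chemistry parameters
data[0] = [6.0, -6.0, 6.0]             # one gross outlier
d = mahalanobis_distances(data)
print(d[0], d[1:].mean())              # the outlier has a much larger distance
```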

  14. Evaluation and processing of nuclear data

    International Nuclear Information System (INIS)

    Pearlstein, S.

    1980-01-01

    The role a nuclear data evaluator plays in obtaining evaluated nuclear data, needed for applications, from measured nuclear data is surveyed. Specific evaluation objectives, problems, and procedures are discussed. The use of nuclear systematics to complement nuclear experiment and theory is described. With the Evaluated Nuclear Data File (ENDF) as an example, the formatting, checking, and processing of nuclear data are discussed as well as the testing of evaluated nuclear data in the calculation of integral benchmark experiments. Other important topics such as the Probability Table Method and interrelation between differential and integral data are also discussed. 25 figures

  15. Automatic analysis of online image data for law enforcement agencies by concept detection and instance search

    Science.gov (United States)

    de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan

    2017-10-01

    The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.

  16. Automatic classification techniques for type of sediment map from multibeam sonar data

    Science.gov (United States)

    Zakariya, R.; Abdullah, M. A.; Che Hasan, R.; Khalil, I.

    2018-02-01

    A sediment map can be important information for various applications such as oil drilling and environmental and pollution studies. A study on sediment mapping was conducted at a natural reef (rock) in Pulau Payar using Sound Navigation and Ranging (SONAR) technology, namely the R2-Sonic multibeam echosounder. This study aims to determine sediment type by obtaining backscatter and bathymetry data from the multibeam echosounder. Ground truth data were used to verify the classification produced. The methods used to analyze the ground truth samples were particle size analysis (PSA) and dry sieving; different analyses were carried out because of the different grain sizes of the sediment samples obtained. Finer sediments were analyzed by PSA using a CILAS instrument, while coarser sediments were analyzed by sieving. The acquired multibeam backscatter strength and bathymetry data were processed using QINSy, Qimera, and ArcGIS. This study shows the capability of multibeam data to differentiate four types of sediment: i) very coarse sand, ii) coarse sand, iii) very coarse silt, and iv) coarse silt. The classification accuracy was 92.31% overall, with a kappa coefficient of 0.88.
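    The two agreement measures quoted above, overall accuracy and the kappa coefficient, are both derived from a confusion matrix of predicted versus ground-truth classes. A small illustrative computation (with a hypothetical four-class matrix) is:

```python
# Illustrative sketch: computing the two agreement measures quoted in the
# abstract (overall accuracy and Cohen's kappa) from a confusion matrix of
# predicted versus ground-truth sediment classes.

import numpy as np

def overall_accuracy_and_kappa(confusion: np.ndarray):
    total = confusion.sum()
    observed = np.trace(confusion) / total                        # overall accuracy
    expected = (confusion.sum(0) @ confusion.sum(1)) / total ** 2  # chance agreement
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Hypothetical 4-class confusion matrix (rows: ground truth, columns: prediction).
cm = np.array([[12, 1, 0, 0],
               [ 0, 9, 1, 0],
               [ 0, 0, 8, 1],
               [ 0, 0, 0, 7]])
acc, kappa = overall_accuracy_and_kappa(cm)
print(f"overall accuracy = {acc:.4f}, kappa = {kappa:.2f}")
```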

  17. Automatic detection of ionospheric Alfvén resonances using signal and image processing techniques

    Directory of Open Access Journals (Sweden)

    C. D. Beggan

    2014-08-01

    Full Text Available Induction coils permit the measurement of small and very rapid changes of the magnetic field. A new set of induction coils in the UK (at L = 3.2) record magnetic field changes over an effective frequency range of 0.1–40 Hz, encompassing phenomena such as the Schumann resonances, magnetospheric pulsations and ionospheric Alfvén resonances (IARs). The IARs typically manifest themselves as a series of spectral resonance structures (SRSs) within the 1–10 Hz frequency range, usually appearing as fine bands or fringes in spectrogram plots and occurring almost daily during local night-time, disappearing during the daylight hours. The behaviour of the occurrence in frequency (f) and the difference in frequency between fringes (Δf) varies throughout the year. In order to quantify the daily, seasonal and annual changes of the SRSs, we developed a new method based on signal and image processing techniques to identify the fringes and to quantify the values of f, Δf and other relevant parameters in the data set. The technique is relatively robust to noise though requires tuning of threshold parameters. We analyse 18 months of induction coil data to demonstrate the utility of the method.
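    The detection method starts from spectrograms in which the SRS fringes appear in the 1–10 Hz band. As a hedged sketch of that first step only (synthetic data, an assumed sampling rate, and not the authors' image-processing stage):

```python
# Illustrative sketch (not the authors' detection pipeline): computing a
# spectrogram of induction-coil data so that spectral resonance structures in
# the 1-10 Hz band would appear as fringes, the starting point for detection.

import numpy as np
from scipy.signal import spectrogram

fs = 80.0                                     # assumed sampling rate in Hz
t = np.arange(0, 3600, 1 / fs)                # one hour of synthetic data
signal = np.random.normal(size=t.size)        # stand-in for coil measurements
for f0 in (2.0, 3.5, 5.0, 6.5):               # fake fringe-like narrowband lines
    signal += 0.5 * np.sin(2 * np.pi * f0 * t)

freqs, times, Sxx = spectrogram(signal, fs=fs, nperseg=4096, noverlap=2048)
band = (freqs >= 1.0) & (freqs <= 10.0)       # restrict to the IAR band
print(Sxx[band].shape)                        # power in the 1-10 Hz band over time
```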

  18. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis.

    Science.gov (United States)

    Toledo, Cíntia Matsuda; Cunha, Andre; Scarton, Carolina; Aluísio, Sandra

    2014-01-01

    Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo 18 were used, which included 200 healthy Brazilians of both genders. A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.
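    The recommended classifier, an SVM with an RBF kernel, can be illustrated with a generic scikit-learn pipeline on synthetic feature vectors; this is only a sketch of the model type, not the study's actual features or data:

```python
# Illustrative sketch (not the study's pipeline): a binary SVM classifier with
# an RBF kernel, the model the abstract recommends, trained on generic
# numeric feature vectors such as those produced by text-analysis tools.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 12))                      # 12 linguistic features per text
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)       # synthetic binary education label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```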

  19. Automatic classification of written descriptions by healthy adults: An overview of the application of natural language processing and machine learning techniques to clinical discourse analysis

    Directory of Open Access Journals (Sweden)

    Cíntia Matsuda Toledo

    Full Text Available Discourse production is an important aspect in the evaluation of brain-injured individuals. We believe that studies comparing the performance of brain-injured subjects with that of healthy controls must use groups with compatible education. A pioneering application of machine learning methods using Brazilian Portuguese for clinical purposes is described, highlighting education as an important variable in the Brazilian scenario. OBJECTIVE: The aims were to describe how to: (i) develop machine learning classifiers using features generated by natural language processing tools to distinguish descriptions produced by healthy individuals into classes based on their years of education; and (ii) automatically identify the features that best distinguish the groups. METHODS: The approach proposed here extracts linguistic features automatically from the written descriptions with the aid of two Natural Language Processing tools: Coh-Metrix-Port and AIC. It also includes nine task-specific features (three new ones, two extracted manually, besides description time; type of scene described - simple or complex; presentation order - which type of picture was described first; and age). In this study, the descriptions by 144 of the subjects studied in Toledo18 were used, which included 200 healthy Brazilians of both genders. RESULTS AND CONCLUSION: A Support Vector Machine (SVM) with a radial basis function (RBF) kernel is the most recommended approach for the binary classification of our data, classifying three of the four initial classes. CfsSubsetEval (CFS) is a strong candidate to replace manual feature selection methods.

  20. NPOESS Interface Data Processing Segment (IDPS) Hardware

    Science.gov (United States)

    Sullivan, W. J.; Grant, K. D.; Bergeron, C.

    2008-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The NPOESS design allows centralized mission management and delivers high quality environmental products to military, civil and scientific users. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. IDPS processes NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. IDPS will process environmental data products beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. Within the overall NPOESS processing environment, the IDPS must process a data volume several orders of magnitude the size of current systems -- in one-quarter of the time. Further, it must support the calibration, validation, and data quality improvement initiatives of the NPOESS program to ensure the production of atmospheric and environmental products that meet strict requirements for accuracy and precision. This poster will illustrate and describe the IDPS HW architecture that is necessary to meet these challenging design requirements. In addition, it will illustrate the expandability features of the architecture in support of future data processing and data distribution needs.

  1. Field practice, data acquisition and processing

    International Nuclear Information System (INIS)

    Hignett, C.T.; Cunningham, R.B.; Dunin, F.X.

    1981-01-01

    Suggestions based on current practice in the field use of the neutron meter are presented with a view to minimising the risk of error. Recommended procedures for the collection, processing and analysis of data are outlined

  2. Rapid, semi-automatic fracture and contact mapping for point clouds, images and geophysical data

    Directory of Open Access Journals (Sweden)

    S. T. Thiele

    2017-12-01

    Full Text Available The advent of large digital datasets from unmanned aerial vehicle (UAV) and satellite platforms now challenges our ability to extract information across multiple scales in a timely manner, often meaning that the full value of the data is not realised. Here we adapt a least-cost-path solver and specially tailored cost functions to rapidly interpolate structural features between manually defined control points in point cloud and raster datasets. We implement the method in the geographic information system QGIS and the point cloud and mesh processing software CloudCompare. Using these implementations, the method can be applied to a variety of three-dimensional (3-D) and two-dimensional (2-D) datasets, including high-resolution aerial imagery, digital outcrop models, digital elevation models (DEMs) and geophysical grids. We demonstrate the algorithm with four diverse applications in which we extract (1) joint and contact patterns in high-resolution orthophotographs, (2) fracture patterns in a dense 3-D point cloud, (3) earthquake surface ruptures of the Greendale Fault associated with the Mw7.1 Darfield earthquake (New Zealand) from high-resolution light detection and ranging (lidar) data, and (4) oceanic fracture zones from bathymetric data of the North Atlantic. The approach improves the consistency of the interpretation process while retaining expert guidance and achieves significant improvements (35–65 %) in digitisation time compared to traditional methods. Furthermore, it opens up new possibilities for data synthesis and can quantify the agreement between datasets and an interpretation.

  3. USING A DIGITAL VIDEO CAMERA AS THE SMART SENSOR OF THE SYSTEM FOR AUTOMATIC PROCESS CONTROL OF GRANULAR FODDER MOLDING

    Directory of Open Access Journals (Sweden)

    M. M. Blagoveshchenskaya

    2014-01-01

    Full Text Available Summary. The most important operation in granular mixed fodder production is the molding process. The properties of the granular mixed fodder are defined during this process, and they determine both the production process and final product quality. The possibility of using a digital video camera as an intelligent sensor for the production process control system is analyzed in the article. A parametric model of the process of molding bundles from the granular fodder mass is presented. The dynamic characteristics of the molding process were determined, and a mathematical model of the motion of a bundle of granular fodder mass after the matrix holes was developed. A mathematical model of the automatic control system (ACS), implemented in the MATLAB environment and using a reference (etalon) video frame as the set point, is presented. As a parameter of the bundle molding process, it is proposed to use the value of the specific area determined during mathematical processing of the video frame. Algorithms were developed to determine changes in the structural and mechanical properties of the feed mass from the video frame images. Digital video of various operating modes of the molding machine was recorded, and after mathematical processing of the video, the transfer functions relating changes in the specific area to the adjustable parameters were determined. Structural and functional diagrams of the system for regulating the fodder bundle molding process with the use of digital video cameras were built and analyzed. Based on the solution of the equations of fluid dynamics, a mathematical model of bundle motion after leaving the matrix holes was obtained; in addition to viscosity, the creep behaviour characteristic of the feed mass was considered. The mathematical model of the ACS for the bundle molding process, allowing the investigation of transient processes that occur in a control system using a digital video camera as the smart sensor, was developed in Simulink

  4. USING AFFORDABLE DATA CAPTURING DEVICES FOR AUTOMATIC 3D CITY MODELLING

    Directory of Open Access Journals (Sweden)

    B. Alizadehashrafi

    2017-11-01

    Full Text Available In this research project, many videos of UTM Kolej 9, Skudai, Johor Bahru (see Figure 1) were taken with an AR.Drone 2. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the frames extracted from the videos while flying. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, have been tested to create point clouds and a mesh along with 3D models and textures. As the result was not acceptable (see Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is about 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  5. Using Acceleration Data to Automatically Detect the Onset of Farrowing in Sows

    Directory of Open Access Journals (Sweden)

    Imke Traulsen

    2018-01-01

    Full Text Available The aim of the present study was to automatically predict the onset of farrowing in crate-confined sows. (1) Background: Automatic tools are appropriate for supporting animal surveillance under practical farming conditions. (2) Methods: In three batches, sows in one farrowing compartment of the Futterkamp research farm were equipped with an ear sensor to sample acceleration; video recordings of the sows were used as a reference. Classical CUSUM charts using different acceleration indices based on various distribution characteristics were compared in several scenarios. (3) Results: The increase in activity, mainly due to nest-building behavior before the onset of farrowing, could be detected with the sow-individual CUSUM chart. The best performance required a statistical distribution characteristic that represented fluctuations in the signal (for example, the 1st variation) combined with a transformation of this parameter by cumulating differences in the signal within certain time periods from one day to another. With this transformed signal, farrowing sows could be reliably detected. For 100% or 85% of the sows, an alarm was given within 48 or 12 h before the onset of farrowing, respectively. (4) Conclusions: Acceleration measurements in the ear of a sow are suitable for detecting the onset of farrowing in individually housed sows in commercial farrowing crates.
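    The detector is built around a CUSUM chart on an acceleration-derived activity index. A minimal one-sided tabular CUSUM, shown below on synthetic data, illustrates the idea; the thresholds and the index itself are placeholders, not the study's tuned parameters:

```python
# Illustrative sketch (not the study's exact detector): a one-sided tabular
# CUSUM chart on an activity index, raising an alarm when the cumulative
# positive deviation from the in-control mean exceeds a decision limit.

import numpy as np

def cusum_alarm(x, target, k, h):
    """Return the index of the first alarm (or None) of a one-sided CUSUM."""
    s = 0.0
    for i, value in enumerate(x):
        s = max(0.0, s + value - target - k)   # accumulate excess activity
        if s > h:
            return i
    return None

rng = np.random.default_rng(3)
activity = rng.normal(loc=1.0, scale=0.3, size=200)    # baseline activity index
activity[150:] += 0.8                                   # nest-building-like increase
print(cusum_alarm(activity, target=1.0, k=0.2, h=3.0))  # alarm shortly after sample 150
```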

  6. Using Affordable Data Capturing Devices for Automatic 3d City Modelling

    Science.gov (United States)

    Alizadehashrafi, B.; Abdul-Rahman, A.

    2017-11-01

    In this research project, many videos of UTM Kolej 9, Skudai, Johor Bahru (See Figure 1) were taken with an AR.Drone 2. Since the AR.Drone 2.0 has a liquid lens, there were significant distortions and deformations in the frames extracted from the videos while flying. Passive remote sensing (RS) applications based on image matching and epipolar lines, such as Agisoft PhotoScan, have been tested to create point clouds and a mesh along with 3D models and textures. As the result was not acceptable (See Figure 2), the previous Dynamic Pulse Function, based on the Ruby programming language, was enhanced and utilized to create the 3D models automatically in LoD3. The accuracy of the final 3D model is about 10 to 20 cm. After rectification and parallel projection of the photos based on some tie points and targets, all the parameters were measured and used as input to the system to create the 3D model automatically in LoD3 with very high accuracy.

  7. Integrating Data Sources for Process Sustainability ...

    Science.gov (United States)

    To perform a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of the sustainability assessment, physical properties of these chemicals, equipment inventory, as well as health, environmental, and safety properties of the chemicals. Much of this data are currently available to the process engineer either from the process design in the chemical process simulation software or online through chemical property and environmental, health, and safety databases. Examples of these databases include the U.S. Environmental Protection Agency’s (USEPA’s) Aggregated Computational Toxicology Resource (ACToR), National Institute for Occupational Safety and Health’s (NIOSH’s) Hazardous Substance Database (HSDB), and National Institute of Standards and Technology’s (NIST’s) Chemistry Webbook. This presentation will provide methods and procedures for extracting chemical identity and flow information from process design tools (such as chemical process simulators) and chemical property information from the online databases. The presentation will also demonstrate acquisition and compilation of the data for use in the EPA’s GREENSCOPE process sustainability analysis tool. This presentation discusses acquisition of data for use in rapid LCI development.

  8. Interactive data-processing system for metallurgy

    Science.gov (United States)

    Rathz, T. J.

    1978-01-01

    Evaluation of the equipment indicates that the system can rapidly and accurately process metallurgical and materials-processing data for a wide range of applications. Advantages include increased contrast between areas of an image, the ability to analyze images via operator-written programs, and space available for storing images.

  9. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework with a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  10. Automatic adjustment of cycle length and aeration time for improved nitrogen removal in an alternating activated sludge process

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard

    1997-01-01

    The paper examines the nitrogen dynamics in the alternating BIODENITRO and BIODENIPHO processes with a focus on two control handles influencing flow scheduling and aeration: the cycle length and the ammonia concentration at which a nitrifying period is terminated. A steady state analysis examining the combined effect of both control handles reveals three conditions which characterize when nitrogen removal is maximized. A dynamic analysis shows that these conditions also apply for a changing ammonia load. This then allows an automatic control strategy for maximizing nitrogen removal to be based on a cycle by cycle single variable optimization rather than a more difficult multivariable optimization over a longer time horizon. Copyright (C) 1996 IAWQ.

  11. Automatic evaluation of the X-ray film qualities by means of the image processing. 1. Evaluation based upon the density distribution

    International Nuclear Information System (INIS)

    Kobayashi, Nobuo

    1995-01-01

    Automatic evaluation of image quality is attempted for X-ray films by means of image processing. The basic structures of the image are investigated through color decomposition of the image. Based upon the density distributions, the position and the sensitivity of the penetrameter and the density of the image qualifier are calculated automatically; the results show good agreement with those of the manual measurements. (author)

  12. NPOESS Interface Data Processing Segment Product Generation

    Science.gov (United States)

    Grant, K. D.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The NPOESS design allows centralized mission management and delivers high quality environmental products to military, civil and scientific users. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The IDPS will process environmental data products beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. Within the overall NPOESS processing environment, the IDPS must process a data volume nearly 1000 times the size of current systems -- in one-quarter of the time. Further, it must support the calibration, validation, and data quality improvement initiatives of the NPOESS program to ensure the production of atmospheric and environmental products that meet strict requirements for accuracy and precision. This paper will describe the architecture approach that is necessary to meet these challenging, and seemingly exclusive, NPOESS IDPS design requirements, with a focus on the processing relationships required to generate the NPP products.

  13. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear process control. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication, providing 'jointly verifiable' data, and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.

  14. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available These days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present issue explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is what makes Big Data and Hadoop so powerful, fast, and versatile for business process optimization. MapReduce is a programming model, with an accompanying implementation, built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data allow organizations to reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
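    The MapReduce model discussed above can be sketched in a few lines: independent map tasks emit key-value pairs, a shuffle groups values by key, and reduce tasks aggregate each group. The single-process word count below is only a didactic illustration, not Hadoop itself:

```python
# Illustrative sketch of the MapReduce model discussed above: a word count
# expressed as independent map tasks, a shuffle that groups values by key,
# and reduce tasks that aggregate each group (here all run in one process).

from collections import defaultdict
from functools import reduce

def map_task(document):
    """Emit (word, 1) pairs for one input document."""
    return [(word.lower(), 1) for word in document.split()]

def shuffle(mapped_pairs):
    """Group all values by key, as the framework would between map and reduce."""
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_task(key, values):
    """Aggregate the list of counts for one key."""
    return key, reduce(lambda a, b: a + b, values)

documents = ["big data needs big processing", "hadoop runs mapreduce jobs"]
mapped = [pair for doc in documents for pair in map_task(doc)]
result = dict(reduce_task(k, v) for k, v in shuffle(mapped).items())
print(result["big"], result["hadoop"])   # 2 1
```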

  15. Hybrid processing of laser scanning data

    Science.gov (United States)

    Badenko, Vladimir; Zotov, Dmitry; Fedotov, Alexander

    2018-03-01

    This article analyses gaps in the processing of raw laser scanning data and presents the results of bridging the gaps identified, based on the use of laser scanning data for historic building information modeling. The results of developing a unified hybrid technology for the processing, storage, access and visualization of combined laser scanning and photographic data about historical buildings are analyzed. The first application of the technology, to the historical building of St. Petersburg Polytechnic University, demonstrates the reliability of the proposed approach.

  16. Data Refining for Text Mining Process in Aviation Safety Data

    Science.gov (United States)

    Sjöblom, Olli

    Successful data mining is an iterative process during which the data are refined and adjusted to achieve more accurate mining results. The most important tools in the text mining context are lists of stop words and lists of synonyms. The size and richness of these lists depend on the structure of the language used in the text to be mined. English, for example, is an "easy" language for search technologies because, with a couple of exceptions, the stem of a word is not conjugated and terms are formed using several words instead of creating compounds. Morphologically rich languages like Finnish therefore require special attention to these definitions. This chapter introduces the need for, and realisation of, refining the source data for a successful data mining process, based on the results achieved from the first mining round.
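    The two refinement tools named above, stop-word lists and synonym lists, amount to a simple normalisation pass over the tokens before the next mining round. A minimal sketch with tiny placeholder lists (not a real aviation-safety lexicon) is:

```python
# Illustrative sketch of the two refinement tools named above: removing stop
# words and collapsing synonyms to a canonical term before a new mining round.
# The word lists here are tiny placeholders, not a real aviation-safety lexicon.

STOP_WORDS = {"the", "a", "of", "was", "during", "and"}
SYNONYMS = {"a/c": "aircraft", "aeroplane": "aircraft", "rwy": "runway"}

def refine(text: str):
    tokens = [t.lower().strip(".,") for t in text.split()]
    tokens = [SYNONYMS.get(t, t) for t in tokens]             # map synonyms first
    return [t for t in tokens if t and t not in STOP_WORDS]   # then drop stop words

report = "The a/c overran the rwy during landing."
print(refine(report))   # ['aircraft', 'overran', 'runway', 'landing']
```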

  17. Acquisition and processing of proportional chamber data

    International Nuclear Information System (INIS)

    Kozhevnikov, Yu.A.

    1987-01-01

    A data acquisition unit for proportional chambers is described which can select data belonging to individual groups of simultaneously triggered adjacent channels (clusters) and encodes them into a format suitable for further processing. The unit is built as a standard CAMAC module of double width, was designed to operate with hardware reading significant data only, and can serve up to 8192 detection channels. The unit can be tested independently and can operate with an external storage without connection to the crate bus. Two types of errors associated with data reception are detected and diagnosed in the course of data acquisition
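    The cluster selection described above, grouping simultaneously triggered adjacent channels, can be mimicked in software by splitting the sorted list of fired channel numbers at gaps. This is only an illustrative model of the logic, not the CAMAC hardware implementation:

```python
# Illustrative sketch of the clustering step described above: grouping the
# numbers of simultaneously triggered adjacent channels into clusters.

def find_clusters(channels):
    """Group fired channel numbers into runs of adjacent channels."""
    clusters, current = [], []
    for ch in sorted(channels):
        if current and ch != current[-1] + 1:   # a gap ends the current cluster
            clusters.append(current)
            current = []
        current.append(ch)
    if current:
        clusters.append(current)
    return clusters

fired = [3, 4, 5, 17, 42, 43]
print(find_clusters(fired))   # [[3, 4, 5], [17], [42, 43]]
```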

  18. Digital processing of ionospheric electron content data

    Science.gov (United States)

    Bernhardt, P. A.

    1979-01-01

    Ionospheric electron content data contain periodicities that are produced by a diversity of sources including hydromagnetic waves, gravity waves, and lunar tides. Often these periodicities are masked by the strong daily variation in the data. Digital filtering can be used to isolate the weaker components. The filtered data can then be further processed to provide estimates of the source properties. In addition, homomorphic filtering may be used to identify nonlinear interactions in the ionosphere.
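    As a hedged illustration of the digital filtering step described above (synthetic data, an assumed 5-minute sampling interval, and an arbitrary Butterworth design rather than the author's filters), the strong daily variation can be suppressed with a high-pass filter so that weaker periodicities stand out:

```python
# Illustrative sketch of the digital filtering step described above: removing
# the strong daily variation from an electron-content time series with a
# Butterworth high-pass filter so that weaker periodicities stand out.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 1 / 300.0                        # assumed: one sample every 5 minutes, in Hz
t = np.arange(0, 3 * 86400, 300.0)    # three days of synthetic data
tec = 20 + 10 * np.sin(2 * np.pi * t / 86400)   # dominant daily variation
tec += 0.5 * np.sin(2 * np.pi * t / 3600)       # weaker 1-hour oscillation

# High-pass at a 6-hour period to suppress the diurnal component.
cutoff = 1 / (6 * 3600.0)
b, a = butter(4, cutoff / (fs / 2), btype="highpass")
filtered = filtfilt(b, a, tec)
print(filtered.std(), tec.std())      # filtered series retains mostly the 1-hour wave
```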

  19. Extending Ozone Data Processing to the Community

    Science.gov (United States)

    Tilmes, C. A.; Durbin, P. B.; Soika, V.; Martin, A. T.

    2006-12-01

    The Ozone Monitoring Instrument Science Investigator-led Processing System (OMI SIPS) has been the central data processing system for OMI since its launch on NASA's Aura spacecraft in July, 2005. As part of NASA's evolution from mission based processing to measurement based processing, we are evolving the system into a Community Oriented Measurement-based Processing System (ComPS). This involves changing focus from the mission (OMI) to the measurement (total column ozone), and a widening of our focus from the mission science teams to the overall scientific community. The current system takes software developed by scientists, dispatches and executes the software on a compute cluster, archives the results and distributes them to numerous parties. Although this works well for the production environment, access to centralized system has been naturally limited. Ideally, scientists should be able to easily get the data, run their software, make changes and repeat the process until they are happy with the solution to the problems they are trying to solve. In addition it should be simple to migrate research improvements from the community back into the formal production system. Through NASA's "Advancing Collaborative Connections for Earth-Sun System Science," we have extended publicly accessible interfaces into the production system. The system provides an open API via a set of SOAP/XML and REST based web services, enabling scientists, researchers and operators to interact directly with the data and services offered by the central system. The system includes metadata, archive, and planner subsystems. The metadata server stores metadata and provides the ability for processing software to evaluate production rules to determine the appropriate input data files for a given data processing job. The archive server stores the data files themselves and makes it easy for clients to retrieve the files as needed. The planner plans out the set of jobs to be run in the production

  20. Validation of the ICU-DaMa tool for automatically extracting variables for minimum dataset and quality indicators: The importance of data quality assessment.

    Science.gov (United States)

    Sirgo, Gonzalo; Esteban, Federico; Gómez, Josep; Moreno, Gerard; Rodríguez, Alejandro; Blanch, Lluis; Guardiola, Juan José; Gracia, Rafael; De Haro, Lluis; Bodí, María

    2018-04-01

    Big data analytics promise insights into healthcare processes and management, improving outcomes while reducing costs. However, data quality is a major challenge for reliable results. Business process discovery techniques and an associated data model were used to develop a data management tool, ICU-DaMa, for extracting variables essential for overseeing the quality of care in the intensive care unit (ICU). To determine the feasibility of using ICU-DaMa to automatically extract variables for the minimum dataset and ICU quality indicators from the clinical information system (CIS). The Wilcoxon signed-rank test and Fisher's exact test were used to compare the values extracted from the CIS with ICU-DaMa for 25 variables from all patients attended in a polyvalent ICU during a two-month period against the gold standard of values manually extracted by two trained physicians. Discrepancies with the gold standard were classified into plausibility, conformance, and completeness errors. Data from 149 patients were included. Although there were no significant differences between the automatic method and the manual method, we detected differences in values for five variables, including one plausibility error and two conformance and completeness errors. Plausibility: 1) Sex, ICU-DaMa incorrectly classified one male patient as female (an error generated by the hospital's Admissions Department). Conformance: 2) Reason for isolation, ICU-DaMa failed to detect a human error in which a professional misclassified a patient's isolation. 3) Brain death, ICU-DaMa failed to detect another human error in which a professional likely entered two mutually exclusive values related to the death of the patient (brain death and controlled donation after circulatory death). Completeness: 4) Destination at ICU discharge, ICU-DaMa incorrectly classified two patients because a professional failed to fill out the patient discharge form when the patients died. 5) Length of continuous renal replacement

  1. Automatic detection of subglacial lakes in radar sounder data acquired in Antarctica

    Science.gov (United States)

    Ilisei, Ana-Maria; Khodadadzadeh, Mahdi; Dalsasso, Emanuele; Bruzzone, Lorenzo

    2017-10-01

    Subglacial lakes decouple the ice sheet from the underlying bedrock, thus facilitating the sliding of the ice masses towards the borders of the continents and consequently raising the sea level. This has motivated increasing attention to the detection of subglacial lakes. So far, about 70% of the total number of subglacial lakes in Antarctica have been detected by analysing radargrams acquired by radar sounder (RS) instruments. Although the amount of radargrams is expected to increase drastically, from both airborne and possible future Earth observation RS missions, currently the main approach to the detection of subglacial lakes in radargrams is visual interpretation. This approach is subjective and extremely time consuming, and thus difficult to apply to a large amount of radargrams. In order to address the limitations of visual interpretation and to assist glaciologists in better understanding the relationship between the subglacial environment and the climate system, in this paper we propose a technique for the automatic detection of subglacial lakes. The main contribution of the proposed technique is the extraction of features for discriminating between lake and non-lake basal interfaces. In particular, we propose the extraction of features that locally capture the topography of the basal interface, the shape and the correlation of the basal waveforms. Then, the extracted features are given as input to a supervised binary classifier based on Support Vector Machine to perform the automatic subglacial lake detection. The effectiveness of the proposed method is proven both quantitatively and qualitatively by applying it to a large dataset acquired in East Antarctica by the MultiChannel Coherent Radar Depth Sounder.
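
    A minimal sketch of the supervised classification stage, assuming the topography and waveform features have already been extracted into a feature matrix; the features, labels, and parameters below are placeholders, not the authors' implementation.

    ```python
    # Sketch of the supervised lake / non-lake classification step, assuming a
    # precomputed feature matrix (e.g., local basal topography, waveform shape and
    # correlation features). Feature extraction itself is not shown.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))                      # placeholder features per basal-interface sample
    y = (X[:, 0] + 0.5 * X[:, 3] > 0.8).astype(int)    # placeholder lake (1) / non-lake (0) labels

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))
    ```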

  2. Automatic first-break picking using the instantaneous traveltime attribute

    KAUST Repository

    Saragiotis, Christos

    2012-01-01

    Picking the first breaks is an important step in seismic processing. The large volume of seismic data calls for automatic and objective picking. We introduce a new automatic first-break picker, which uses specifically designed time windows and an iterative procedure based on the instantaneous traveltime attribute. The method is fast as it only uses a few FFTs per trace. We demonstrate the effectiveness of this automatic method by applying it to real test data.
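
    The abstract does not detail the attribute computation; as a hedged illustration of a frequency-domain, FFT-based picker in the same spirit, the sketch below estimates an arrival time from the power-weighted group delay of a windowed trace and refines it over shrinking windows. It is a sketch only, not the published instantaneous-traveltime algorithm.

    ```python
    # Hedged sketch: iterative first-break estimate from the power-weighted group
    # delay of a windowed trace. This illustrates the idea of an attribute-based,
    # FFT-driven picker; it is not the published instantaneous-traveltime algorithm.
    import numpy as np

    def group_delay_pick(trace, dt):
        """Power-weighted average group delay of a trace segment, in seconds."""
        n = len(trace)
        X = np.fft.rfft(trace * np.hanning(n))
        phase = np.unwrap(np.angle(X))
        domega = 2.0 * np.pi / (n * dt)
        gd = -np.gradient(phase, domega)       # group delay per frequency bin
        power = np.abs(X) ** 2
        return float(np.sum(gd * power) / np.sum(power))

    def iterative_pick(trace, dt, n_iter=5):
        """Refine the pick by re-centring a shrinking window on the current estimate."""
        start, end = 0, len(trace)
        pick = group_delay_pick(trace[start:end], dt)
        for _ in range(n_iter):
            half = max(int(0.25 * (end - start)), 16)
            centre = int(pick / dt)
            start = max(centre - half, 0)
            end = min(centre + half, len(trace))
            pick = start * dt + group_delay_pick(trace[start:end], dt)
        return pick

    if __name__ == "__main__":
        dt = 0.002
        t = np.arange(2000) * dt
        trace = np.random.default_rng(1).normal(scale=0.05, size=t.size)
        late = t > 1.0                          # synthetic arrival at 1.0 s
        trace[late] += np.sin(2 * np.pi * 30 * (t[late] - 1.0)) * np.exp(-5 * (t[late] - 1.0))
        print("estimated first break (s):", iterative_pick(trace, dt))
    ```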

  3. Information-management data base for fusion-target fabrication processes

    International Nuclear Information System (INIS)

    Reynolds, J.

    1982-01-01

    A computer-based data-management system has been developed to handle data associated with target-fabrication processes including glass microballoon characterization, gas filling, materials coating, and storage locations. The system provides automatic data storage and computation, flexible data-entry procedures, fast access, automated report generation, and secure data transfer. It resides on a CDC CYBER 175 computer and is compatible with the CDC data-base language Query Update, but is based on custom FORTRAN software interacting directly with the CYBER's file-management system. The described data base maintains detailed, accurate, and readily available records of fusion-target information

  4. Performance assessment of a data processing chain for THz imaging

    Science.gov (United States)

    Catapano, Ilaria; Ludeno, Giovanni; Soldovieri, Francesco

    2017-04-01

    Nowadays, TeraHertz (THz) imaging is attracting considerable attention as a very high resolution diagnostic tool in many application fields, among which are security, cultural heritage, material characterization and civil engineering diagnostics. This widespread use of THz waves is due to their non-ionizing nature, their capability of penetrating into non-metallic opaque materials, as well as to the technological advances which have allowed the commercialization of compact, flexible and portable systems. However, the effectiveness of THz imaging depends strongly on the adopted data processing aimed at improving the imaging performance of the hardware device. In particular, data processing is required to mitigate detrimental and unavoidable effects like noise and signal attenuation, as well as to correct for the sample surface topography. With respect to data processing, we have recently proposed a strategy involving three different steps aimed at reducing noise, filtering out undesired signal introduced by the adopted THz system and performing surface topography correction [1]. The first step concerns noise filtering and exploits a procedure based on the Singular Value Decomposition (SVD) [2] of the data matrix, which does not require knowledge of the noise level and does not involve the use of a reference signal. The second step aims at removing the undesired signal that we have found to be introduced by the adopted Z-Omega Fiber-Coupled Terahertz Time Domain (FICO) system. Indeed, when the system works in high-speed mode, an undesired low amplitude peak always occurs at the same time instant from the beginning of the observation time window and needs to be removed from the useful data matrix in order to avoid a wrong interpretation of the imaging results. The third step of the considered data processing chain is a topographic correction, which is needed in order to properly image the sample surface and its inner structure. Such a procedure performs an automatic alignment of the
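
    A minimal sketch of the first step (SVD-based noise filtering), assuming the THz B-scan is arranged as a matrix of traces; the retained rank and the data are placeholders, and this is not the published processing chain.

    ```python
    # Sketch of SVD-based denoising of a THz data matrix (rows = observation time
    # samples, columns = scan positions). Keeping only the leading singular
    # components suppresses incoherent noise; the retained rank is a free choice.
    import numpy as np

    def svd_denoise(data, keep):
        """Low-rank approximation of the data matrix, keeping `keep` singular values."""
        U, s, Vt = np.linalg.svd(data, full_matrices=False)
        s_trunc = np.zeros_like(s)
        s_trunc[:keep] = s[:keep]
        return (U * s_trunc) @ Vt

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clean = np.outer(np.sin(np.linspace(0, 6 * np.pi, 512)), np.ones(200))
        noisy = clean + rng.normal(scale=0.3, size=clean.shape)
        denoised = svd_denoise(noisy, keep=3)
        print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
    ```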

  5. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer server networks, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method that addresses node-traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).
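
    As a hedged sketch of the underlying idea (not the authors' implementation), the snippet below normalises each channel, estimates the dominant coherent component of the array by SVD, and flags channels that barely project onto it, which is one simple way to exploit the low dimensionality of coherent array data.

    ```python
    # Sketch: flag array channels that do not participate in the dominant signal
    # subspace. Each channel is normalised to unit energy, the leading principal
    # component (dominant coherent waveform) is estimated via SVD, and channels
    # with weak projection onto it are flagged. Synthetic data, arbitrary threshold.
    import numpy as np

    def flag_bad_channels(data, min_corr=0.3):
        """data: (n_channels, n_samples). Returns indices of channels with a weak
        projection onto the dominant coherent component of the array."""
        X = data - data.mean(axis=1, keepdims=True)
        X = X / np.linalg.norm(X, axis=1, keepdims=True)   # unit-energy channels
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        dominant = Vt[0]                                    # leading waveform across the array
        corr = np.abs(X @ dominant)                         # projection of each channel onto it
        return np.where(corr < min_corr)[0]

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        t = np.arange(4000) / 100.0
        common = np.sin(2 * np.pi * 1.5 * t)
        good = [common + 0.1 * rng.normal(size=t.size) for _ in range(9)]
        bad = 2.0 * rng.normal(size=t.size)                 # incoherent (malfunctioning) channel
        data = np.vstack(good + [bad])
        print("suspect channels:", flag_bad_channels(data)) # expected: [9]
    ```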

  6. Optimizing ISOCAM data processing using spatial redundancy

    Science.gov (United States)

    Miville-Deschênes, M.-A.; Boulanger, F.; Abergel, A.; Bernard, J.-P.

    2000-11-01

    Several instrumental effects of the Long Wavelength channel of ISOCAM, the camera on board the Infrared Space Observatory, degrade the processed images. We present new data-processing techniques that correct these defects, taking advantage of the fact that a position in the sky has been observed by several pixels at different times. We use this redundant information (1) to correct the long-term variation of the detector response, (2) to correct memory effects after glitches and point sources, and (3) to refine the deglitching process. As an example we have applied our processing to the gamma-ray burst observation GRB 970402. Our new data-processing techniques allow the detection of faint extended emission with contrast smaller than 1% of the zodiacal background. The data reduction corrects instrumental effects to the point where the noise in the final map is dominated by the readout and the photon noises. All raster ISOCAM observations can benefit from the data processing described here. This includes mapping of solar system extended objects (comet dust trails), nearby clouds and star forming regions, images from diffuse emission in the Galactic plane and external galaxies. These techniques could also be applied to other raster type observations (e.g. ISOPHOT). Based on observations with ISO, an ESA project with instruments funded by ESA Member States (especially the PI countries: France, Germany, The Netherlands and the UK) and with the participation of ISAS and NASA.
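
    As a rough sketch of how spatial redundancy can be exploited (not the published ISOCAM pipeline), the snippet below separates a per-pixel multiplicative response from the sky signal by iterating median estimates over all pixels that observed the same sky position; the data layout and values are invented.

    ```python
    # Hedged sketch: exploit spatial redundancy (the same sky position seen by
    # several pixels at different times) to separate a per-pixel response factor
    # from the sky signal. Invented data; not the published ISOCAM algorithms.
    import numpy as np

    def estimate_response(pixel_id, sky_id, value, n_pixels, n_sky, n_iter=3):
        response = np.ones(n_pixels)
        sky = np.ones(n_sky)
        for _ in range(n_iter):
            # Current sky estimate: median over all pixels that saw each position.
            for s in range(n_sky):
                sel = sky_id == s
                sky[s] = np.median(value[sel] / response[pixel_id[sel]])
            # Per-pixel response: median ratio of measured to estimated sky signal.
            for p in range(n_pixels):
                sel = pixel_id == p
                response[p] = np.median(value[sel] / sky[sky_id[sel]])
        # Fix the overall multiplicative scale (degenerate between response and sky).
        scale = response.mean()
        return response / scale, sky * scale

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        n_pixels, n_sky, n_obs = 32, 20, 2000
        true_resp = 1.0 + 0.2 * rng.normal(size=n_pixels)
        true_sky = rng.uniform(1.0, 5.0, size=n_sky)
        pixel_id = rng.integers(0, n_pixels, size=n_obs)
        sky_id = rng.integers(0, n_sky, size=n_obs)
        value = true_resp[pixel_id] * true_sky[sky_id] * (1 + 0.02 * rng.normal(size=n_obs))
        resp, _ = estimate_response(pixel_id, sky_id, value, n_pixels, n_sky)
        print("max relative response error:", np.max(np.abs(resp / true_resp - 1)))
    ```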

  7. Data processing with Pymicra, the Python tool for Micrometeorological Analyses

    Science.gov (United States)

    Chor, T. L.; Dias, N. L.

    2017-12-01

    With the ever-increasing capability of instrumentation to collect high-frequency turbulence data, micrometeorological experiments are now generating significant amounts of data. Clearly, data processing -- and not data collection anymore -- has become the limiting factor for those very large data sets. The ability to extract useful scientific information from those experiments, therefore, hinges on tools that (i) are able to process those data effectively and accurately, (ii) are flexible enough to be adapted to the specific requirements of each investigation, and (iii) are robust enough to make data analysis easily reproducible across different large data sets. We have developed a framework for micrometeorological data analysis called Pymicra which delivers such capabilities while keeping the investigator close to the data. It is fully written in an open-source, very high level language, Python, which has been gaining widespread acceptance as a scientific tool. It follows the philosophy of "not reinventing the wheel" and, as a result, relies on existing well-established open-source Python packages such as Numpy and Pandas. Thus, minimum effort is needed to program statistics, array processing, Fourier analysis, etc. Among the things that Pymicra does are reading and organizing data from virtually any format, applying common quality control procedures, extracting fluctuations in a number of ways, correcting for sensor drift, automatic calculation of fluid properties (such as air and dry air density), handling of units, calculation of cross-spectra, calculation of turbulent fluxes and scales, and all other features already provided by Pandas (interpolation, statistical tests, handling of missing data, etc.). Pymicra is freely available on Github and the fact that it makes heavy use of high-level programming makes adding and modifying code considerably easy for any scientific programmer, making it straightforward for other scientists to
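
    As a hedged illustration of the kind of computation such a framework automates (this does not use Pymicra's own API), the sketch below uses pandas to extract turbulent fluctuations as departures from a block mean and to form an eddy-covariance flux from synthetic high-frequency data.

    ```python
    # Hedged sketch of a basic micrometeorological calculation with pandas:
    # extract turbulent fluctuations as departures from a block mean and form an
    # eddy-covariance flux (here w'T', a kinematic heat flux). Synthetic data;
    # this does not use Pymicra's own API.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    fs = 20                                  # sampling frequency (Hz)
    n = fs * 60 * 30                         # 30 minutes of data
    index = pd.date_range("2017-01-01 12:00", periods=n, freq=pd.Timedelta(seconds=1 / fs))

    w = rng.normal(scale=0.3, size=n)        # vertical wind fluctuations (m/s)
    T = 25.0 + 0.5 * w + rng.normal(scale=0.1, size=n)   # temperature correlated with w (degC)
    df = pd.DataFrame({"w": w, "T": T}, index=index)

    block = df.mean()                        # 30-min block averages
    fluct = df - block                       # fluctuations w', T'
    wT = (fluct["w"] * fluct["T"]).mean()    # kinematic heat flux w'T' (K m/s)
    print(f"w'T' = {wT:.4f} K m/s")
    ```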

  8. Processing of logging data from nuclear tools

    International Nuclear Information System (INIS)

    Oelgaard, P.L.; Petersen, R.

    1988-07-01

    When raw data, e.g. counts or count rates, have been obtained with nuclear logging tools, they have to be processed in order to yield the desired formation properties. Little information is available on this processing. However, it is necessary to understand the processing procedure in order to be able to evaluate its physical correctness and to understand which corrections are involved. For this reason an analysis has been performed of two sets of field data obtained with neutron porosity and gamma density tools and one data set obtained with a pulsed-neutron capture-gamma tool. Through this analysis, full insight into the processing of data from the neutron porosity and gamma density tools is believed to have been achieved. The same is not quite the case for the pulsed-neutron capture-gamma tool, possibly due to a lack of the necessary data. The analysis has also revealed doubt about the physical correctness of some features of the processing procedure. (author)

  9. Parkinson's disease disrupts both automatic and controlled processing of action verbs.

    Science.gov (United States)

    Fernandino, Leonardo; Conant, Lisa L; Binder, Jeffrey R; Blindauer, Karen; Hiner, Bradley; Spangler, Katie; Desai, Rutvik H

    2013-10-01

    The problem of how word meaning is processed in the brain has been a topic of intense investigation in cognitive neuroscience. While considerable correlational evidence exists for the involvement of sensory-motor systems in conceptual processing, it is still unclear whether they play a causal role. We investigated this issue by comparing the performance of patients with Parkinson's disease (PD) with that of age-matched controls when processing action and abstract verbs. To examine the effects of task demands, we used tasks in which semantic demands were either implicit (lexical decision and priming) or explicit (semantic similarity judgment). In both tasks, PD patients' performance was selectively impaired for action verbs (relative to controls), indicating that the motor system plays a more central role in the processing of action verbs than in the processing of abstract verbs. These results argue for a causal role of sensory-motor systems in semantic processing. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Automatic tracking of dynamical evolutions of oceanic mesoscale eddies with satellite observation data

    Science.gov (United States)

    Sun, Liang; Li, Qiu-Yang

    2017-04-01

    The oceanic mesoscale eddies play a major role in the ocean climate system. To analyse the spatiotemporal dynamics of oceanic mesoscale eddies, the Genealogical Evolution Model (GEM) based on satellite data is developed; it is an efficient logical model used to track the dynamic evolution of mesoscale eddies in the ocean. It can distinguish different dynamic processes (e.g., merging and splitting) within a dynamic evolution pattern, which is difficult to accomplish using other tracking methods. To this end, a mononuclear eddy detection method was first developed with simple segmentation strategies, e.g. the watershed algorithm. The algorithm is very fast because it searches along the steepest descent path. Second, the GEM uses a two-dimensional similarity vector (i.e. a pair of ratios of the overlap area between two eddies to the area of each eddy) rather than a scalar to measure the similarity between eddies, which effectively solves the "missing eddy" problem (an eddy temporarily lost during tracking). Third, for tracking when an eddy splits, GEM uses both "parent" (the original eddy) and "child" (eddy split from the parent), and the dynamic processes are described as the birth and death of different generations. Additionally, a new look-ahead approach with selection rules effectively simplifies computation and recording. All of the computational steps are linear and do not include iteration. Given the pixel number of the target region L, the maximum number of eddies M, the number N of look-ahead time steps, and the total number of time steps T, the total computer time is O(LM(N+1)T). The tracking of each eddy is very smooth because we require that the snapshots of each eddy on adjacent days overlap one another. Although eddy splitting or merging is ubiquitous in the ocean, they have different geographic distributions in the Northern Pacific Ocean. Both the merging and splitting rates of the eddies are high, especially at the western boundary, in currents and in "eddy deserts". GEM is useful not only for
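
    A minimal sketch of the two-dimensional similarity vector described above, assuming eddies are represented as boolean pixel masks on a common grid; this is an illustration of the overlap ratios only, not the GEM code.

    ```python
    # Sketch: the two-dimensional similarity vector between two eddy snapshots,
    # i.e. the ratio of their overlap area to the area of each eddy. Eddies are
    # assumed to be boolean masks on a common grid; this is not the GEM source.
    import numpy as np

    def similarity_vector(mask_a, mask_b):
        """Return (overlap/area_a, overlap/area_b) for two boolean eddy masks."""
        overlap = np.logical_and(mask_a, mask_b).sum()
        return overlap / mask_a.sum(), overlap / mask_b.sum()

    if __name__ == "__main__":
        grid = np.zeros((100, 100), dtype=bool)
        a = grid.copy()
        a[20:50, 20:50] = True        # eddy snapshot on day t
        b = grid.copy()
        b[30:60, 25:55] = True        # candidate eddy snapshot on day t+1
        r_a, r_b = similarity_vector(a, b)
        print(f"overlap fraction of A: {r_a:.2f}, of B: {r_b:.2f}")
        # Both ratios above a chosen threshold would indicate the same eddy
        # continuing; markedly asymmetric ratios can indicate splitting or merging.
    ```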

  11. Automatic Test Data Generation Using Data Flow Information = Veri Akışı Bilgisi Kullanılarak Otomatik Test Verisi Üretimi

    Directory of Open Access Journals (Sweden)

    Rana ABDELAZIZ

    2000-06-01

    This paper presents a tool for automatically generating test data for Pascal programs that satisfies the data flow criteria. Unlike existing tools, our tool is not limited to Pascal programs whose program flow graph contains read statements in only one node, but rather deals with read statements appearing in any node in the program flow graph. Moreover, our tool handles loops and arrays; these two features are traditionally difficult to handle in test data generation systems. This allows us to generate tests for larger programs than those previously reported in the literature.

  12. A fully automatic tool to perform accurate flood mapping by merging remote sensing imagery and ancillary data

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco; Pasquariello, Guido

    2016-04-01

    Flooding is one of the most frequent and expensive natural hazards. High-resolution flood mapping is an essential step in the monitoring and prevention of inundation hazard, both to gain insight into the processes involved in the generation of flooding events, and from the practical point of view of the precise assessment of inundated areas. Remote sensing data are recognized to be useful in this respect, thanks to the high resolution and regular revisit schedules of state-of-the-art satellites, moreover offering a synoptic overview of the extent of flooding. In particular, Synthetic Aperture Radar (SAR) data present several favorable characteristics for flood mapping, such as their relative insensitivity to the meteorological conditions during acquisitions, as well as the possibility of acquiring independently of solar illumination, thanks to the active nature of the radar sensors [1]. However, flood scenarios are typical examples of complex situations in which different factors have to be considered to provide accurate and robust interpretation of the situation on the ground: the presence of many land cover types, each one with a particular signature in the presence of flood, requires modelling the behavior of different objects in the scene in order to associate them with flood or no-flood conditions [2]. Generally, the fusion of multi-temporal, multi-sensor, multi-resolution and/or multi-platform Earth observation image data, together with other ancillary information, seems to have a key role in the pursuit of a consistent interpretation of complex scenes. In the case of flooding, distance from the river, terrain elevation, hydrologic information or some combination thereof can add useful information to remote sensing data. Suitable methods, able to manage and merge different kinds of data, are therefore particularly needed. In this work, a fully automatic tool, based on Bayesian Networks (BNs) [3] and able to perform data fusion, is presented. It supplies flood maps
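
    As a hedged sketch of probabilistic fusion in this spirit (a naive-Bayes simplification with invented likelihood parameters, not the authors' Bayesian network), the snippet below combines SAR backscatter, elevation, and distance from the river into a per-pixel posterior probability of flooding.

    ```python
    # Hedged sketch: naive-Bayes fusion of three per-pixel features (SAR
    # backscatter, elevation above the river, distance from the river) into a
    # posterior probability of flooding. Likelihood parameters are invented; the
    # published tool uses a full Bayesian network, which this sketch does not model.
    import numpy as np
    from scipy import stats

    def flood_posterior(backscatter_db, elev_m, dist_m, prior_flood=0.2):
        # Class-conditional likelihoods (invented Gaussian parameters for illustration).
        like_flood = (stats.norm.pdf(backscatter_db, loc=-18, scale=3) *
                      stats.norm.pdf(elev_m, loc=1, scale=2) *
                      stats.norm.pdf(dist_m, loc=200, scale=300))
        like_dry = (stats.norm.pdf(backscatter_db, loc=-8, scale=3) *
                    stats.norm.pdf(elev_m, loc=10, scale=6) *
                    stats.norm.pdf(dist_m, loc=1500, scale=1000))
        num = prior_flood * like_flood
        den = num + (1 - prior_flood) * like_dry
        return num / den

    if __name__ == "__main__":
        # One smooth, low, river-adjacent pixel and one rough, elevated, distant pixel.
        print(flood_posterior(np.array([-19.0, -7.0]),
                              np.array([0.5, 12.0]),
                              np.array([150.0, 2500.0])))
    ```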

  13. Automatic Generation of Assembly Sequence for the Planning of Outfitting Processes in Shipbuilding

    NARCIS (Netherlands)

    Wei, Y.

    2012-01-01

    The most important characteristics of the outfitting processes in shipbuilding are: 1. The processes involve many interferences between the yard and different subcontractors. In recent years, the use of outsourcing and subcontracting has become a widespread strategy of western shipyards. There exists

  14. Task Versus Component Consistency in the Development of Automatic Processes: Consistent Attending Versus Consistent Responding.

    Science.gov (United States)

    1982-03-01


  15. Data processing system of GA and PPPL

    International Nuclear Information System (INIS)

    Oshima, Takayuki

    2001-11-01

    Results of a 1997 research visit to General Atomics (GA) and the Princeton Plasma Physics Laboratory (PPPL) are reported. The author visited the computer systems of the fusion group at GA. He joined the DIII-D tokamak experiment, in particular a demonstration of remote experimentation within the U.S., and investigated the DIII-D data processing system, its computer network, and related facilities. After the visit to GA, he visited PPPL and exchanged information on equipment for remote experiments between JAERI and PPPL under the US-Japan fusion energy research cooperation. He also investigated the data processing system of the TFTR tokamak, its computer network, and related systems. Results of a second visit to GA in 2000 are also reported, describing the rapid progress of the data processing equipment driven by advances in computer technology over just three years. (author)

  16. Gas chromatography - mass spectrometry data processing made easy.

    Science.gov (United States)

    Johnsen, Lea G; Skou, Peter B; Khakimov, Bekzod; Bro, Rasmus

    2017-06-23

    Evaluation of GC-MS data may be challenging due to the high complexity of data including overlapped, embedded, retention-time-shifted and low S/N ratio peaks. In this work, we demonstrate a new approach, the PARAFAC2 based Deconvolution and Identification System (PARADISe), for processing raw GC-MS data. PARADISe is a platform-independent, freely available software package incorporating a number of newly developed algorithms in a coherent framework. It offers a solution for analysts dealing with complex chromatographic data. It allows extraction of chemical/metabolite information directly from the raw data. Using PARADISe requires only a few inputs from the analyst to process GC-MS data and subsequently convert raw netCDF data files into a compiled peak table. Furthermore, the method is generally robust towards minor variations in the input parameters. The method automatically performs peak identification based on deconvoluted mass spectra using the integrated NIST search engine and generates an identification report. In this paper, we compare PARADISe with AMDIS and ChromaTOF in terms of peak quantification and show that PARADISe is more robust to user-defined settings and that these are easier (and much fewer) to set. PARADISe is based on non-proprietary, scientifically evaluated approaches and we show here that PARADISe can handle more overlapping signals and lower signal-to-noise peaks, and do so in a manner that requires only about an hour's worth of work regardless of the number of samples. We also show that there are no non-detects in PARADISe, meaning that all compounds are detected in all samples. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
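
    As a heavily simplified sketch of the deconvolution idea (a non-negative bilinear factorisation of one elution window for a single sample, not the PARAFAC2 model across samples and not the PARADISe code), the snippet below separates two co-eluting components into elution profiles and mass spectra.

    ```python
    # Hedged sketch: non-negative alternating least squares on one GC-MS elution
    # window, factorising the (scans x m/z) matrix into elution profiles times
    # mass spectra. PARADISe uses PARAFAC2 across samples; this single-sample
    # bilinear factorisation only illustrates the deconvolution idea.
    import numpy as np

    def nn_als(X, n_components, n_iter=200, eps=1e-9):
        rng = np.random.default_rng(0)
        C = rng.random((X.shape[0], n_components))      # elution profiles (scans x k)
        S = rng.random((X.shape[1], n_components))      # mass spectra (m/z x k)
        for _ in range(n_iter):
            C = np.clip(X @ S @ np.linalg.pinv(S.T @ S), eps, None)
            S = np.clip(X.T @ C @ np.linalg.pinv(C.T @ C), eps, None)
        return C, S

    if __name__ == "__main__":
        scans = np.arange(100)
        c1 = np.exp(-0.5 * ((scans - 40) / 5.0) ** 2)   # two overlapping elution peaks
        c2 = np.exp(-0.5 * ((scans - 55) / 6.0) ** 2)
        rng = np.random.default_rng(1)
        s1, s2 = rng.random(80), rng.random(80)          # invented mass spectra
        X = np.outer(c1, s1) + np.outer(c2, s2) + 0.01 * rng.random((100, 80))
        C, S = nn_als(X, n_components=2)
        print("reconstruction error:", np.linalg.norm(X - C @ S.T) / np.linalg.norm(X))
    ```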

  17. Facilitating goal-oriented behaviour in the Stroop task: when executive control is influenced by automatic processing.

    Science.gov (United States)

    Parris, Benjamin A; Bate, Sarah; Brown, Scott D; Hodgson, Timothy L

    2012-01-01

    A portion of Stroop interference is thought to arise from a failure to maintain goal-oriented behaviour (or goal neglect). The aim of the present study was to investigate whether goal-relevant primes could enhance goal maintenance and reduce the Stroop interference effect. Here it is shown that primes related to the goal of responding quickly in the Stroop task (e.g. fast, quick, hurry) substantially reduced Stroop interference by reducing reaction times to incongruent trials but increasing reaction times to congruent and neutral trials. No effects of the primes were observed on errors. The effects on incongruent, congruent and neutral trials are explained in terms of the influence of the primes on goal maintenance. The results show that goal priming can facilitate goal-oriented behaviour and indicate that automatic processing can modulate executive control.

  18. An investigation of the Stroop effect among deaf signers in English and Japanese: automatic processing or memory retrieval?

    Science.gov (United States)

    Flaherty, Mary; Moran, Aidan

    2007-01-01

    Most studies on the Stroop effect (unintentional automatic word processing) have been restricted to English speakers using vocal responses. Little is known about this effect with deaf signers. The study compared Stroop task responses among four different samples: deaf participants from a Japanese-language environment and from an English-language environment; and hearing individuals from Japan and from Australia. Color words were prepared in both English and Japanese and were presented in three conditions: congruent (e.g., the word red printed in red), incongruent (e.g., red printed in blue), and neutral. The magnitude of the effect was greater with the deaf participants than with the hearing participants. The deaf individuals experienced more interference in English than in Japanese.

  19. Data collection and processing for the ACES

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, J.L.; Miller, D.R.

    1981-08-01

    The Annual Cycle Energy System demonstration house furnishes information that is collected, processed, and analyzed on a weekly schedule. The computer codes used for processing and analyses were designed to display collected data; to summarize the performance (mechanical) of the house for each week; to give a representation of external influences such as temperature, humidity ratio, and wind speed; and to aid in the dissemination of data to other users. Revisions and adjustments have been made to the codes to accommodate improvements made at the demonstration facility. The codes are written in either the FORTRAN IV or PL/I programming language. All programs in the system run on IBM 360 systems.

  20. An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters.

    Science.gov (United States)

    Behrens, F; Mackeben, M; Schröder-Preikschat, W

    2010-08-01

    This method for analysing time series of eye movements is a saccade-detection algorithm based on an earlier algorithm. It achieves substantial improvements by using an adaptive-threshold model instead of fixed thresholds and by using the eye-movement acceleration signal. This has four advantages: (1) Adaptive thresholds are calculated automatically from the preceding acceleration data for detecting the beginning of a saccade, and thresholds are modified during the saccade. (2) The monotonicity of the position signal during the saccade, together with the acceleration with respect to the thresholds, is used to reliably determine the end of the saccade. (3) This allows differentiation between saccades following the main sequence and non-main-sequence saccades. (4) Artifacts of various kinds can be detected and eliminated. The algorithm is demonstrated by applying it to human eye movement data (obtained by EOG) recorded while driving a car. A second demonstration of the algorithm detects microsleep episodes in eye movement data.
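
    A hedged sketch of an adaptive-threshold onset detector in the same spirit (not the published algorithm): the acceleration threshold is derived from the spread of the preceding acceleration data, and samples exceeding it are marked as saccade onsets; data and parameters are invented.

    ```python
    # Hedged sketch of adaptive-threshold saccade onset detection from an eye
    # position signal: velocity and acceleration are differentiated numerically,
    # and the onset threshold is a multiple of the acceleration spread estimated
    # from the preceding window. Synthetic data; not the published algorithm.
    import numpy as np

    def detect_saccade_onsets(position, fs, window_s=0.2, k=5.0):
        vel = np.gradient(position) * fs
        acc = np.gradient(vel) * fs
        win = int(window_s * fs)
        onsets = []
        i = win
        while i < len(acc):
            # Adaptive threshold from the spread of the preceding acceleration data.
            thresh = k * np.std(acc[i - win:i])
            if abs(acc[i]) > thresh:
                onsets.append(i)
                i += win                  # skip ahead past this saccade (crude)
            else:
                i += 1
        return np.array(onsets)

    if __name__ == "__main__":
        fs = 500.0
        t = np.arange(0, 2.0, 1 / fs)
        pos = np.zeros_like(t)
        pos[t >= 1.0] = 10.0 * (1 - np.exp(-(t[t >= 1.0] - 1.0) / 0.02))  # 10-deg saccade at t = 1 s
        pos += 0.05 * np.random.default_rng(5).normal(size=t.size)        # measurement noise
        onsets = detect_saccade_onsets(pos, fs)
        print("detected onsets (s):", onsets / fs)
    ```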