WorldWideScience

Sample records for automatic data collection systems

  1. Automatic Weissenberg data collection system for time-resolved protein crystallography

    CERN Document Server

    Sakabe, N; Higashi, T; Igarashi, N; Suzuki, M; Watanabe, N; Sasaki, K

    2001-01-01

    A totally new type of fully automatic Weissenberg data-collection system, called 'Galaxy', was developed and installed at the Photon Factory. The system consists of a rotated-inclined focusing monochromator, a screenless Weissenberg-type camera, an image reader, an eraser, a cassette transportation mechanism, a control console, and a safety system with a high-speed computer network linking the control console, data-processing computers and data servers. Its special characteristics are a Weissenberg camera with a fully cylindrical cassette that can be rotated to exchange frames, a maximum of 36 images recorded per IP cassette, and a very high speed IP reader with five reading heads. Since the frame-exchange time is only a few seconds, the system is applicable to time-resolved protein crystallography on a time scale of seconds to minutes.

  2. Automatic Identification And Data Collection Via Barcode Laser Scanning.

    Science.gov (United States)

    Jacobeus, Michel

    1986-07-01

    How do you earn over 100 million a year by investing 40 million? No, this is not the latest Wall Street "tip" but the cost savings obtained by the U.S. Department of Defense. Supermarkets claim 2% savings on annual turnover! Automotive companies report millions of dollars saved! These are not daydreams, but tangible results measured by users after implementing Automatic Identification and Data Collection systems based on bar codes. To paraphrase the famous sentence "I think, therefore I am", with AI/ADC systems "You know, therefore you are". Indeed, in today's world, immediate, accurate and precise information is a vital management need for companies' growth and survival. AI/ADC techniques fulfill these objectives by supplying the right information automatically, without delay or alteration.

  3. Design and implementation of automatic color information collection system

    Science.gov (United States)

    Ci, Wenjie; Xie, Kai; Li, Tong

    2015-12-01

    In liquid crystal display (LCD) colorimetric characterization, device-dependent RGB values need to be converted to a device-independent color space such as CIEXYZ or CIELab; that is, the relationship between RGB and CIE coordinates must be established from device color data and the corresponding CIE measurements. For this purpose, automatic color information collection software was designed. OpenGL is used to implement the full-screen display function, and a C++ program calls the Eyeone device library functions to perform device calibration, set the sample types, and carry out sampling and data storage. The software can automatically drive monitors or projectors to display the set of sample colors and collect the corresponding CIE values. The RGB values of the sample colors and the acquired CIE values are stored in a text document, which is convenient for later extraction and analysis. Taking a cubic polynomial model as an example, 17 sample sets per channel were collected with this system, along with 100 sets of test data, and the model was fitted by the least-squares method. The average color difference is around 2.4874, much lower than the commonly required level of 6.00 (CIEDE2000). The system saves sample-color data acquisition time and improves the efficiency of LCD colorimetric characterization.
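
    The characterization step described above is a plain least-squares fit. Below is a minimal sketch of that idea, assuming the per-channel RGB samples and measured CIEXYZ values are already loaded as arrays (all function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def poly3_features(rgb):
    """Expand normalized RGB values into per-channel cubic polynomial terms."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    ones = np.ones_like(r)
    # Constant term plus linear, quadratic and cubic terms per channel (illustrative basis).
    return np.column_stack([ones, r, g, b, r**2, g**2, b**2, r**3, g**3, b**3])

def fit_characterization(rgb_samples, xyz_samples):
    """Least-squares fit of the RGB -> CIEXYZ characterization model."""
    A = poly3_features(rgb_samples)
    coeffs, *_ = np.linalg.lstsq(A, xyz_samples, rcond=None)
    return coeffs  # shape (n_terms, 3): one column each for X, Y, Z

def predict_xyz(rgb, coeffs):
    return poly3_features(rgb) @ coeffs

# Usage: rgb_train (n, 3) in [0, 1] and xyz_train (n, 3) measured by the instrument.
# coeffs = fit_characterization(rgb_train, xyz_train)
# xyz_hat = predict_xyz(rgb_test, coeffs)
```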

  4. MAC, A System for Automatically IPR Identification, Collection and Distribution

    Science.gov (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. The music industry is one of the industries most affected by this, and it has suffered huge losses over the last few years as a result. Moreover, it also affects the way music rights collecting and distributing societies operate to ensure correct music IPR identification, collection and distribution. In this article a system for automating IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio-fingerprinting technology. The paper presents the details of the system and a use-case scenario in which it is being used.

  5. The Fully Automatic Data Collection System for Airborne Gamma-Ray Spectrometry

    Institute of Scientific and Technical Information of China (English)

    魏林; 曾国强; 葛良全

    2012-01-01

    A fully automatic airborne gamma-ray spectrometry data collection system, developed on the Advantech ACP-4001 integrated industrial control computer, is described. Global clock synchronization is driven by the GPS pulse-per-second (PPS) signal. A high-speed parallel port is used for bulk data transfer, a serial port obtains the GPS positioning information, and a PT100 platinum resistance temperature sensor acquires temperature values. The software, developed on the LabWindows/CVI and Visual C++ 6.0 platforms, implements the low-level data-acquisition dynamic link library (DLL) and a user-friendly interface. Experimental results show that the system records data accurately, quickly and completely, meeting the design goals.

  6. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  7. ATCOM: Automatically Tuned Collective Communication System for SMP Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Meng-Shiou [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    Conventional implementations of collective communications are based on point-to-point communications, and their optimizations have focused on the efficiency of those communication algorithms. However, point-to-point communications are not the optimal choice for modern computing clusters of SMPs due to their two-level communication structure. In recent years, a few research efforts have investigated efficient collective communications for SMP clusters. This dissertation focuses on platform-independent algorithms and implementations in this area. There are two main approaches to implementing efficient collective communications for clusters of SMPs: using shared-memory operations for intra-node communications, and overlapping inter-node and intra-node communications. The former fully utilizes the hardware-based shared memory of an SMP, and the latter takes advantage of the inherent hierarchy of communications within a cluster of SMPs. Previous studies focused on SMP clusters from particular vendors, and the methods they proposed are not portable to other systems. Because the performance-optimization issue is very complicated and the development process is very time-consuming, self-tuning, platform-independent implementations are highly desirable. As shown in this dissertation, such an implementation can significantly outperform other point-to-point based portable implementations and some platform-specific implementations. The dissertation describes in detail the architecture of the platform-independent implementation. There are four system components: shared-memory-based collective communications, overlapping mechanisms for inter-node and intra-node communications, a prediction-based tuning module and a micro-benchmark based tuning module. Each component is carefully designed with the goal of automatic tuning in mind.

  8. A geological and geophysical data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    Sudhakar, T.; Afzulpurkar, S.

    A geological and geophysical data collection system using a Personal Computer is described below. The system stores data obtained from various survey systems typically installed in a charter vessel and can be used for similar applications on any...

  9. The study of data collection method for the plasma properties collection and evaluation system from web

    Science.gov (United States)

    Park, Jun-Hyoung; Song, Mi-Young; Plasma Fundamental Technology Research Team

    2015-09-01

    Plasma databases are required to compute plasma parameters, and highly reliable databases are closely tied to the accuracy of simulations. Therefore, a major concern of the plasma properties collection and evaluation system is to create a sustainable and useful research environment for plasma data. The system is committed to providing not only numerical data but also bibliographic data (including DOI information). Originally, our data collection was done by manual search, and in some cases it took a long time to find data. We will find data more automatically and quickly than with legacy methods by using crawling or a search engine such as Lucene.

  10. Automatic fault detection on BIPV systems without solar irradiation data

    CERN Document Server

    Leloux, Jonathan; Luna, Alberto; Desportes, Adrien

    2014-01-01

    BIPV systems are small PV generation units spread out over the territory, and their characteristics are very diverse. This makes a cost-effective procedure for monitoring, fault detection, performance analysis, operation and maintenance difficult. As a result, many problems affecting BIPV systems go undetected. In order to carry out effective automatic fault detection procedures, we need a performance indicator that is reliable and that can be applied to many PV systems at a very low cost. The existing approaches for analyzing the performance of PV systems are often based on the Performance Ratio (PR), whose accuracy depends on good solar irradiation data, which in turn can be very difficult to obtain or cost-prohibitive for the BIPV owner. We present an alternative fault detection procedure based on a performance indicator that can be constructed on the sole basis of the energy production data measured at the BIPV systems. This procedure does not require the input of operating conditions data, such as solar ...
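
    As a rough illustration of a fault-detection indicator built only from measured energy production (a sketch of the general idea, not the authors' exact indicator), one can normalize each system's daily specific yield against the median of its peers and flag sustained shortfalls:

```python
import numpy as np

def peer_relative_performance(energy, capacity_kwp):
    """energy: (n_systems, n_days) daily kWh; capacity_kwp: (n_systems,) rated power.
    Returns an (n_systems, n_days) indicator close to 1.0 for healthy systems."""
    specific_yield = energy / capacity_kwp[:, None]            # kWh per kWp per day
    fleet_reference = np.nanmedian(specific_yield, axis=0)     # peer median for each day
    return specific_yield / fleet_reference

def flag_faults(indicator, threshold=0.8, min_days=3):
    """Flag systems whose indicator stays below threshold for min_days in a row."""
    below = indicator < threshold
    flags = []
    for sys_idx, row in enumerate(below):
        run = 0
        for day_idx, is_low in enumerate(row):
            run = run + 1 if is_low else 0
            if run >= min_days:
                flags.append((sys_idx, day_idx))
                break
    return flags
```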

  11. NLO error propagation exercise data collection system

    International Nuclear Information System (INIS)

    A combined automated and manual system for data collection is described. The system is suitable for collecting, storing, and retrieving data related to nuclear material control at a bulk processing facility. The system, which was applied to the NLO operated Feed Materials Production Center, was successfully demonstrated for a selected portion of the facility. The instrumentation consisted of off-the-shelf commercial equipment and provided timeliness, convenience, and efficiency in providing information for generating a material balance and performing error propagation on a sound statistical basis

  12. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Bowler, Matthew W., E-mail: mbowler@embl.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 avenue des Martyrs, F-38042 Grenoble (France); Université Grenoble Alpes-EMBL-CNRS, 71 avenue des Martyrs, F-38042 Grenoble (France); Nurizzo, Didier, E-mail: mbowler@embl.fr; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine [European Synchrotron Radiation Facility, 71 avenue des Martyrs, F-38043 Grenoble (France)

    2015-10-03

    MASSIF-1 (ID30A-1) is a new beamline dedicated to the completely automatic characterization and data collection from crystals of biological macromolecules. MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined.

  13. Automatic Boat Identification System for VIIRS Low Light Imaging Data

    Directory of Open Access Journals (Sweden)

    Christopher D. Elvidge

    2015-03-01

    The ability of satellite sensors to detect lit fishing boats has been known since the 1970s. However, the use of the observations has been limited by the lack of an automatic algorithm for reporting the location and brightness of offshore lighting features arising from boats. An examination of lit fishing boat features in Visible Infrared Imaging Radiometer Suite (VIIRS) day/night band (DNB) data indicates that the features are essentially spikes. We have developed a set of algorithms for automatic detection of spikes and characterization of the sharpness of spike features. A spike detection algorithm generates a list of candidate boat detections. A second algorithm measures the height of the spikes to discard ionospheric energetic particle detections and to rate boat detections as either strong or weak. A sharpness index is used to label boat detections that appear blurry due to the scattering of light by clouds. The candidate spikes are then filtered to remove features on land and gas flares. A validation study conducted using analyst-selected boat detections found that the automatic algorithm detected 99.3% of the reference pixel set. VIIRS boat detection data can provide fishery agencies with up-to-date information on fishing boat activity and changes in this activity in response to new regulations and enforcement regimes. The data can provide indications of illegal fishing activity in restricted areas and incursions across Exclusive Economic Zone (EEZ) boundaries. VIIRS boat detections occur widely offshore from East and Southeast Asia, South America and several other regions.
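
    As a toy illustration of the spike-detection and sharpness ideas described above (thresholds, neighbourhood size and labels are placeholder assumptions, not the published algorithm):

```python
import numpy as np

def detect_spikes(radiance, spike_ratio=10.0, strong_ratio=100.0):
    """radiance: 2-D DNB radiance array. Returns a list of (row, col, label)."""
    rows, cols = radiance.shape
    detections = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = radiance[i - 1:i + 2, j - 1:j + 2].copy()
            centre = window[1, 1]
            window[1, 1] = np.nan
            background = np.nanmedian(window)       # local background estimate
            if background <= 0:
                continue
            height = centre / background            # spike height above background
            if height > spike_ratio:
                label = "strong" if height > strong_ratio else "weak"
                detections.append((i, j, label))
    return detections

def sharpness_index(radiance, i, j):
    """Crude sharpness measure: centre pixel relative to the mean of its 8 neighbours.
    Low values suggest light blurred by clouds."""
    window = radiance[i - 1:i + 2, j - 1:j + 2].copy()
    centre = window[1, 1]
    window[1, 1] = np.nan
    return centre / np.nanmean(window)
```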

  14. VXIbus data collection system -- A design study

    International Nuclear Information System (INIS)

    The German support program has sponsored work to investigate the VXIbus as an integration platform for safeguards instrumentation. This paper covers the analysis of the user requirements for a VXIbus-based monitoring system for integrated safeguards, primarily for reliable unattended in-field collection of large amounts of data. The goal is to develop a suitable system architecture. The design of the system makes use of the VXIbus standard as the selected hardware platform. Based upon the requirements analysis and the overriding need for high reliability and robustness, a systematic investigation of different operating-system options, as well as development and integration tools, is considered. For the software implementation cycle, high- and low-level programming tools are required. The identification of the constraints on the programming platform and the tool selection are presented. The strategic approach, the rules for analysis and design work, and the executive components supporting the implementation and production cycle are all given. All the conditions for reliable, unattended and integrated safeguards monitoring systems are addressed, and the basic and advanced design principles are defined. The paper discusses the results of a study on a system produced to demonstrate a high-data-rate timer/counter application.

  15. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program, this number is expected to increase to over 100 systems in the coming years. The new owners of PV systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means of simplifying the collection of data. Four methods are defined: 1. Conversion of readings from energy meters, made at arbitrary intervals in time, into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods rely on equipment that is expected to be installed on some PV systems for other reasons. Method 4 offers the possibility of a fully automatic collection method. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD).

  16. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  17. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    Science.gov (United States)

    Bowler, Matthew W.; Nurizzo, Didier; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine; Caserotto, Hugo; Delagenière, Solange; Dobias, Fabian; Flot, David; Giraud, Thierry; Guichard, Nicolas; Guijarro, Mattias; Lentini, Mario; Leonard, Gordon A.; McSweeney, Sean; Oskarsson, Marcus; Schmidt, Werner; Snigirev, Anatoli; von Stetten, David; Surr, John; Svensson, Olof; Theveneau, Pascal; Mueller-Dieckmann, Christoph

    2015-01-01

    MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined. PMID:26524320

  18. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules.

    Science.gov (United States)

    Bowler, Matthew W; Nurizzo, Didier; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine; Caserotto, Hugo; Delagenière, Solange; Dobias, Fabian; Flot, David; Giraud, Thierry; Guichard, Nicolas; Guijarro, Mattias; Lentini, Mario; Leonard, Gordon A; McSweeney, Sean; Oskarsson, Marcus; Schmidt, Werner; Snigirev, Anatoli; von Stetten, David; Surr, John; Svensson, Olof; Theveneau, Pascal; Mueller-Dieckmann, Christoph

    2015-11-01

    MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined.

  19. System design of the METC automatic data acquisition and control system

    Energy Technology Data Exchange (ETDEWEB)

    Goff, D. R.; Armstrong, D. L.

    1982-02-01

    A system of computer programs and hardware was developed by the Instrumentation Branch of the Morgantown Energy Technology Center (METC) to provide data acquisition and control features for research projects at the site. The Automatic Data Acquisition and Control System (ADACS) has the capability of servicing up to eight individual projects simultaneously, providing data acquisition, data feedback, and process control where needed. Several novel software features - including a data table driven program, extensive feedback in real time, free format English commands, and high reliability - were incorporated to provide these functions.

  20. Analysis of space telescope data collection system

    Science.gov (United States)

    Ingles, F.; Schoggen, W. O.

    1980-01-01

    The effects of frame synchronization loss were analyzed. A frame sync loss causes loss of data for the frame in which it occurs (since one cannot know whether the preceding data were properly in sync) and, during the search for frame sync, the system continues to lose data. The search mode for reacquisition utilizes multiple search procedures.

  1. 78 FR 68816 - Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS...

    Science.gov (United States)

    2013-11-15

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS) Agreements AGENCY: National Oceanic and Atmospheric... National Oceanic and Atmospheric Administration (NOAA) operates two space-based data collection systems...

  2. 75 FR 59686 - Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS...

    Science.gov (United States)

    2010-09-28

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS) Agreements AGENCY: National Oceanic and Atmospheric... space-based data collection systems (DCS), the Geostationary Operational Environmental Satellite...

  3. Automatic testing system design and data analysis of permafrost temperature in Qinghai-Tibet Railway

    Institute of Scientific and Technical Information of China (English)

    尚迎春; 齐红元

    2008-01-01

    Given the influence of permafrost temperature on the safety of the Qinghai-Tibet Railway and the need for on-line monitoring, and drawing on permafrost studies in China and worldwide, an automatic permafrost-temperature testing system consisting of a master computer and several slave computers was designed. High-precision thermistors were chosen as temperature sensors, and the depth and spacing of the testing sections were designed and positioned; the slave computers measure, store and transmit permafrost temperature data on schedule, while the master computer receives, processes and analyzes the collected data, so that changes in permafrost temperature can be described and analyzed and information provided for permafrost railway engineering design. Moreover, taking permafrost-temperature testing in a certain section of the Qinghai-Tibet Railway as an example, the collected data were analyzed, the behavior of the permafrost under the railway was depicted, and a BP (back-propagation) model was set up to predict permafrost characteristics. This testing system will provide timely information about changes in the permafrost to support safe operation of the Qinghai-Tibet Railway.

  4. Sensor Systems Collect Critical Aerodynamics Data

    Science.gov (United States)

    2010-01-01

    With the support of Small Business Innovation Research (SBIR) contracts with Dryden Flight Research Center, Tao of Systems Integration Inc. developed sensors and other components that will ultimately form a first-of-its-kind, closed-loop system for detecting, measuring, and controlling aerodynamic forces and moments in flight. The Hampton, Virginia-based company commercialized three of the four planned components, which provide sensing solutions for customers such as Boeing, General Electric, and BMW and are used for applications such as improving wind turbine operation and optimizing air flow from air conditioning systems. The completed system may one day enable flexible-wing aircraft with flight capabilities like those of birds.

  5. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    International Nuclear Information System (INIS)

    This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described

  6. An Interchangeable Data Protocol for VMeS (Vessel Messaging System) and AIS (Automatic Identification System)

    Directory of Open Access Journals (Sweden)

    Farid Andhika

    2012-09-01

    VMeS (Vessel Messaging System) is a radio-based communication system for sending messages between VMeS terminals on ships at sea and a VMeS gateway on shore. Ship monitoring systems at sea generally use AIS (Automatic Identification System), which is deployed at ports to monitor ship status and prevent collisions between ships. In this research, a data format suitable for VMeS is designed so that it can be made interchangeable with AIS and thus read by AIS receivers; it is intended for ships of less than 30 GT (gross tonnage). The VMeS data format is designed in three types (position data, ship information data and short messages), which are made interchangeable with AIS message types 1, 4 and 8. Performance testing of the interchangeable system shows that as the message transmission period increases, the total delay increases but packet loss decreases. When sending messages every 5 seconds at speeds of 0-40 km/h, 96.67% of the data were received correctly. Data suffer packet loss when the received power level is below -112 dBm. The farthest distance reached by the modem under moving conditions was 530 meters, from the ITS Informatics building to Laboratory B406, at a received power level of -110 dBm.

  7. CARPS: An integrated proposal and data collection system

    CERN Document Server

    Brister, K

    2002-01-01

    Modern scripting languages and database tools combined provide a new framework for developing beam-line control and data management software. The CARPS system supports data collection by storing low level beam-line control commands in a database and playing these commands back to collect data sets. This system is combined with proposal and data management tools for support of both local and remote users.

  8. Volunteer-based distributed traffic data collection system

    DEFF Research Database (Denmark)

    Balachandran, Katheepan; Broberg, Jacob Honoré; Revsbech, Kasper;

    2010-01-01

    An architecture for a traffic data collection system is proposed, which can collect data without having access to a backbone network. Contrary to other monitoring systems it relies on volunteers to install a program on their own computers, which will capture incoming and outgoing packets, group them into flows and send the flow data to a central server. Data can be used for studying and characterising internet traffic and for testing traffic models by regenerating real traffic. The architecture is designed to have efficient and light usage of resources on both client and server sides. Worst...

  9. AUTOMATIC RECOGNITION OF PIPING SYSTEM FROM LARGE-SCALE TERRESTRIAL LASER SCAN DATA

    Directory of Open Access Journals (Sweden)

    K. Kawashima

    2012-09-01

    Recently, changes in plant equipment have become more frequent because of the short lifetime of products, and constructing 3D shape models of existing plants (as-built models) from large-scale laser-scanned data is expected to make their rebuilding processes more efficient. However, the laser-scanned data of an existing plant contain massive numbers of points, capture tangled objects and include a large amount of noise, so that manual reconstruction of a 3D model is very time-consuming and costly. Piping systems, in particular, account for the greatest proportion of plant equipment. Therefore, the purpose of this research was to propose an algorithm which can automatically recognize a piping system from terrestrial laser-scan data of plant equipment. The straight portions of pipes, the connecting parts and the connection relationships of the piping system are recognized by this algorithm. Eigenvalue analysis of the point clouds and of the normal vectors enables the recognition. Using only point clouds, the recognition algorithm can be applied to registered point clouds and performed in a fully automatic way. Preliminary recognition results for large-scale scanned data from an oil-rig plant have shown the effectiveness of the algorithm.
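
    One ingredient of such a pipeline is classifying each point's local neighbourhood from the eigenvalues of its covariance matrix; an analogous analysis of the normal vectors can then be used to isolate cylindrical (pipe) surfaces. A compact, hypothetical sketch of the first step (neighbourhood size and thresholds are assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree

def local_shape_labels(points, k=30, linear_thr=0.8, planar_thr=0.8):
    """points: (n, 3) laser-scan coordinates. Labels each point's neighbourhood
    as 'linear', 'planar' or 'scatter' from the covariance eigenvalues."""
    tree = cKDTree(points)
    labels = []
    for p in points:
        _, idx = tree.query(p, k=k)
        nbrs = points[idx] - points[idx].mean(axis=0)
        cov = nbrs.T @ nbrs / k
        # Eigenvalues sorted descending: l1 >= l2 >= l3
        l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
        linearity = (l1 - l2) / l1
        planarity = (l2 - l3) / l1
        if linearity > linear_thr:
            labels.append("linear")
        elif planarity > planar_thr:
            labels.append("planar")
        else:
            labels.append("scatter")
    return labels
```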

  10. Data Collection via Synthetic Aperture Radiometry towards Global System

    Directory of Open Access Journals (Sweden)

    Ali. A. J.Al-Sabbagh

    2015-10-01

    Nowadays it is widely accepted that remote sensing is an efficient component of a large-scale data management philosophy. In this paper, we present a future view of big-data collection by synthetic aperture radiometry, a passive microwave remote sensing technique, towards building a global monitoring system. Since the collected data may have little value in raw form, it is mandatory to analyse them in order to extract valuable and beneficial information. The data collected by synthetic aperture radiometry constitute high-resolution earth observations and pose data-intensive problems; meanwhile, Synthetic Aperture Radar is able to operate in several bands: X, C, S, L and P-band. An important role of synthetic aperture radiometry is collecting data from areas with inadequate network infrastructure, where the ground network facilities have been destroyed. The future concern is to establish a new global data management system, supported by groups of international teams working to develop technology based on international regulations. There is no doubt that existing techniques are too limited to solve big-data problems completely. Much work is ongoing towards improving 2-D and 3-D SAR to obtain better resolution.

  11. Calibration of Frequency Data Collection Systems Using Shortwave Radio Signals

    Science.gov (United States)

    Estler, Ron

    2000-09-01

    The atomic-clock-derived audio tones broadcast on the National Institute of Standards and Technology (NIST) shortwave station WWV are used to calibrate computer frequency data collection systems via Fast Fourier Transforms (FFT). Once calibrated, the data collection system can be used to accurately determine the audio signals used in several instructional physical chemistry laboratory experiments. This method can be applied to virtually any hardware-software configuration that allows adjustment of the apparent time scale (digitizing rate) of the recorded audio file.
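
    The calibration amounts to locating the apparent frequency of a known WWV audio tone in an FFT of the recorded signal and rescaling the nominal digitizing rate by the ratio of the true and apparent tone frequencies. A minimal sketch under that reading (the 500 Hz tone value and the function names are illustrative):

```python
import numpy as np

def apparent_peak_frequency(samples, assumed_rate_hz):
    """Locate the strongest spectral peak of a recorded reference tone."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / assumed_rate_hz)
    return freqs[np.argmax(spectrum)]

def calibrated_sample_rate(samples, assumed_rate_hz, true_tone_hz=500.0):
    """Correct the nominal digitizing rate using a known WWV audio tone frequency."""
    apparent = apparent_peak_frequency(samples, assumed_rate_hz)
    # If the tone appears at, e.g., 498 Hz instead of 500 Hz, the true sample
    # rate is higher than assumed by the same ratio.
    return assumed_rate_hz * true_tone_hz / apparent
```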

  12. A speech recognition system for data collection in precision agriculture

    Science.gov (United States)

    Dux, David Lee

    Agricultural producers have shown interest in collecting detailed, accurate, and meaningful field data through field scouting, but scouting is labor intensive. They use yield monitor attachments to collect weed and other field data while driving equipment. However, distractions from using a keyboard or buttons while driving can lead to driving errors or missed data points. At Purdue University, researchers have developed an ASR system to allow equipment operators to collect georeferenced data while keeping hands and eyes on the machine during harvesting and to ease georeferencing of data collected during scouting. A notebook computer retrieved locations from a GPS unit and displayed and stored data in Excel. A headset microphone with a single earphone collected spoken input while allowing the operator to hear outside sounds. One-, two-, or three-word commands activated appropriate VBA macros. Four speech recognition products were chosen based on hardware requirements and ability to add new terms. After training, speech recognition accuracy was 100% for Kurzweil VoicePlus and Verbex Listen for the 132 vocabulary words tested, during tests walking outdoors or driving an ATV. Scouting tests were performed by carrying the system in a backpack while walking in soybean fields. The system recorded a point or a series of points with each utterance. Boundaries of points showed problem areas in the field and single points marked rocks and field corners. Data were displayed as an Excel chart to show a real-time map as data were collected. The information was later displayed in a GIS over remote sensed field images. Field corners and areas of poor stand matched, with voice data explaining anomalies in the image. The system was tested during soybean harvest by using voice to locate weed patches. A harvester operator with little computer experience marked points by voice when the harvester entered and exited weed patches or areas with poor crop stand. The operator found the

  13. Automatic layout of ventilation systems by means of electronic data processing

    Energy Technology Data Exchange (ETDEWEB)

    Altena, H.; Priess, H.; Fries, E.; Hoffmann, G.

    1982-12-09

    A working group developed a method for the automatic layout of ventilation systems by means of electronic data processing. The purpose was to increase the information content of the ventilation plan and to obtain a useful tool for ventilation planning while reducing the effort required to elaborate ventilation plans. A program system was developed by means of which ventilation plans can be plotted in accordance with the regulations set by the mining authorities. The program system was applied for the first time at the Osterfeld mine. The plan is clearly organized, accurate, and easy to understand. This positive experience suggests that computer-aided plans should be more widely applied. The mining authorities support this view.

  14. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  15. Influence of vinasse on water movement in soil, using automatic acquisition and handling data system

    International Nuclear Information System (INIS)

    Vinasse, a by-product of the ethyl alcohol industry obtained from the yeast fermentation of sugar cane juice or molasses, has been incorporated into the soil as a fertilizer due to its high content of organic matter (2-6%), potassium and sulphate (0.1-0.5%) and other nutrients. Using the monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi), the influence of vinasse on water movement in the soil was studied. For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode and coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author)

  16. The data collection system for failure/maintenance at the Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    A data collection system for obtaining information which can be used to help determine the reliability and availability of future fusion power plants has been installed at the Los Alamos National Laboratory's Tritium Systems Test Assembly (TSTA). Failure and maintenance data on components of TSTA's tritium systems have been collected since 1984. The focus of the data collection has been TSTA's Tritium Waste Treatment System (TWT), which has maintained high availability since it became operational in 1982. Data collection is still in progress and a total of 291 failure reports are in the data collection system at this time, 47 of which are from the TWT. 6 refs., 2 figs., 2 tabs

  17. Automatic Classification of the Vestibulo-Ocular Reflex Nystagmus: Integration of Data Clustering and System Identification.

    Science.gov (United States)

    Ranjbaran, Mina; Smith, Heather L H; Galiana, Henrietta L

    2016-04-01

    The vestibulo-ocular reflex (VOR) plays an important role in our daily activities by enabling us to fixate on objects during head movements. Modeling and identification of the VOR improves our insight into the system behavior and improves diagnosis of various disorders. However, the switching nature of eye movements (nystagmus), including the VOR, makes dynamic analysis challenging. The first step in such analysis is to segment data into its subsystem responses (here slow and fast segment intervals). Misclassification of segments results in biased analysis of the system of interest. Here, we develop a novel three-step algorithm to classify the VOR data into slow and fast intervals automatically. The proposed algorithm is initialized using a K-means clustering method. The initial classification is then refined using system identification approaches and prediction error statistics. The performance of the algorithm is evaluated on simulated and experimental data. It is shown that the new algorithm performance is much improved over the previous methods, in terms of higher specificity. PMID:26357393
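
    The K-means initialization step can be illustrated with a short sketch that clusters samples of an eye-movement trace into fast and slow phases by velocity magnitude (the feature choice and parameters are assumptions for illustration, and the model-based refinement stage is omitted):

```python
import numpy as np
from sklearn.cluster import KMeans

def initial_fast_slow_segments(eye_position, fs_hz):
    """Label each sample of an eye-position trace as fast (quick phase) or slow phase.
    Uses eye-velocity magnitude as the clustering feature (illustrative choice)."""
    velocity = np.gradient(eye_position) * fs_hz          # deg/s from samples
    features = np.abs(velocity).reshape(-1, 1)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    # The cluster with the higher mean speed is taken as the fast-phase cluster.
    fast_cluster = int(np.argmax([features[labels == c].mean() for c in (0, 1)]))
    return labels == fast_cluster                          # True for fast-phase samples
```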

  18. Automatic Classification of the Vestibulo-Ocular Reflex Nystagmus: Integration of Data Clustering and System Identification.

    Science.gov (United States)

    Ranjbaran, Mina; Smith, Heather L H; Galiana, Henrietta L

    2016-04-01

    The vestibulo-ocular reflex (VOR) plays an important role in our daily activities by enabling us to fixate on objects during head movements. Modeling and identification of the VOR improves our insight into the system behavior and improves diagnosis of various disorders. However, the switching nature of eye movements (nystagmus), including the VOR, makes dynamic analysis challenging. The first step in such analysis is to segment data into its subsystem responses (here slow and fast segment intervals). Misclassification of segments results in biased analysis of the system of interest. Here, we develop a novel three-step algorithm to classify the VOR data into slow and fast intervals automatically. The proposed algorithm is initialized using a K-means clustering method. The initial classification is then refined using system identification approaches and prediction error statistics. The performance of the algorithm is evaluated on simulated and experimental data. It is shown that the new algorithm performance is much improved over the previous methods, in terms of higher specificity.

  19. Development of a teacher schedule automatic collection system based on Visual Basic

    Institute of Scientific and Technical Information of China (English)

    刘信香

    2012-01-01

    In this paper, according to practical needs, a teacher schedule automatic collection system is developed based on Visual Basic. The system can read schedule data automatically and, after collecting the data, automatically generate the overall schedule file. It can replace tedious, repetitive manual work, and is efficient and convenient.

  20. The Processing of Image Data Collected by Light UAV Systems for GIS Data Capture and Updating

    OpenAIRE

    N. Yastikli; I. Bagci; C. Beser

    2013-01-01

    The collection and updating of 3D data is the one of the important steps for GIS applications which require fast and efficient data collection methods. The photogrammetry has been used for many years as a data collection method for GIS application in larger areas. The Unmanned Aerial Vehicles (UAV) Systems gained increasing attraction in geosciences for cost effective data capture and updating at high spatial and temporal resolution during the last years. These autonomously flying UA...

  1. Parallel Plate System for Collecting Data Used to Determine Viscosity

    Science.gov (United States)

    Kaukler, William (Inventor); Ethridge, Edwin C. (Inventor)

    2013-01-01

    A parallel-plate system collects data used to determine viscosity. A first plate is coupled to a translator so that the first plate can be moved along a first direction. A second plate has a pendulum device coupled thereto such that the second plate is suspended above and parallel to the first plate. The pendulum device constrains movement of the second plate to a second direction that is aligned with the first direction and is substantially parallel thereto. A force measuring device is coupled to the second plate for measuring force along the second direction caused by movement of the second plate.

  2. Automatically Collecting and Monitoring Japanese Weblogs

    Science.gov (United States)

    Nanno, Tomoyuki; Suzuki, Yasuhiro; Fujiki, Toshiaki; Okumura, Manabu

    Weblogs (blogs) are now thought of as a potentially useful information source. Although the definition of blogs is not necessarily definite, it is generally understood that they are personal web pages authored by a single individual and made up of a sequence of dated entries of the author's thoughts, that are arranged chronologically. In Japan, since long before blog software became available, people have written `diaries' on the web. These web diaries are quite similar to blogs in their content, and people still write them without any blog software. As we will show, hand-edited blogs are quite numerous in Japan, though most people now think of blogs as pages usually published using one of the variants of public-domain blog software. Therefore, it is quite difficult to exhaustively collect Japanese blogs, i.e., collect blogs made with blog software and web diaries written as normal web pages. With this as the motivation for our work, we present a system that tries to automatically collect and monitor Japanese blog collections that include not only ones made with blog software but also ones written as normal web pages. Our approach is based on extraction of date expressions and analysis of HTML documents, to avoid having to depend on specific blog software, RSS, or the ping server.

  3. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    Science.gov (United States)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be automatically detected. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.

  4. Cyber security and data collection approaches for smartphone sensor systems

    Science.gov (United States)

    Turner, Hamilton; White, Jules

    2012-06-01

    In recent years the ubiquity and resources provided by smartphone devices have encouraged scientists to explore using these devices as remote sensing nodes. In addition, the United States Department of Defense has stated a mission of increasing persistent intelligence, surveillance, and reconnaissance capabilities for U.S. units. This paper presents a method of enabling large-scale, long-term smartphone-powered data collection. Key solutions discussed include the ability to let domain experts directly define and refine smartphone applications for data collection, technical advancements that allow rapid dissemination of a smartphone data collection application, and an algorithm for preserving the locational privacy of participating users.

  5. Data mining process automatization of air pollution data by the LISp-Miner system

    OpenAIRE

    Ochodnická, Zuzana

    2014-01-01

    This thesis focuses on the area of automated data mining. Its aims are to describe the area of automated data mining, to design a process for the automated creation of data-mining tasks that verify given domain knowledge and search for new knowledge, and to implement the verification of given domain knowledge of the attribute-dependency (influence) type with search-space adjustments. The implementation language is LMCL, which enables usage of the LISp-Miner system.

  6. GOES data-collection system instrumentation, installation, and maintenance manual

    Science.gov (United States)

    Blee, J.W.; Herlong, H.E.; Kaufmann, C.D., Jr.; Hardee, J.H.; Field, M.L.; Middelburg, R.F.

    1986-01-01

    The purpose of the manual is to describe the installation, operation, and maintenance of Geostationary Operational Environmental Satellite (GOES) data collection platforms (DCP's) and associated equipment. This manual is not a substitute for DCP manufacturers ' manuals but is additional material that describes the application of data-collection platforms in the Water Resources Division. Power supplies, encoders, antennas, Mini Monitors, voltage analog devices, and the installation of these at streamflow-gaging stations are discussed in detail. (USGS)

  7. The Processing of Image Data Collected by Light UAV Systems for GIS Data Capture and Updating

    Science.gov (United States)

    Yastikli, N.; Bagci, I.; Beser, C.

    2013-10-01

    The collection and updating of 3D data is one of the important steps for GIS applications, which require fast and efficient data collection methods. Photogrammetry has been used for many years as a data collection method for GIS applications over larger areas. Unmanned Aerial Vehicle (UAV) systems have gained increasing attention in the geosciences in recent years for cost-effective data capture and updating at high spatial and temporal resolution. These autonomously flying UAV systems are usually equipped with different sensors such as a GPS receiver, microcomputers, gyroscopes and miniaturized sensor systems for navigation, positioning and mapping purposes. UAV systems can be used to collect data for digital elevation model (DEM) and orthoimage generation in GIS applications over small areas. In this study, data collection and processing by a light UAV system are evaluated for GIS data capture and updating in small areas where traditional photogrammetry is not feasible. The main aim of this study is to design a low-cost light UAV system for GIS data capture and updating. The investigation was based on the aerial images recorded during flights performed with the UAV system over the test site on the Davutpasa Campus of Yildiz Technical University, Istanbul. The quality of the DEM and orthoimages generated from the UAV flights is discussed for GIS data capture and updating in small areas.

  8. Collection and evaluation of salt mixing data with the real time data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, S.; Chiu, C.; Todreas, N.E.

    1977-09-01

    A minicomputer based real time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady state and transient monitoring and recording of up to 512 individual electrical resistance probes. Extensive real time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system, and includes circuit diagrams of all custom built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  9. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two data-sets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). ... In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable...
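
    For readers unfamiliar with kriging, the sketch below shows the core interpolation step in its simplest, ordinary-kriging form; the paper's universal kriging additionally models a spatial trend, and its covariance/variogram model is fitted to the data rather than assumed:

```python
import numpy as np

def gaussian_cov(h, sill=1.0, rng=50.0):
    """Simple covariance model (illustrative; in practice fitted to the data)."""
    return sill * np.exp(-(h / rng) ** 2)

def ordinary_kriging(xy_obs, z_obs, xy_grid, cov=gaussian_cov):
    """Predict values on xy_grid from observations (xy_obs, z_obs).
    Plain ordinary-kriging sketch; universal kriging would add trend terms."""
    n = len(z_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = cov(d_obs)
    K[n, :n] = K[:n, n] = 1.0        # Lagrange terms enforcing unbiasedness
    K[n, n] = 0.0
    preds = []
    for p in xy_grid:
        d = np.linalg.norm(xy_obs - p, axis=1)
        rhs = np.append(cov(d), 1.0)
        weights = np.linalg.solve(K, rhs)[:n]
        preds.append(weights @ z_obs)
    return np.array(preds)
```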

  10. JAPS: an automatic parallelizing system based on JAVA

    Institute of Scientific and Technical Information of China (English)

    杜建成; 陈道蓄; 谢立

    1999-01-01

    JAPS is an automatic parallelizing system based on Java running on NOW. It implements the automatic process from dependence analysis to parallel execution. The current version of JAPS can exploit functional parallelism; the detection of data parallelism will be incorporated in the new version, which is underway. The framework and key techniques of JAPS are presented. Specific topics discussed include task partitioning, summary information collection, data dependence analysis, pre-scheduling and dynamic scheduling.

  11. Data Collection and Cost Modeling for Library Circulation Systems.

    Science.gov (United States)

    Bourne, Charles P.

    The objectives of the study leading to this report were to review, analyze and summarize published library cost data; and to develop a cost model and a methodology for reporting data in a more consistent and useful way. The cost model and reporting procedure were developed and tested on the circulation system of three libraries: a large university…

  12. Designing a Method for AN Automatic Earthquake Intensities Calculation System Based on Data Mining and On-Line Polls

    Science.gov (United States)

    Liendo Sanchez, A. K.; Rojas, R.

    2013-05-01

    Seismic intensities can be calculated using the Modified Mercalli Intensity (MMI) scale or the European Macroseismic Scale (EMS-98), among others, which are based on a series of qualitative aspects related to a group of subjective factors that describe human perception, effects on nature or objects, and structural damage due to the occurrence of an earthquake. On-line polls allow experts to get an overview of the consequences of an earthquake without going to the affected locations. However, this can be hard work if the polls are not properly automated. Taking into account that the answers given to these polls are subjective and that a number of them have already been classified for some past earthquakes, it is possible to use data mining techniques in order to automate this process and to obtain preliminary results based on the on-line polls. To achieve this goal, a predictive model has been used, based on a classifier built with supervised learning techniques, namely a decision tree algorithm, and a group of polls based on the MMI and EMS-98 scales. It summarizes the most important questions of the poll and recursively divides the instance space corresponding to each question (nodes), with each node splitting the space depending on the possible answers. The implementation was done with Weka, a collection of machine learning algorithms for data mining tasks, using the J48 algorithm, an implementation of the C4.5 algorithm for decision tree models. In this way it was possible to obtain a preliminary model able to identify up to 4 different seismic intensities, with 73% of polls correctly classified. The error obtained is rather high; therefore, we will update the on-line poll in order to improve the results, based on just one scale, for instance the MMI. In addition, the integration of an automatic seismic intensity methodology with a low error probability and a basic georeferencing system will allow the generation of preliminary isoseismal maps
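
    The classification step can be reproduced in spirit with any decision-tree learner; the sketch below uses scikit-learn's CART-style tree as a stand-in for Weka's J48 (the feature encoding and names are invented for illustration):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

def train_intensity_classifier(answers, intensities):
    """answers: (n_polls, n_questions) encoded poll answers
    (e.g. felt_indoors, objects_fell, structural_damage);
    intensities: (n_polls,) intensity classes assigned by analysts."""
    X_train, X_test, y_train, y_test = train_test_split(
        answers, intensities, test_size=0.3, random_state=0)
    tree = DecisionTreeClassifier(criterion="entropy", min_samples_leaf=5)
    tree.fit(X_train, y_train)
    accuracy = tree.score(X_test, y_test)   # fraction of correctly classified polls
    return tree, accuracy
```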

  13. Definition of an automatic information retrieval system independent from the data base used

    International Nuclear Information System (INIS)

    A bibliographic information retrieval system using data stored in the standardized interchange format ISO 2709 or ANSI Z39.2 is specified. A set of commands for interchange format manipulation, which allows data access at the logical level and achieves data independence, is used. A data base description language, a storage structure and data base manipulation commands are specified, using retrieval techniques which consider the application's needs. (Author)

  14. AUTOMATIC DESIGNING OF POWER SUPPLY SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. I. Kirspou

    2016-01-01

    Full Text Available The development of an automatic design system for the power supply of industrial enterprises is considered in the paper. Its complete structure and principle of operation are determined and established. A modern graphical interface and data scheme are developed, and the software is fully realized. The methodology and software correspond to the requirements of up-to-date design practice, describe a general algorithm of the program process and also reveal the properties of the design system's objects. The system is based on a modular principle and uses object-oriented programming. It makes it possible to carry out design calculations of a power supply system consistently and to select the required equipment, with subsequent output of all calculations in the form of an explanatory note. The system can be applied by design organizations under conditions of actual design work.

  15. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  16. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  17. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  18. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  19. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; Vries, A.P. de; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive manua

  20. A portable wireless data collection system by using optical power supply and photo-communication

    International Nuclear Information System (INIS)

    Aiming at effective use in the year-to-year trend management of patrol inspection data, a portable wireless measuring and data collection device was developed that can automatically measure vibration, temperature and other quantities in a short time while inspectors make their rounds, and can collect sensor signals at many places so that field data are gathered in electronic form. The device consists of a sensor head, mounted on the target apparatus, that transmits sensor signals, and a sensor terminal carried by the inspector with functions to receive and store the signals from the sensor head. Its distinctive feature is fully wireless data collection using optical power supply and optical communication, in which the power supply to the sensor head and the transmission and reception of data are all performed optically. As a result, several characteristics could be realized: completely wireless data collection with a reduced maintenance burden, since no power source or signal wiring needs to be installed; the possibility of collecting data in a short time from a distant place; the possibility of higher-order processing, because native waveform signals are obtained rather than only conventional numerical data; and the possibility of developing apparatus diagnosis functions such as the detection of early signs of abnormality. (G.K.)

  1. Feasibility study for adding a demand failure data collection system to the Nuclear Plant Reliability Data System. Final report

    International Nuclear Information System (INIS)

    Southwest Research Institute (SwRI) is pleased to submit to Sandia National Laboratories this technical report as fulfillment of Task 5 of the proposal entitled A Feasibility Study for Adding a Duty Cycle Data Collection System to the Nuclear Plant Reliability Data System. The purpose of this report is to summarize the work as delineated in the proposal tasks and to recommend follow-on activities. Technical support for this work was provided by Duke Power Company (Duke), subcontractor to SwRI. The four tasks to be performed in conjunction with the Duty Cycle Data Collection Study (renamed in this report Demand Data Collection) were: define component population and measurable parameters; develop data collection and assessment methodologies; assess the impact on utilities; and assess the impact on NPRDS

  2. Making sense of sensor data : detecting clinical mastitis in automatic milking systems

    NARCIS (Netherlands)

    Kamphuis, C.

    2010-01-01

    Farmers milking dairy cows are obliged to exclude milk with abnormal homogeneity or color for human consumption (e.g., Regulation (EC) No 853/2004), where most abnormal milk is caused by clinical mastitis (CM). With automatic milking (AM), farmers are no longer physically present during the milking

  3. Progress on Statistical Learning Systems as Data Mining Tools for the Creation of Automatic Databases in Fusion Environments

    International Nuclear Information System (INIS)

    Fusion devices produce tens of thousands of discharges but only a very limited part of the collected information is analysed. The analysis of physical events requires their identification and temporal location and the generation of specialized databases in relation to these time instants. The automatic determination of precise time instants in which events happen and the automatic search for potential relevant time intervals could be made thanks to classification techniques and regression techniques. Classification and regression techniques have been used for the automatic creation of specialized databases for JET and have allowed the automatic determination of disruptive / non-disruptive character of discharges. The validation of the recognition method has been carried out with 4400 JET discharges and the global success rate has been 99.02 per cent

  4. Real time Aanderaa current meter data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.

    Aanderaa current meters are widely used for recording the current speed and four other such parameters by deploying them over extended periods of time. Normally data are recorded on magnetic tape and after recovery of current meters, data are read...

  5. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by large helical device (LHD) experiments are supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist amongst the data; i.e., in many cases, the calculation of one data item requires other data. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data are registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and the number of such programs can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
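
    The dependency handling that AutoAna automates can be pictured with a minimal sketch; the quantity names and dependency graph below are invented for illustration and do not reflect the actual LHD registration.

```python
# Minimal sketch, assuming a made-up dependency graph: a request for an
# unregistered quantity first triggers calculation of everything it depends on.
dependencies = {
    "te_profile":    {"thomson_raw"},
    "ne_profile":    {"thomson_raw", "interferometer"},
    "pressure":      {"te_profile", "ne_profile"},
    "stored_energy": {"pressure"},
}

registered = {"thomson_raw", "interferometer"}    # already available on the server

def calculate(name: str) -> None:
    print("calculating and registering", name)     # placeholder for the real program

def ensure(quantity: str) -> None:
    """Calculate `quantity`, recursively ensuring its dependencies first."""
    if quantity in registered:
        return
    for dep in dependencies.get(quantity, ()):
        ensure(dep)
    calculate(quantity)
    registered.add(quantity)

ensure("stored_energy")   # calculates te_profile, ne_profile, pressure, stored_energy
```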

  6. XML-Based Automatic Test Data Generation

    OpenAIRE

    Halil Ibrahim Bulbul; Turgut Bakir

    2012-01-01

    Software engineering aims at increasing quality and reliability while decreasing the cost of the software. Testing is one of the most time-consuming phases of the software development lifecycle. Improvement in software testing results in decrease in cost and increase in quality of the software. Automation in software testing is one of the most popular ways of software cost reduction and reliability improvement. In our work we propose a system called XML-based automatic test data generation th...

  7. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.
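
    A rough analogue of AutoClass's central idea, choosing the number of classes probabilistically rather than fixing it in advance, can be sketched with a finite mixture model; the snippet below is not the AutoClass code and uses BIC as a stand-in for the full Bayesian model comparison.

```python
# Sketch under stated assumptions: fit Gaussian mixtures with different numbers of
# classes to synthetic data and keep the model complexity favoured by BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (100, 2)),     # synthetic records drawn from
                  rng.normal(5, 1, (100, 2))])    # two underlying classes

best_k, best_bic, best_model = None, np.inf, None
for k in range(1, 7):
    model = GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(data)
    bic = model.bic(data)
    if bic < best_bic:
        best_k, best_bic, best_model = k, bic, model

print("most probable number of classes:", best_k)
labels = best_model.predict(data)                 # class membership for each record
```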

  8. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated, a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network), and finally production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, the infrastructure to store (up to 40 Tb between results and intermediate files) and process the data using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months, as well as software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and, finally, the management of human resources. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
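
    The hydrological criterion mentioned above (a flow-accumulation river network) can be illustrated on a toy grid; the production system relied on commercial tooling at national scale, so the D8 sketch below is only a conceptual stand-in with an invented DEM and threshold.

```python
# Conceptual sketch: D8 flow accumulation on a tiny hydrological terrain model;
# cells whose accumulation exceeds a threshold form the candidate river network.
import numpy as np

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])               # toy elevation grid (2 m cells assumed)

rows, cols = dem.shape
offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def downstream(r, c):
    """Steepest-descent (D8) neighbour of cell (r, c), or None for a pit/outlet."""
    best, best_drop = None, 0.0
    for dr, dc in offsets:
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols:
            drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
            if drop > best_drop:
                best, best_drop = (rr, cc), drop
    return best

accumulation = np.ones_like(dem)                # every cell contributes itself
# Visit cells from highest to lowest so each contributor is routed exactly once.
for _, r, c in sorted(((dem[r, c], r, c) for r in range(rows) for c in range(cols)),
                      reverse=True):
    target = downstream(r, c)
    if target is not None:
        accumulation[target] += accumulation[r, c]

river_cells = accumulation >= 4.0               # invented threshold for illustration
print(accumulation)
```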

  9. Computer Science technology applied to data collection and data managament

    OpenAIRE

    Guzmán, O.; González, M.; Carrasco, J. (Juan); Bernal, C.; Vera, C.; Troncoso, M.

    2009-01-01

    IFOP, as a non-profit marine research institute, has the mission to provide the Under Secretariat of Fisheries in Chile with the technical information and scientific basis for the regulation of Chilean fisheries. For this purpose it has 150 scientific observers distributed along the Chilean coast. With the intention of improving the data production process, a group of scientists has developed a new computer science system for data collection, data management, and automatic publication of f...

  10. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-02-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent

  11. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-06-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent

  12. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Science.gov (United States)

    Boden, T. A.; Krassovski, M.; Yang, B.

    2013-06-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and a FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent network database

  13. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón, and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: the meter had to be consulted physically, the information imported into Infor EAM by hand, and the errors that can occur when doing all of this manually had to be detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  14. Data acquisition system for TRIGA Mark I nuclear reactor and a proposal for its automatic operation

    International Nuclear Information System (INIS)

    The TRIGA IPR-R1 Nuclear Research Reactor, located at the Nuclear Technology Development Center (CDTN/CNEN) in Belo Horizonte, Brazil, has been in operation for 44 years. During these years the main operational parameters were monitored by analog recorders and counters located in the reactor control console, and the most important operational parameters and data were registered by the reactor operators in the reactor logbook. This process is quite useful, but it can involve human errors. It is also impossible for the operators to take note of all process variables, mainly during fast power transient operations. A PC-based data acquisition system was developed for the reactor that allows on-line monitoring, through graphic interfaces, and shows the evolution of operational parameters to the operators. Some parameters that were never measured on line before, like the thermal power and the coolant flow rate in the primary loop, are now monitored on the computer video monitor. The developed system allows all parameters to be measured at a frequency of up to 1 kHz. These data are also recorded in text files available for consultation and analysis. (author)

  15. Special Data Collection System (SDCS) NTS Event 'Bulkhead', 27 April 1977. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Naylor, G.R.; Dawkins, M.S.

    1978-12-18

    This event report contains seismic data from the Special Data Collection System (SDCS), and other sources for the 'Bulkhead' event. The report also contains epicenter information from seismic observations.

  16. Overview of the SOFIA Data Processing System: A generalized system for manual and automatic data processing at the SOFIA Science Center

    Science.gov (United States)

    Shuping, Ralph; Krzaczek, Robert; Vacca, William D.; Charcos-Llorens, Miguel; Reach, William T.; Alles, Rosemary; Clarke, Melanie; Melchiorri, Riccardo; Radomski, James T.; Shenoy, Sachindev S.; Sandel, David; Omelian, Eric

    2015-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprised of a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. SOFIA is designed to execute observations at altitudes between 37,000 and 45,000 feet, above 99% of atmospheric water vapor. During routine operations, several instruments will be available to the astronomical community including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. Once this post-processing is complete, the data can be used in scientific analysis and publications. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both automatic ("pipeline") and manual modes to process data from a variety of instruments. In this poster paper, we present an overview of the DPS concepts and architecture, as well as operational results from the first two SOFIA observing cycles (2013-2014).

  17. Automatic subject classification of textual documents using limited or no training data

    OpenAIRE

    Joorabchi, Arash

    2010-01-01

    With the explosive growth in the number of electronic documents available on the internet, intranets, and digital libraries, there is a growing need, more than ever, for automatic systems capable of indexing and organising such large volumes of data. Automatic Text Classification (ATC) has become one of the principal means for enhancing the performance of information retrieval systems and organising digital libraries and other textual collections. Within this context, the use of ...

  18. Progress on Statistical Learning Systems as Data Mining Tools for the Creation of Automatic Databases in Fusion Environments

    International Nuclear Information System (INIS)

    The term 'Statistical Learning Systems' represents a wide set of methods to tackle specific problems related to classification (estimation of class decision boundaries), regression (determination of an unknown continuous function from noisy samples), and probability density estimation. Here, recent developments of learning systems in Fusion are reviewed. They have been focused on classification and regression problems as a specific way of creating ad-hoc databases of physical events. Classification and regression techniques have been used to determine the exact time instants in which events happen. In this way, databases around these times can be generated. Input data can be time-series data but also video-movies. Occasionally, the massive amount of data to be managed simultaneously forces the use of parallel computing techniques. Hybrid classification methods combining Support Vector Machines (SVM) and Bayesian statistics have been applied in JET for the automatic estimation of transition times between L/H and H/L confinement regimes. A universal regressor model based on SVM regression methods has been developed for the exact location of individual events. It has been applied to the JET database to determine the time instants of disruptions, ELMs and sawteeth activity. In addition, this regressor has been used with video-films to determine relevant temporal segments in JET discharges. Parallel codes have been put into operation to implement classification systems. The methods will be discussed and detailed examples will be given. This document is composed of an abstract followed by the presentation transparencies. (authors)
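
    One way to picture the event-location idea described above is a classifier applied to sliding windows of a signal; the sketch below is a hedged illustration with a synthetic trace and an SVM from scikit-learn, not the hybrid SVM/Bayesian method actually used at JET.

```python
# Illustrative sketch only: label short windows of a synthetic signal as "before"
# or "after" a regime change and report the first window classified as "after".
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 2000)
signal = np.where(t < 6.0, 1.0, 3.0) + 0.2 * rng.standard_normal(t.size)  # step at 6 s

win = 50
def features(i):
    w = signal[i:i + win]
    return [w.mean(), w.std()]

# Train on windows taken far from the (assumed known) transition of a reference trace.
X_train = [features(i) for i in list(range(0, 400, 10)) + list(range(1500, 1900, 10))]
y_train = [0] * 40 + [1] * 40
clf = SVC(kernel="rbf").fit(X_train, y_train)

# Scan the trace and report the first window assigned to the new regime.
for i in range(0, t.size - win, 10):
    if clf.predict([features(i)])[0] == 1:
        print(f"estimated transition time: {t[i]:.2f} s")
        break
```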

  19. Curating Virtual Data Collections

    Science.gov (United States)

    Lynnes, C.; Ramapriyan, H.; Leon, A.; Tsontos, V. M.; Liu, Z.; Shie, C. L.

    2015-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) contains a rich set of datasets and related services throughout its many elements. As a result, locating all the EOSDIS data and related resources relevant to a particular science theme can be daunting. This is largely because EOSDIS data's organizing principle is shaped more by the way the data are produced than by the expected end use. Virtual collections oriented around science themes can overcome this by presenting collections of data and related resources that are organized around the user's interest, not around the way the data were produced. Science themes can be: specific applications (uses) of the data (e.g., landslide prediction); geophysical events (e.g., Hurricane Sandy); or a specific science research problem. Virtual collections consist of annotated web addresses (URLs) that point to data and related resource addresses, thus avoiding the need to copy all of the relevant data to a single place. These URL addresses can be consumed by a variety of clients, ranging from basic URL downloaders (wget, curl) and web browsers to sophisticated data analysis programs such as the Integrated Data Viewer. Eligible resources include anything accessible via URL: data files (data file URLs); data subsets (OPeNDAP, webification or Web Coverage Service URLs); data visualizations (Web Map Service); data search results (OpenSearch Atom response); and custom analysis workflows (e.g., Giovanni analysis URL).

  20. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    Science.gov (United States)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more applicable to open pit mine slope safety monitoring. The Daye Iron Mine open pit high-steep slope automatic monitoring system mainly consists of three modules, namely a GPS data processing module, a monitoring and warning module, and an emergency plans module. According to the rock mass structural features and the slope stability evaluation, seven GPS deformation monitoring points were arranged altogether along Fault F9 at Daye Iron Mine, observation is carried out with a combination of single-frequency static GPS receivers and data-transmission radio, and the data processing mainly uses a three-transect interpolation method to solve the problems of discontinuity and effectiveness in the data series. Based on the displacement monitoring data from 1990 to 1996 for Landslide A2 at Shizi Mountain in the Daye Iron Mine East Open Pit, the displacement criterion, rate criterion, acceleration criterion, creep-curve tangent angle criterion and other criteria of landslide failure were studied; the results show that Landslide A2 is a collapse-type rock landslide whose movement develops in three phases, namely the creep stage, the accelerated stage and the failure stage. The failure criteria differ between stages and between positions at the rear, central and front margins of the landslide. Putting forward a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation behaviour with macroscopic evidence, has important guiding significance.
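
    The displacement-based criteria listed above can be sketched numerically; the series, thresholds and stage labels below are invented for illustration and are not the values used at Daye Iron Mine.

```python
# Hypothetical sketch: derive rate, acceleration and creep-curve tangent angle from
# a cumulative displacement series and flag monitoring stages with invented thresholds.
import numpy as np

days = np.arange(0.0, 120.0, 5.0)                     # observation epochs (days)
displacement = 0.005 * days ** 2 + 0.2 * days         # synthetic cumulative displacement (mm)

rate = np.gradient(displacement, days)                 # mm/day
acceleration = np.gradient(rate, days)                  # mm/day^2
tangent_angle = np.degrees(np.arctan(rate))             # crude tangent-angle criterion

RATE_WARN, ACC_WARN = 1.0, 0.005                        # illustrative thresholds only
for d, r, a, ang in zip(days, rate, acceleration, tangent_angle):
    if r > RATE_WARN and a > ACC_WARN:
        stage = "accelerated (failure watch)"
    elif r > RATE_WARN:
        stage = "accelerated"
    else:
        stage = "creep"
    print(f"day {d:5.1f}: rate {r:5.2f} mm/d, acc {a:6.3f} mm/d^2, angle {ang:4.1f} deg -> {stage}")
```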

  1. Automatic multi-modal intelligent seizure acquisition (MISA) system for detection of motor seizures from electromyographic data and motion data

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sándor; Wolf, Peter;

    2012-01-01

    The objective is to develop a non-invasive automatic method for detection of epileptic seizures with motor manifestations. Ten healthy subjects who simulated seizures and one patient participated in the study. Surface electromyography (sEMG) and motion sensor features were extracted as energy...

  2. A novel electronic data collection system for large-scale surveys of neglected tropical diseases.

    Directory of Open Access Journals (Sweden)

    Jonathan D King

    Full Text Available BACKGROUND: Large cross-sectional household surveys are common for measuring indicators of neglected tropical disease control programs. As an alternative to standard paper-based data collection, we utilized novel paperless technology to collect data electronically from over 12,000 households in Ethiopia. METHODOLOGY: We conducted a needs assessment to design an Android-based electronic data collection and management system. We then evaluated the system by reporting results of a pilot trial and from comparisons of two large-scale surveys: one with traditional paper questionnaires and the other with tablet computers, including accuracy, person-time days, and costs incurred. PRINCIPAL FINDINGS: The electronic data collection system met core functions in household surveys and overcame constraints identified in the needs assessment. Pilot data recorders took 264 sec (standard deviation (SD) 152 sec) and 260 sec (SD 122 sec) per person registered to complete household surveys using paper and tablets, respectively (P = 0.77). Data recorders felt a lack of connection with the interviewee during the first days using electronic devices, but preferred to collect data electronically in future surveys. Electronic data collection saved time by giving results immediately, obviating the need for double data entry and cross-correcting. The proportion of identified data entry errors in disease classification did not differ between the two data collection methods. Geographic coordinates collected using the tablets were more accurate than coordinates transcribed on a paper form. The cost of the equipment required for electronic data collection was approximately the same as the cost incurred for data entry of questionnaires, whereas repeated use of the electronic equipment may increase cost savings. CONCLUSIONS/SIGNIFICANCE: Conducting a needs assessment and pilot testing allowed the design to specifically match the functionality required for surveys. Electronic data collection

  3. 77 FR 39985 - Information Collection; Forest Industries and Residential Fuelwood and Post Data Collection Systems

    Science.gov (United States)

    2012-07-06

    ... Resources Planning Act of 1974 and the Forest and Rangeland Renewable Resources Research Act of 1978 require... addressed to: USDA, Forest Service, Attn: Ronald Piva, Northern Research Station, Forest Inventory and... Forest Service Information Collection; Forest Industries and Residential Fuelwood and Post...

  4. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...
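
    The rebalancing idea can be sketched in a few lines; the storage element names, sizes and placement rule below are invented, and the real extension additionally enforces distribution policies and guarantees data availability.

```python
# Conceptual sketch only: flag storage elements above a usage threshold and propose
# moving their largest datasets to the least-loaded eligible element.
from dataclasses import dataclass

@dataclass
class StorageElement:
    name: str
    capacity_tb: float
    used_tb: float
    def usage(self) -> float:
        return self.used_tb / self.capacity_tb

elements = [
    StorageElement("SITE_A_DATADISK", 1000, 950),
    StorageElement("SITE_B_DATADISK", 800, 400),
    StorageElement("SITE_C_DATADISK", 1200, 600),
]
datasets_on = {"SITE_A_DATADISK": [("data15_13TeV.periodX", 120.0), ("mc15_13TeV.sampleY", 60.0)]}

THRESHOLD = 0.90
for src in [e for e in elements if e.usage() > THRESHOLD]:
    for name, size_tb in sorted(datasets_on.get(src.name, []), key=lambda d: -d[1]):
        if src.usage() <= THRESHOLD:
            break                                    # source is healthy again
        dst = min((e for e in elements if e is not src and e.usage() < THRESHOLD),
                  key=lambda e: e.usage(), default=None)
        if dst is None:
            break                                    # nowhere suitable to move data
        print(f"move {name} ({size_tb} TB) from {src.name} to {dst.name}")
        src.used_tb -= size_tb
        dst.used_tb += size_tb
```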

  5. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemes, accounting for geometric and semantic data, has been largely motivated by the advances of the last decade in mobile laser scanning technology; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (cityGML) and mobile mapping technology to provide the features that compose the city model. The paper focuses on traffic signs, discussing their characterization using cityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the task of automatic detection and classification.

  6. Recent advances in the automatic collection of animal behavior and physiology

    Science.gov (United States)

    The true assessment of an animal's state of being depends on the collection of refined, repeatable data, free from influence of the collection method or observer bias. The last few years have seen significant technical advances in automatic data collection pertaining to the animal and its environmen...

  7. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine the reactor vessel welds. In order to examine nuclear vessel welds including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be able to be processed on line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, the ultrasonic signal processing system was designed, and ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  8. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of the human face is presented. The system developed performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first one assumes the verification of personal data entered by the visitor at a console using a keyboard or card reader. The system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass; only the person's biometrics are used to identify the visitor. The recognition system automatically finds the necessary identification information previously stored in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images input to the computer from cameras, the models can use voice data and some physical characteristics of the person, such as height measured by the imaging system.

  9. Automatic Guidance System for Welding Torches

    Science.gov (United States)

    Smith, H.; Wall, W.; Burns, M. R., Jr.

    1984-01-01

    Digital system automatically guides welding torch to produce square-butt, V-groove and lap-joint weldments within tracking accuracy of ±0.2 millimeter. Television camera observes and traverses weld joint, carrying welding torch behind. Image of joint digitized, and resulting data used to derive control signals that enable torch to track joint.

  10. Learning Diagnostic Diagrams in Transport-Based Data-Collection Systems

    DEFF Research Database (Denmark)

    Tran, Vu The; Eklund, Peter; Cook, Chris

    2014-01-01

    Insights about service improvement in a transit network can be gained by studying transit service reliability. In this paper, a general procedure for constructing a transit service reliability diagnostic (Tsrd) diagram based on a Bayesian network is proposed to automatically build a behavioural model from Automatic Vehicle Location (AVL) and Automatic Passenger Counters (APC) data. Our purpose is to discover the variability of transit service attributes and their effects on traveller behaviour. A Tsrd diagram describes and helps to analyse factors affecting public transport by combining domain knowledge with statistical data.
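
    The statistical side of such a diagram can be illustrated with a toy conditional probability estimate; the column names and records below are hypothetical, and the paper builds a full Bayesian network rather than a single table.

```python
# Toy illustration: estimate P(on_time | headway_state) from AVL/APC-style records.
import pandas as pd

records = pd.DataFrame({
    "headway_state": ["regular", "bunched", "regular", "bunched", "gapped", "regular"],
    "on_time":       [True, False, True, False, False, True],
})

cpt = (records.groupby("headway_state")["on_time"]
              .mean()
              .rename("P(on_time | headway_state)"))
print(cpt)
```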

  11. 77 FR 5058 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Automatic...

    Science.gov (United States)

    2012-02-01

    ... Federal Register on September 20, 2011 (76 FR 58301). Interested parties are encouraged to send comments...; Automatic Fire Sensor and Warning Devices Systems; Examination and Test Requirements ACTION: Notice. SUMMARY...) ] sponsored information collection request (ICR) titled, ``Automatic Fire Sensor and Warning Devices...

  12. System for Control,Data Collection and Processing in 8 mm Portable Microwave Radiometer—Scatterometer

    Institute of Scientific and Technical Information of China (English)

    李毅; 方振和; 等

    2002-01-01

    In this paper we describe a system used to control, collect and process data in an 8 mm portable microwave radiometer-scatterometer. We focus on the hardware and software design of the system, which is based on a PIC16F874 chip. The system has been successfully used in an 8 mm portable microwave radiometer-scatterometer; compared with other similar systems, its modularization, miniaturization and intelligence are improved so as to meet portable instrument requirements.

  13. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair B.

    2012-01-01

    This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. The report provides a rationale for why this data should be collected, a list of the data needed to support reliability and availability analysis, and specific data recommendations for a Computerized Maintenance Management System (CMMS) to support automated analysis. This data collection recommendations report was written by Sandia National Laboratories to address the general data requirements for reliability analysis of operating wind turbines. This report is intended to help develop a basic understanding of the data needed for reliability analysis from a Computerized Maintenance Management System (CMMS) and other data systems. The report provides a rationale for why this data should be collected, a list of the data needed to support reliability and availability analysis, and specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and analysis and reporting needs. The 'Motivation' section of this report provides a rationale for collecting and analyzing field data for reliability analysis. The benefits of this type of effort can include increased energy delivered, decreased operating costs, enhanced preventive maintenance schedules, solutions to issues with the largest payback, and identification of early failure indicators.
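
    The kind of automated analysis such CMMS data should enable can be sketched briefly; the work-order fields and figures below are invented, and the metrics shown (failure counts, MTBF, availability) are only two of the indicators discussed in the report.

```python
# Hypothetical sketch: per-turbine failure count, MTBF and availability computed
# from invented work-order records over a one-year analysis window.
import pandas as pd

work_orders = pd.DataFrame({
    "turbine":        ["T01", "T01", "T02", "T02", "T02"],
    "failure_start":  pd.to_datetime(["2011-01-10", "2011-03-02", "2011-02-01",
                                      "2011-05-15", "2011-09-20"]),
    "downtime_hours": [36.0, 12.0, 8.0, 72.0, 24.0],
})

period_hours = 365 * 24.0                      # analysis window: one calendar year

summary = (work_orders.groupby("turbine")
           .agg(failures=("failure_start", "count"),
                downtime=("downtime_hours", "sum")))
summary["mtbf_hours"] = (period_hours - summary["downtime"]) / summary["failures"]
summary["availability"] = 1.0 - summary["downtime"] / period_hours
print(summary)
```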

  14. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  15. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds including reactor pressure vessel(RPV), huge amount of ultrasonic data from 6 channels should be able to be on-line processed. In addition, ultrasonic transducer scanning device should be remotely controlled, because working place is high radiation area. This kind of an automated ultrasonic testing equipment has not been developed domestically yet. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipments developed in foreign countries were investigated and the capability of high speed ultrasonic signal processing hardwares was analyzed. In this study, ultrasonic signal processing system was designed. And also, ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  16. Data collection for global monitoring and trend analysis in the GRACE multi-agent system

    OpenAIRE

    Pereira, Arnaldo; Rodrigues, Nelson; Leitão, Paulo

    2013-01-01

    The multi-agent systems (MAS) paradigm is a suitable approach to implement distributed manufacturing systems addressing the emergent requirements of flexibility, robustness and responsiveness. In such systems, the distributed agents have a local view of the system environment, and global data collection is therefore a complex and critical issue for providing the functionalities of the ISA-95 standard, such as dynamic scheduling, maintenance management and quality assurance. This paper describes the da...

  17. Development of a system for data collection and processing by telemetry

    International Nuclear Information System (INIS)

    The environmental impact of the nuclear industry is, obviously, a matter of the greatest concern. On account of that, a large number of parameters must be recorded during long periods with a high level of confidence. The site selection of Brazilian nuclear power plants is conducted under this philosophy. Data acquisition of ocean-related parameters in remote, unexplored areas is rather demanding. In order to avoid a series of problems with data collection and processing, a telemetric system concept was developed. Electronic aspects of this system are mainly emphasized. For that purpose the system is split into two sub-systems: the former for data collection, signal conditioning and transmission, and the latter for signal reception and treatment. All parts of the system were tested in the laboratory before an integrated check, the corresponding results being encouraging. The whole equipment was installed one year ago at the seashore region of Peruibe, state of Sao Paulo, and has been operating adequately ever since. (Author)

  18. ON GEOMETRIC PROCESSING OF MULTI-TEMPORAL IMAGE DATA COLLECTED BY LIGHT UAV SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. Rosnell

    2012-09-01

    Full Text Available Data collection under highly variable weather and illumination conditions around the year will be necessary in many applications of UAV imaging systems. This is a new feature in rigorous photogrammetric and remote sensing processing. We studied the performance of two georeferencing and point cloud generation approaches using image data sets collected in four seasons (winter, spring, summer and autumn) and under different imaging conditions (sunny, cloudy, different solar elevations). We used light, quadrocopter UAVs equipped with consumer cameras. In general, matching of image blocks collected with high overlaps provided high quality point clouds. All of the before-mentioned factors influenced the point cloud quality. In winter time, the point cloud generation failed on uniform snow surfaces in many situations, and during leaf-off season the point cloud generation was not successful over deciduous trees. The images collected under cloudy conditions provided better point clouds than the images collected in sunny weather in shadowed regions and of tree surfaces. On homogeneous surfaces (e.g. asphalt) the images collected under sunny conditions outperformed cloudy data. The tested factors did not influence the general block adjustment results. The radiometric sensor performance (especially signal-to-noise ratio) is a critical factor in all-weather data collection and point cloud generation; at the moment, high quality, lightweight imaging sensors are still largely missing; sensitivity to wind is another potential limitation. Great potential lies in low-flying, low-cost UAVs, especially in applications requiring rapid aerial imaging for frequent monitoring.

  19. Automatic recovery from resource exhaustion exceptions by collecting leaked resources

    Institute of Scientific and Technical Information of China (English)

    Zi-ying DAI; Xiao-guang MAO; Li-qian CHEN; Yan LEI

    2014-01-01

    Despite the availability of garbage collectors, programmers must manually manage non-memory finite system resources such as file descriptors. Resource leaks can gradually consume all available resources and cause programs to raise resource exhaustion exceptions. However, programmers commonly provide no effective recovery approach for resource exhaustion exceptions, which often causes programs to halt without completing their tasks. In this paper, we propose to automatically recover programs from resource exhaustion exceptions caused by resource leaks. We transform programs to catch resource exhaustion exceptions, collect leaked resources, and then retry the failure code. A resource collector is designed to identify leaked resources and safely release them. We implement our approach for Java programs. Experimental results show that our approach can successfully handle resource exhaustion exceptions caused by reported resource leaks and allow programs to complete their tasks with an average execution time increase of 2.52% and negligible bytecode size increase.
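
    The recovery pattern described above targets Java bytecode, but the core idea can be sketched in a hedged way in Python: catch the exhaustion error, ask the garbage collector to reclaim leaked file objects, and retry the failed operation.

```python
# Hedged sketch (not the paper's Java implementation): retry an open() call after
# collecting leaked file descriptors when the process hits its descriptor limit.
import errno
import gc

def open_with_recovery(path, mode="r", retries=1):
    """Open a file, collecting leaked descriptors and retrying on exhaustion."""
    for attempt in range(retries + 1):
        try:
            return open(path, mode)
        except OSError as exc:
            if exc.errno != errno.EMFILE or attempt == retries:
                raise                  # not an exhaustion error, or retries used up
            gc.collect()               # finalize unreachable file objects (the leaked resources)

# Usage: behaves like open(), but survives transient "too many open files" errors.
# f = open_with_recovery("/tmp/example.txt", "w")
```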

  20. Fault injection system for automatic testing system

    Institute of Scientific and Technical Information of China (English)

    王胜文; 洪炳熔

    2003-01-01

    Considering the deficiency of the means for confirming the attribution of fault redundancy in the research of Automatic Testing Systems (ATS), a fault-injection system has been proposed to study the fault redundancy of an automatic testing system through comparison. By means of a fault-embedded environmental simulation, faults are injected at the input level of the software under test. These faults may induce inherent failure modes, thus bringing about unexpected outputs, and the anticipated goal of the test is attained. The fault injection consists of a voltage signal generator, a current signal generator and a rear drive circuit, which are specially developed, and the ATS can work regularly by means of software simulation. The experimental results indicate that the fault injection system can find the deficiencies of the automatic testing software and identify the preference of fault redundancy. On the other hand, some software deficiencies never exposed before can be identified by analyzing the testing results.

  1. Automatic continuous monitoring system for dangerous sites and cargoes

    International Nuclear Information System (INIS)

    The problems of creation of automatic comprehensive continuous monitoring system for nuclear and radiation sites and cargoes of Rosatom Corporation, which carries out data collecting, processing, storage and transmission, including informational support to decision-making, as well as support to modelling and forecasting functions, are considered. The system includes components of two levels: site and industry. Currently the system is used to monitor over 8000 integrated parameters, which characterise the status of nuclear and radiation safety on Rosatom sites, environmental and fire safety

  2. CHLOE: a system for the automatic handling of spark pictures

    International Nuclear Information System (INIS)

    The system for automatic data handling uses commercially available or state-of-the-art components. The system is flexible enough to accept information from various types of experiments involving photographic data acquisition

  3. Research on Chinese Antarctic Data Directory System I——Collecting, processing, examining and submitting data directory

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the general framework of the ADDS (Antarctic Data Directory System) established by the SCAR-COMNAP ad hoc Planning Group on Antarctic data management, the CN-ADDS (Chinese Antarctic Data Directory System) project is ongoing; its research and activity keep to the methods and techniques available in ADDS development and allow for the specific status of Chinese Antarctic data management as well. At present, authoring and submitting Antarctic data directory entries in a timely manner is one of the key issues in China that must be dealt with. This paper studies the technical procedure for collecting, processing, examining and submitting the data directory. In addition, it discusses the efficient collection of the data directory, which needs administrative and technical support

  4. Analysis of ICPP process-monitoring-system data collected during August-October, 1981

    International Nuclear Information System (INIS)

    As part of the FY-1982 advanced safeguards development program, selected process data collected during August-October 1981 by the ICPP Process Monitoring Computer System (PMCS) were analyzed. This analysis is the first major effort of its kind using data from this VAX 11/780 computer based system. These data were from the first, second, and third processing cycles. Several process events were identified and isolated for analysis to conserve limited program resources. These included process input (G-Cell) batch transfers, continuous first-cycle feed activities, transfers into N-Cell intercycle storage, and continuous second-cycle feed activities. The analyses principally used Scanivalve plant precision data from tank bubbler probes, temperature data, and plant digital data. Some useful assessments are given to the process data information, but they should be considered preliminary since not all collected data could be analyzed. Also, several data limitations are noted and recommendations are given for system improvements. It is believed that this analysis effort demonstrates the potential utility of the system for improved safeguards applications; yet, further, similar analysis efforts are needed to extend and complete a demonstration to characterize ICPP process data in general

  5. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.

    2009-09-01

    This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. The report provides a list of the data needed to support reliability and availability analysis, and gives specific recommendations for a Computerized Maintenance Management System (CMMS) to support automated analysis. This data collection recommendations report was written by Sandia National Laboratories to address the general data requirements for reliability analysis of fielded wind turbines. This report is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems, for reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.

  6. A technical assistance on data collection on subdivision of wet-system apparatuses.

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-09-01

    At the Ningyo-Toge Environmental Engineering Center, development of a subdivision (dismantling) engineering system for the decommissioning of nuclear fuel facilities has been promoted. However, data on the subdivision of instruments and apparatuses, carried out as part of the decommissioning, were insufficient. This work therefore investigated data collection, using the subdivision of the wet-system apparatuses of the smelting conversion facility, begun in June 2000, as a data-collection field to be exploited as effectively as possible, in support of building a system that rationally supports the decommissioning of nuclear fuel facilities promoted at the Center. The subdivision of the wet-system apparatuses of the facility is planned to take the two fiscal years 2000 and 2001. The working procedure starts with non-contaminated items (electrical equipment, instruments and utility piping) in each room and then proceeds to the appliances that handled uranium. The report covers a survey of the present state of the subdivision work, the kinds and frequencies of data collected during subdivision, a data collection manual, and rationalization of the data recording method. (G.K.)

  7. Development of Inspection Data Collection and Evaluation System (IDES) for J-MOX (1)

    International Nuclear Information System (INIS)

    The 'Inspection Data Collection and Evaluation System' (IDES) is a system that stores inspection data and operator declaration data collected from the various measurement instruments installed in the fuel fabrication processes of the large-scale MOX fuel fabrication plant, and performs safeguards evaluation using these data. The Nuclear Material Control Center is developing this system under a project commissioned by JSGO. By the last fiscal year, we had developed a simulator that models the fuel fabrication process and generates data simulating the in-process material inventory/flow and the corresponding measurement data. In addition, we developed a verification evaluation system that calculates various statistics from the simulation data and conducts statistical tests such as NRTA in order to verify the adequacy of material accountancy for the fabrication process. We are currently investigating the adequacy of the evaluation itself and the effects on the evaluation of changing various process factors, including unmeasured inventories, as well as the adequacy of the current safeguards approach. In the presentation, we explain the developed system configuration, the calculation method of the simulation, etc., and demonstrate some examples of the simulated material flow in the fabrication process and a part of the analytical results. (author)
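
    NRTA-style evaluations are commonly built on sequential statistical tests applied to the period-by-period material unaccounted for (MUF). As a hedged illustration of that general idea only, and not of the algorithm implemented in the system described above, the Python sketch below standardizes a simulated MUF sequence and applies a one-sided CUSUM (Page) test with purely illustrative parameters.

        # Illustrative near-real-time accountancy (NRTA) check: one-sided CUSUM on MUF.
        # The MUF values, sigma, slack k and threshold h are all invented for illustration.
        muf = [0.1, -0.3, 0.4, 0.2, 0.6, 0.5, 0.7]   # MUF per balance period (kg, simulated)
        sigma_muf = 0.4                               # measurement uncertainty per period (kg)
        k, h = 0.5, 4.0                               # slack and decision threshold (sigma units)

        cusum, alarm_periods = 0.0, []
        for period, value in enumerate(muf, start=1):
            z = value / sigma_muf                     # standardized MUF for this period
            cusum = max(0.0, cusum + z - k)           # accumulate only positive (loss-direction) drift
            if cusum > h:
                alarm_periods.append(period)

        print("final CUSUM:", round(cusum, 2), "alarm periods:", alarm_periods)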

  8. Intelligent Storage System Based on Automatic Identification

    Directory of Open Access Journals (Sweden)

    Kolarovszki Peter

    2014-09-01

    Full Text Available This article describes RFID technology in conjunction with warehouse management systems (WMS). It also deals with automatic identification and data capture technologies and the individual processes used in a warehouse management system, describing the processes from the entry of goods into production through identification of goods, palletizing, storing, bin transferring and removal of goods from the warehouse, and it focuses on utilizing AMP middleware in WMS processes. Nowadays, the identification of goods in most warehouses is carried out through barcodes; in this article we specify how the processes described above can be handled through RFID technology. All results are verified by measurements in our AIDC laboratory at the University of Žilina and in the Laboratory of Automatic Identification of Goods and Services at GS1 Slovakia. The results of our research bring a new point of view and indicate ways of using RFID technology in warehouse management systems.

  9. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. Following our conceptual design, we developed a reactor inspection system comprising an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on the real reactor vessel prepared for Ulchin nuclear power plant unit 6 at Doosan Heavy Industries in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction in inspection time, performance enhancement, automatic management of inspection history, etc. From an economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied comprehensively to the automation of similar systems in nuclear power plants.

  10. ANPS - AUTOMATIC NETWORK PROGRAMMING SYSTEM

    Science.gov (United States)

    Schroer, B. J.

    1994-01-01

    Development of some of the space program's large simulation projects -- like the project which involves simulating the countdown sequence prior to spacecraft liftoff -- requires the support of automated tools and techniques. The number of preconditions which must be met for a successful spacecraft launch and the complexity of their interrelationship account for the difficulty of creating an accurate model of the countdown sequence. Researchers developed ANPS for the NASA Marshall Space Flight Center to assist programmers attempting to model the pre-launch countdown sequence. Incorporating the elements of automatic programming as its foundation, ANPS aids the user in defining the problem and then automatically writes the appropriate simulation program in GPSS/PC code. The program's interactive user dialogue interface creates an internal problem specification file from user responses which includes the time line for the countdown sequence, the attributes for the individual activities which are part of a launch, and the dependent relationships between the activities. The program's automatic simulation code generator receives the file as input and selects appropriate macros from the library of software modules to generate the simulation code in the target language GPSS/PC. The user can recall the problem specification file for modification to effect any desired changes in the source code. ANPS is designed to write simulations for problems concerning the pre-launch activities of space vehicles and the operation of ground support equipment and has potential for use in developing network reliability models for hardware systems and subsystems. ANPS was developed in 1988 for use on IBM PC or compatible machines. The program requires at least 640 KB memory and one 360 KB disk drive, PC DOS Version 2.0 or above, and GPSS/PC System Version 2.0 from Minuteman Software. The program is written in Turbo Prolog Version 2.0. GPSS/PC is a trademark of Minuteman Software. Turbo Prolog
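
    To make the idea of generating simulation code from a problem specification more concrete, here is a small, hypothetical Python sketch in the same spirit: it reads a toy specification of sequential countdown activities and emits schematic GPSS-style blocks. It is not ANPS, the specification format is invented, and the emitted text is illustrative rather than validated GPSS/PC syntax.

        # Hypothetical mini problem specification: sequential countdown activities with a
        # mean duration and half-width for a uniform ADVANCE time (all values invented).
        spec = {
            "activities": [
                {"name": "FUEL_LOAD",    "mean": 120, "spread": 15},
                {"name": "GUIDANCE_CHK", "mean": 45,  "spread": 5},
                {"name": "FINAL_POLL",   "mean": 10,  "spread": 2},
            ]
        }

        def generate_gpss(spec):
            """Emit schematic GPSS-style blocks for one sequential countdown transaction."""
            lines = ["GENERATE ,,,1        ; create a single countdown transaction"]
            for act in spec["activities"]:
                lines.append(f"ADVANCE {act['mean']},{act['spread']}   ; {act['name']}")
            lines.append("TERMINATE 1          ; countdown complete")
            return "\n".join(lines)

        print(generate_gpss(spec))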

  11. Automatic monitoring system for ''F'' installation

    International Nuclear Information System (INIS)

    The design and operation procedure of the first part of the automatic radiation monitoring system of the Laboratory of Nuclear Problems, JINR (the ''F'' Installation) are described. The system consists of 50 data measuring lines, of which 30 are used for monitoring by means of radiation detectors, 12 are used to control the state of branch circuits, and the others give auxiliary information on the accelerator performance. The data are handled and registered by a crate controller with a built-in microcomputer once every few seconds. The monitoring results are output on a special light panel, by sound signaling and on a printer.

  12. Collecting Patient Data from Sensor-Based Systems: Benefits and Challenges.

    Science.gov (United States)

    Gashgari, Horeya; Attallah, Nora; Al Muallem, Yahya; Al Dogether, Majed; Al Moammary, Eman; Almeshari, Meshari; Househ, Mowafa

    2016-01-01

    The purpose of this study is to review the literature on the use of sensor-based technology in the collection of healthcare data from health consumers and/or patients. A literature search was conducted in November 2015 through the PubMed database. A total of 4,800 articles were retrieved using the terms "sensors-based systems in healthcare" and "sensor monitoring in healthcare". After scanning the articles, and applying the inclusion and exclusion criteria, 10 articles were found relevant and included in the review. This study highlights the benefits and challenges when using sensor-based systems in the collection of health consumer and/or patient information. Some of the benefits in the collection of data are remote monitoring features and the real-time data collection features. Some of the challenges are privacy, security, and sensitivity to technology issues. Future work should evaluate the quality of the research evidence on sensor-based technologies and how such information impacts quality of care.

  13. Collecting Patient Data from Sensor-Based Systems: Benefits and Challenges.

    Science.gov (United States)

    Gashgari, Horeya; Attallah, Nora; Al Muallem, Yahya; Al Dogether, Majed; Al Moammary, Eman; Almeshari, Meshari; Househ, Mowafa

    2016-01-01

    The purpose of this study is to review the literature on the use of sensor-based technology in the collection of healthcare data from health consumers and/or patients. A literature search was conducted in November 2015 through the PubMed database. A total of 4,800 articles were retrieved using the terms "sensors-based systems in healthcare" and "sensor monitoring in healthcare". After scanning the articles, and applying the inclusion and exclusion criteria, 10 articles were found relevant and included in the review. This study highlights the benefits and challenges when using sensor-based systems in the collection of health consumer and/or patient information. Some of the benefits in the collection of data are remote monitoring features and the real-time data collection features. Some of the challenges are privacy, security, and sensitivity to technology issues. Future work should evaluate the quality of the research evidence on sensor-based technologies and how such information impacts quality of care. PMID:27350461

  14. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    OpenAIRE

    Boden, T. A.; M. Krassovski; Yang, B.

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in Nor...

  15. Detection of impulsive sources from an aerostat-based acoustic array data collection system

    Science.gov (United States)

    Prather, Wayne E.; Clark, Robert C.; Strickland, Joshua; Frazier, Wm. Garth; Singleton, Jere

    2009-05-01

    An aerostat based acoustic array data collection system was deployed at the NATO TG-53 "Acoustic Detection of Weapon Firing" Joint Field Experiment conducted in Bourges, France during the final two weeks of June 2008. A variety of impulsive sources including mortar, artillery, gunfire, RPG, and explosive devices were fired during the test. Results from the aerostat acoustic array will be presented against the entire range of sources.

  16. A mobile field-work data collection system for the wireless era of health surveillance

    Directory of Open Access Journals (Sweden)

    Marianne Forsell

    2011-02-01

    Full Text Available In many countries or regions the capacity of health care resources is below the needs of the population, and new approaches for health surveillance are needed. Innovative projects utilizing wireless communication technology contribute to reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on field-work experiences and administrative requirements. The smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components and a SQL Server 2005 database running on the Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field-workers considered the applications described here user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from field-work data registration to analysis, and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  17. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O''Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  18. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to provide data that is contained on reporting forms to be delivered also on a diskette for uploading into data bases for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, for reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples, data entry, checking and transmission, data assessment and qualification for use, and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and to protect LANL's investment for the future

  19. Calibration of a system to collect visible-light polarization data for classification of geosynchronous satellites

    Science.gov (United States)

    Speicher, Andy; Matin, Mohammad; Tippets, Roger; Chun, Francis

    2014-09-01

    In order to protect critical military and commercial space assets, the United States Space Surveillance Network must have the ability to positively identify and characterize all space objects. Unfortunately, positive identification and characterization of space objects is a manual and labor intensive process today since even large telescopes cannot provide resolved images of most space objects. The objective of this study was to calibrate a system to exploit the optical signature of unresolved geosynchronous satellite images by collecting polarization data in the visible wavelengths for the purpose of revealing discriminating features. These features may lead to positive identification or classification of each satellite. The system was calibrated with an algorithm and process that takes raw observation data from a two-channel polarimeter and converts it to Stokes parameters S0 and S1. This instrumentation is a new asset for the United States Air Force Academy (USAFA) Department of Physics and consists of one 20-inch Ritchey-Chretien telescope and a dual focal plane system fed with a polarizing beam splitter. This study calibrated the system and collected preliminary polarization data on five geosynchronous satellites to validate performance. Preliminary data revealed that each of the five satellites had a different polarization signature that could potentially lead to identification in future studies.
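
    For a two-channel polarimeter fed by a polarizing beam splitter, the Stokes parameters S0 and S1 are commonly formed from the sum and difference of the two orthogonal channel intensities. The short Python sketch below shows that conversion in its simplest form; the sample counts and the relative gain correction are assumptions for illustration, not values from the calibration described above.

        import numpy as np

        # Simulated background-subtracted counts from the two orthogonal channels (invented).
        i_parallel = np.array([1520.0, 1480.0, 1610.0])   # channel transmitted by the beam splitter
        i_perp     = np.array([1405.0, 1392.0, 1500.0])   # channel reflected by the beam splitter
        gain = 1.02                                        # assumed relative gain of the second focal plane

        i_perp_corr = gain * i_perp
        s0 = i_parallel + i_perp_corr                      # total intensity
        s1 = i_parallel - i_perp_corr                      # linear polarization along the analyzer axes
        dolp = np.abs(s1) / s0                             # degree of linear polarization (S2, S3 not measured)

        print("normalized S1/S0:", np.round(s1 / s0, 4), "DoLP:", np.round(dolp, 4))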

  20. GAIT-ER-AID: An Expert System for Analysis of Gait with Automatic Intelligent Pre-Processing of Data

    OpenAIRE

    Bontrager, EL.; Perry, J.; Bogey, R.; Gronley, J.; Barnes, L.; Bekey, G.; Kim, JW

    1990-01-01

    This paper describes the architecture and applications of an expert system designed to identify the specific muscles responsible for a given dysfunctional gait pattern. The system consists of two parts: a data analysis expert system (DA/ES) and a gait pathology expert system (GP/ES). The DA/ES processes raw data on joint angles, foot-floor contact patterns and EMG's from relevant muscles and synthesizes them into a data frame for use by the GP/ES. Various aspects of the intelligent data pre-p...

  1. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automatic systems have brought many revolutions to existing technologies. One such technology, which has seen considerable development, is the solar powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis and, applied in an automatic manner, basically reduces manpower. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer, set to intervals preferred by the user, that runs as a continuous process. A magnetic contactor acts as a switch connected to the 10 hour timer and controls the activation or termination of the electrical loads; it is powered by a solar panel that outputs electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer to be used should be tested to avoid malfunction and achieve a fully automatic system, and that the system may be improved to handle changes in the scope of the project.

  2. Digital signal processing for CdTe detectors using VXIbus data collection systems

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Daiji; Takahashi, Hiroyuki; Kurahashi, Tomohiko; Iguchi, Tetsuo; Nakazawa, Masaharu

    1996-07-01

    Recently, fast signal digitizing techniques have been developed, and signal waveforms over very short time periods can be obtained. In this paper, we analyzed each measured pulse digitized by an apparatus of this kind and tried to improve the energy resolution of a CdTe semiconductor detector. The resulting energy resolution for the {sup 137}Cs 662 keV photopeak was 13 keV. We also developed a fast data collection system based on the VXIbus standard, and a counting rate of about 50 counts per second was obtained with this system. (author)
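
    As a rough illustration of digital pulse processing on digitized waveforms of this kind, and not the filter actually used by the authors, the Python sketch below applies baseline subtraction and a simple boxcar shaping to a synthetic pulse and takes the shaped maximum as the pulse height; accumulating such heights over many pulses yields the spectrum from which the resolution can be estimated.

        import numpy as np

        def pulse_height(waveform, baseline_samples=50, shaping=16):
            """Baseline-subtract, boxcar-shape and return the pulse amplitude."""
            wf = np.asarray(waveform, dtype=float)
            wf -= wf[:baseline_samples].mean()               # remove the DC baseline
            shaped = np.convolve(wf, np.ones(shaping) / shaping, mode="same")
            return shaped.max()

        # Synthetic digitized pulse: exponential decay on a noisy baseline (illustrative only).
        rng = np.random.default_rng(0)
        t = np.arange(512)
        pulse = 200.0 * np.exp(-(t - 100).clip(0) / 60.0) * (t >= 100)
        waveform = 10.0 + pulse + rng.normal(0, 2.0, t.size)

        heights = [pulse_height(waveform)]                   # one entry per recorded pulse
        spectrum, _ = np.histogram(heights, bins=1024, range=(0, 1024))
        print("example pulse height:", round(heights[0], 1))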

  3. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy)

    Science.gov (United States)

    Tagliaferri, Luca; Kovács, György; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E.; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo

    2016-01-01

    Purpose: Aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. Material and methods: GEC-ESTRO (Groupe Européen de Curiethérapie – European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set) and the necessary COBRA software services as well as the peer reviewing of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task-group. Results: Eleven centers from 6 countries signed an agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA-Storage System (C-SS) is not time-consuming as, thanks to the use of “brokers”, data can be extracted directly from the single center's storage systems through a connection with “structured query language database” (SQL-DB), Microsoft Access®, FileMaker Pro®, or Microsoft Excel®. The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of “on-purpose data projection”. The C-SS architecture is privacy protecting because it will never make visible data that could identify an individual patient. This C-SS can also benefit from the so called “distributed learning” approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Conclusions: Setting up a consortium is a feasible and practicable tool in the creation of an international and multi-system data sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence the center's own data storing

  4. Operating manual for the digital data-collection system for flow-control structures

    Science.gov (United States)

    Rorabaugh, J.I.; Rapp, W.L.

    1986-01-01

    This manual was written to help the user operate and maintain the digital data collection system for flow control structures. The system is used to measure daily discharge through river control dams. These dams commonly have tainter gates which are raised and lowered to keep the upper pool level relatively constant as the river flow changes. In order to measure the flow through such a structure, the positions of the tainter gates and the headwater and tailwater elevations must be known. From these data, the flow through the structure can be calculated. A typical digital data collection system is shown. Digitizing devices are mounted on the hoisting mechanism of each gate, as well as at the headwater and tailwater gages. Data from these digitizers are then routed by electrical cables to a central console where they are displayed and recorded on paper tape. If the dam has locks, a pressure-sensitive switch located in the lock activates a counter in the console which keeps track of the number of times the lock is drained and filled. (USGS)
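
    The discharge computation itself is not reproduced in this excerpt. As a hedged illustration of how the flow under one tainter gate might be estimated from the digitized gate opening and the headwater and tailwater elevations, the Python sketch below uses a generic submerged-orifice relation Q = Cd * A * sqrt(2 g (H1 - H2)); the discharge coefficient and gate dimensions are placeholders, not rating values from the manual.

        import math

        def gate_discharge(gate_opening_ft, headwater_ft, tailwater_ft,
                           gate_width_ft=60.0, cd=0.70):
            """Generic submerged-orifice estimate for one gate opening (illustrative only)."""
            g = 32.2                                            # ft/s^2
            head_diff = max(headwater_ft - tailwater_ft, 0.0)
            area = gate_width_ft * gate_opening_ft              # flow area under the gate (ft^2)
            return cd * area * math.sqrt(2.0 * g * head_diff)   # discharge (ft^3/s)

        # One set of digitizer readings: gate opening plus headwater and tailwater elevations.
        q = gate_discharge(gate_opening_ft=2.5, headwater_ft=420.3, tailwater_ft=411.8)
        print(f"estimated discharge for this gate: {q:.0f} cfs")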

  5. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    Energy Technology Data Exchange (ETDEWEB)

    Boden, Thomas A [ORNL; Krassovski, Misha B [ORNL; Yang, Bai [ORNL

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interests in climate change data and information worldwide have grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification). CDIAC has provided data management

  6. Creating an iPhone Application for Collecting Continuous ABC Data

    Science.gov (United States)

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  7. Electronic thermal sensor and Data Collection Platform technology: Part 5 in Thermal surveillance of active volcanoes using the Landsat-1 Data Collection System

    Science.gov (United States)

    Preble, Duane M.; Friedman, Jules D.; Frank, David

    1976-01-01

    Five Data Collection Platforms (DCP) were integrated electronically with thermal sensing systems, emplaced and operated in an analog mode at selected thermally significant volcanic and geothermal sites. The DCP's transmitted 3260 messages comprising 26,080 ambient, surface, and near-surface temperature records at an accuracy of ±1.15 °C for 1121 instrument days between November 14, 1972 and April 17, 1974. In harsh, windy, high-altitude volcanic environments the DCP functioned best with a small dipole antenna. Sixteen kg of alkaline batteries provided a viable power supply for the DCP systems, operated at a low-duty cycle, for 5 to 8 months. A proposed solar power supply system would lengthen the period of unattended operation of the system considerably. Special methods of data handling such as data storage via a proposed memory system would increase the significance of the twice-daily data reception enabling the DCP's to record full diurnal-temperature cycles at volcanic or geothermal sites. Refinements in the temperature-monitoring system designed and operated in experiment SR 251 included a backup system consisting of a multipoint temperature scanner, a servo mechanism and an analog-to-digital recorder. Improvements were made in temperature-probe design and in construction of corrosion-resistant seals by use of a hydrofluoric-acid-etching technique.

  8. Using global positioning systems in health research a practical approach to data collection and processing

    DEFF Research Database (Denmark)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-01-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality, allow more-meaningful comparisons across studies and populations, and advance this field more rapidly. This paper aims to take researchers, who are considering using GPS devices in their research, through device-selection criteria, device settings, participant data collection, data cleaning, data processing, and integration of data into GIS. Recommendations are outlined for each stage of data collection and analysis, and challenges that should be considered are indicated. This paper highlights the benefits of collecting GPS data over traditional self-report or estimated exposure measures. Information presented here will allow researchers to make ...
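
    One routine GPS data-cleaning step is the removal of physically implausible fixes. As a small illustration of such a step, and not the specific procedure recommended in the paper, the Python sketch below drops any fix whose implied speed from the previously kept fix exceeds a threshold, using the haversine distance; the coordinates and threshold are made up.

        import math
        from datetime import datetime, timedelta

        def haversine_m(lat1, lon1, lat2, lon2):
            """Great-circle distance between two WGS84 points, in metres."""
            r = 6371000.0
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
            a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        def drop_implausible(points, max_speed_ms=40.0):
            """Keep fixes whose implied speed from the previously kept fix is plausible."""
            kept = [points[0]]
            for ts, lat, lon in points[1:]:
                prev_ts, prev_lat, prev_lon = kept[-1]
                dt = (ts - prev_ts).total_seconds()
                if dt > 0 and haversine_m(prev_lat, prev_lon, lat, lon) / dt <= max_speed_ms:
                    kept.append((ts, lat, lon))
            return kept

        start = datetime(2011, 2, 1, 9, 0, 0)
        fixes = [(start + timedelta(seconds=15 * i), 55.676 + 0.0001 * i, 12.568) for i in range(5)]
        print(len(drop_implausible(fixes)), "of", len(fixes), "fixes kept")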

  9. ASUKA Hydrographic Data Collection

    OpenAIRE

    Uchida, Hiroshi; Imawaki, Shiro; Ichikawa, Hiroshi

    2008-01-01

    Repeated hydrographic surveys across the Kuroshio and its recirculation south of Japan have been carried out since 1992 by a group called ASUKA (Affiliated Surveys of the Kuroshio off Cape Ashizuri). Conductivity-temperature-depth profiler (CTD), expendable CTD (XCTD), expendable bathythermograph (XBT), and digital bathythermograph (DBT) data obtained from 155 cruises were collected for a period of 16 years, from November 1992 to May 2008. A uniform data processing was applied to raw data from the...

  10. Creating an iPhone application for collecting continuous ABC data.

    Science.gov (United States)

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.
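
    The application described here is written for the iPhone in Xcode; as a platform-neutral illustration of the same continuous ABC recording idea, the Python sketch below logs timestamped antecedent-behavior-consequence events for a client and exports the session as CSV. The category lists, client name and file name are hypothetical.

        import csv
        from datetime import datetime

        ANTECEDENTS = ["demand placed", "attention diverted", "transition"]   # hypothetical categories
        BEHAVIORS = ["aggression", "elopement", "self-injury"]
        CONSEQUENCES = ["redirection", "escape", "attention"]

        class ABCLog:
            def __init__(self, client):
                self.client = client
                self.events = []

            def record(self, antecedent, behavior, consequence):
                """Append one timestamped ABC observation for the current client."""
                self.events.append((datetime.now().isoformat(timespec="seconds"),
                                    self.client, antecedent, behavior, consequence))

            def export_csv(self, path):
                """Write the session to a CSV file that could then be e-mailed or uploaded."""
                with open(path, "w", newline="") as f:
                    writer = csv.writer(f)
                    writer.writerow(["time", "client", "antecedent", "behavior", "consequence"])
                    writer.writerows(self.events)

        log = ABCLog("Client A")
        log.record(ANTECEDENTS[0], BEHAVIORS[0], CONSEQUENCES[1])
        log.export_csv("abc_session.csv")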

  11. Data collection systems in ART must follow the pace of change in clinical practice.

    Science.gov (United States)

    De Geyter, Ch; Wyns, C; Mocanu, E; de Mouzon, J; Calhaz-Jorge, C

    2016-10-01

    In assisted reproductive technology (ART), quality control necessitates the collection of outcome data and occurring complications. Traditional quality assurance is based on data derived from single ART centres and more recently from national registries, both recording outcome parameters during well-defined observation periods. Nowadays, ART is moving towards much more diverse approaches, with sequential activities including short- or long-term freezing of gametes, gonadal tissues and embryos, and cross-border reproductive care. Hence, long-term cumulative treatment rates and an international approach are becoming a necessity. We suggest the initiation of an easy access European Reproductive Coding System, through which each ART recipient is allocated a unique reproductive care code. This code would identify individuals (and reproductive material) during case to case data reporting to national ART data collecting institutions and to a central European ART monitoring agency. For confidentiality reasons, the identity of the individuals should remain with the local ART provider. This way, cumulative and fully reliable reproductive outcome data can be constructed with follow-up over prolonged time periods.

  12. Programs for the automatic gamma-ray measurement with CANBERRA 8100/QUANTA system

    International Nuclear Information System (INIS)

    Some programs have been prepared for the automatic operation of the CANBERRA 8100/QUANTA system for gamma-ray spectrum measurement. The main functions of these programs are: (1) to collect gamma-ray spectrum data automatically and record them on magnetic disks, while the recorded data are analyzed to identify the nuclides that produce the photopeaks of the spectra and to calculate their concentrations; and (2) to draw plots of the pulse-height distributions of the gamma-ray spectrum data and of other data on the additional digital plotter; etc. (author)
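
    As a rough sketch of the photopeak-identification step described above, and not of the CANBERRA programs themselves, the Python fragment below scans a gamma-ray spectrum for local maxima that stand well above the neighbouring continuum and converts channel numbers to energy with an assumed linear calibration.

        import numpy as np

        def find_photopeaks(counts, window=10, significance=4.0):
            """Return channels standing `significance` sigma above the local continuum."""
            counts = np.asarray(counts, dtype=float)
            peaks = []
            for ch in range(window, counts.size - window):
                sides = np.r_[counts[ch - window:ch - 2], counts[ch + 3:ch + window + 1]]
                background = sides.mean()
                is_local_max = counts[ch] == counts[ch - window:ch + window + 1].max()
                if is_local_max and counts[ch] > background + significance * np.sqrt(max(background, 1.0)):
                    peaks.append(ch)
            return peaks

        # Synthetic spectrum: flat background plus one Gaussian photopeak (illustrative only).
        rng = np.random.default_rng(1)
        channels = np.arange(2048)
        spectrum = rng.poisson(20, channels.size) + (300 * np.exp(-((channels - 662) / 3.0) ** 2)).astype(int)

        gain_kev_per_ch, offset_kev = 1.0, 0.0            # assumed linear energy calibration
        for ch in find_photopeaks(spectrum):
            print(f"photopeak near channel {ch} -> {gain_kev_per_ch * ch + offset_kev:.1f} keV")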

  13. GIS: Geographic Information System An application for socio-economical data collection for rural area

    CERN Document Server

    Nayak, S K; Kalyankar, N V

    2010-01-01

    India carries out its planning through the Planning Commission, on the basis of information collected by traditional, tedious and manual methods that are too slow to sustain. We are now in the 21st century, and in the last few decades the progress of information technology, by leaps and bounds, has completely changed the way of life in the developed nations. The internet has changed established working practices, opened new vistas and provided a platform to connect, offering a collaborative work space that goes beyond global boundaries. We live in a global economy, with India moving towards a Liberalized Market Oriented Economy (LMOE). Considering these things, and focusing on GIS, we propose a system for the collection of socio-economic data and water resource management information for rural areas via the internet.

  14. Development of infrasound signal data collecting system

    Institute of Scientific and Technical Information of China (English)

    易南; 陈景藻; 李玲; 贾克勇

    2001-01-01

    Infrasound signals in the 0~20 Hz frequency range were collected by the infrasound signal data collecting system. A sound-collecting transducer transformed the signal into a corresponding voltage signal. The computer acquired and processed the infrasound signal in real time, analyzed the main frequency components and the intensity of the infrasound, displayed the analysis results graphically, output the curves automatically, and generated and printed the final results.

  15. Profiling animal toxicants by automatically mining public bioassay data: a big data approach for computational toxicology.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    Full Text Available In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high throughput screening techniques has resulted in an enormous amount of publicly available bioassay data having been generated for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities.
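
    The relevance scoring is described above only in general terms. As a purely hypothetical illustration of the idea, the Python sketch below scores each bioassay by how well its active/inactive calls agree with the acute-toxicity labels of the same compounds (here with a simple balanced-accuracy score) and keeps the top-ranked assays as the profiling panel; all identifiers and values are invented.

        # Hypothetical active/inactive calls (1 = active) for six compounds in three PubChem-style
        # assays, with binary acute-toxicity labels for the same compounds (all values invented).
        assay_calls = {
            "AID_001": [1, 1, 0, 0, 1, 0],
            "AID_002": [0, 1, 0, 1, 0, 1],
            "AID_003": [1, 0, 0, 0, 1, 1],
        }
        toxic = [1, 1, 0, 0, 1, 0]          # acute-toxicity label per compound

        def balanced_accuracy(calls, labels):
            """Mean of sensitivity and specificity of the assay call against the toxicity label."""
            tp = sum(c and t for c, t in zip(calls, labels))
            tn = sum((not c) and (not t) for c, t in zip(calls, labels))
            sensitivity = tp / max(sum(labels), 1)
            specificity = tn / max(len(labels) - sum(labels), 1)
            return 0.5 * (sensitivity + specificity)

        ranked = sorted(assay_calls, key=lambda a: balanced_accuracy(assay_calls[a], toxic), reverse=True)
        panel = ranked[:2]                  # keep the top-ranked assays for the toxicity profile
        print("assay ranking:", ranked, "-> profiling panel:", panel)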

  16. An integrated system for managing multidisciplinary oceanographic data collected in the Mediterranean Sea during the basin-scale research project EU/MAST-MATER (1996-2000)

    Science.gov (United States)

    Maillard, C.; Balopoulos, E.; Giorgetti, A.; Fichaut, M.; Iona, A.; Larour, M.; Latrouite, A.; Manca, B.; Maudire, G.; Nicolas, P.; Sanchez-Cabeza, J.-A.

    2002-06-01

    An advanced computer and communication technology was used to develop an integrated system and software tools for managing a great diversity of oceanographic data collected in the Mediterranean Sea during 1996-2000. Data were obtained during 108 sea cruises, carried out within the framework of the large-scale international research project MATER (mass transfer and ecosystem response), which was financially supported by the Marine Science and Technology (MAST) Programme of the European Union (EU). Data collection involved the active participation of various research vessels and personnel coming from 58 different laboratories of 13 countries. Data formatting as well as automatic and visual data quality controls were implemented using internationally accepted standards and procedures. Various data inventories and meta-data information, accessible through the World Wide Web (WWW), are made available to the user community. A database was developed, which, along with meta-data and other data relevant to the project information, is made available to the user community in the form of a CD-ROM. The database consists of 5861 vertical profiles and 842 time series of basic physical and biogeochemical parameters collected in the seawater column as well as biogeochemical parameters from the analysis of 70 sediment cores. Furthermore, it includes 67 cruise data files of nonstandard additional biological and atmospheric parameters.

  17. Overview of the SOFIA Data Processing System: A generalized system for manual and automatic data processing at the SOFIA Science Center

    CERN Document Server

    Shuping, R Y; Vacca, W D; Charcos-Llorens, M; Reach, W T; Alles, R; Clarke, M; Melchiorri, R; Radomski, J; Shenoy, S; Sandel, D; Omelian, E B

    2014-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprised of a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. During routine operations, several instruments will be available to the astronomical community including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both auto...

  18. Issues of data collection and use for quantifying the impacts of energy installations and systems

    International Nuclear Information System (INIS)

    The paper discusses several critical issues in the construction of models for assessing the impacts of energy installations and systems. Some of these are connected with the process of data collection and use; it is pointed out that different methods have to be applied according to the purpose envisaged for a particular study (e.g. plant licensing, environmental assessment or energy planning). Further concerns are discussed, related to the common need to aggregate data and, in relation to the actual client or target group of the work, to present results in a sufficiently generic fashion. The paper discusses aggregation over technologies, over sites, over time and over social settings. Regarding the actual technique used for impact calculations, the differences between externality calculations, extended risk analysis and life cycle analysis are described. Finally, the issue of quantification is illustrated by two very difficult but also very important examples: global climate impacts and impacts of nuclear accidents. (author). 9 refs, 1 fig., 2 tabs

  19. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
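
    As a concrete taste of the greedy methods such books survey, here is a small self-contained Python sketch that builds a pairwise-covering test set for a toy configuration model by repeatedly choosing the candidate row that covers the most uncovered value pairs. It is a naive illustration, not one of the book's algorithms or tools.

        from itertools import combinations, product

        parameters = {                      # hypothetical configuration model
            "os": ["linux", "windows"],
            "db": ["postgres", "mysql", "sqlite"],
            "browser": ["firefox", "chrome"],
        }
        names = list(parameters)

        def pairs(row):
            """All parameter-value pairs covered by one candidate test row."""
            return {((a, row[a]), (b, row[b])) for a, b in combinations(names, 2)}

        uncovered = {((a, va), (b, vb))
                     for a, b in combinations(names, 2)
                     for va in parameters[a] for vb in parameters[b]}

        tests = []
        while uncovered:
            # Greedy step: pick the full-factorial candidate covering the most uncovered pairs.
            best = max((dict(zip(names, values)) for values in product(*parameters.values())),
                       key=lambda row: len(pairs(row) & uncovered))
            tests.append(best)
            uncovered -= pairs(best)

        for test in tests:
            print(test)
        print(len(tests), "tests cover every parameter-value pair")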

  20. Feasibility Study for Ballet E-Learning: Automatic Composition System for Ballet "Enchainement" with Online 3D Motion Data Archive

    Science.gov (United States)

    Umino, Bin; Longstaff, Jeffrey Scott; Soga, Asako

    2009-01-01

    This paper reports on "Web3D dance composer" for ballet e-learning. Elementary "petit allegro" ballet steps were enumerated in collaboration with ballet teachers, digitally acquired through 3D motion capture systems, and categorised into families and sub-families. Digital data was manipulated into virtual reality modelling language (VRML) and fit…

  1. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-02-01

    Development of operation support systems, such as an automatic operating system and anomaly diagnosis systems for the nuclear reactor, is very important for a practical nuclear ship, because the number of operators is limited and conditions are severe, making it very difficult to receive support from others in case of an accident. The goal of development of the operation support systems is to realize a fully automatic control system covering the whole sequence of normal operation, from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on the operating experience of the first Japanese nuclear ship 'Mutsu'. The automation technique was verified against 'Mutsu' plant data from manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired operating values and the limiting values of parameter fluctuation, and by preparing operation programs for the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  2. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  3. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
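
    Fourier descriptors of a closed 2D contour are typically obtained from the discrete Fourier transform of the boundary points treated as complex numbers, with the coefficients normalized for translation, scale and starting point. The Python sketch below computes such a descriptor for an arbitrary sampled contour; it is a generic illustration of the feature type, not the exact extraction used in the framework described above.

        import numpy as np

        def fourier_descriptors(contour_xy, n_coeffs=10):
            """Translation-, scale- and rotation/start-point-invariant Fourier shape descriptor."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]    # boundary points as a complex signal
            coeffs = np.fft.fft(z - z.mean())               # subtracting the mean gives translation invariance
            magnitudes = np.abs(coeffs[1:n_coeffs + 1])     # magnitudes discard rotation and start point
            return magnitudes / magnitudes[0]               # dividing by the first harmonic gives scale invariance

        # Example: an elliptical, roughly parenchyma-like contour sampled at 256 boundary points.
        t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
        ellipse = np.column_stack([40 * np.cos(t), 25 * np.sin(t)])
        print(np.round(fourier_descriptors(ellipse), 4))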

  4. Data mining of geospatial data: combining visual and automatic methods

    OpenAIRE

    Demšar, Urška

    2006-01-01

    Most of the largest databases currently available have a strong geospatial component and contain potentially useful information which might be of value. The discipline concerned with extracting this information and knowledge is data mining. Knowledge discovery is performed by applying automatic algorithms which recognise patterns in the data. Classical data mining algorithms assume that data are independently generated and identically distributed. Geospatial data are multidimensional, spatial...

  5. Computer systems for automatic earthquake detection

    Science.gov (United States)

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
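
    Automatic detectors of this kind are often built on a short-term-average/long-term-average (STA/LTA) trigger applied to each station's signal. The Python sketch below shows that classic idea on a synthetic trace; it is a generic illustration, not the actual detection algorithm of the system described above, and all parameter values are illustrative.

        import numpy as np

        def sta_lta(trace, fs, sta_s=1.0, lta_s=10.0):
            """Ratio of short-term to long-term average of the squared signal."""
            sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
            energy = np.asarray(trace, dtype=float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n    # windows ending at each sample
            lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
            n = min(sta.size, lta.size)                     # align both series on their window end times
            return sta[-n:] / np.maximum(lta[-n:], 1e-12)

        fs = 100.0                                          # samples per second
        rng = np.random.default_rng(2)
        trace = rng.normal(0, 1.0, int(60 * fs))            # one minute of background noise
        trace[3000:3400] += rng.normal(0, 8.0, 400)         # synthetic local event at t = 30 s

        ratio = sta_lta(trace, fs)
        print("triggered:", bool((ratio > 4.0).any()))      # illustrative trigger threshold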

  6. Collective Analysis of Qualitative Data

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Friberg, Karin

    2014-01-01

    What. Many students and practitioners do not know how to systematically process qualitative data once it is gathered—at least not as a collective effort. This chapter presents two workshop techniques, affinity diagramming and diagnostic mapping, that support collective analysis of large amounts of qualitative data. Affinity diagramming is used to make collective analysis and interpretations of qualitative data to identify core problems that need to be addressed in the design process. Diagnostic mapping supports collective interpretation and description of these problems and how to intervene in them. We explain the techniques through a case where they were used to analyze why a new electronic medical record system introduced life-threatening situations for patients. Why. Collective analyses offer all participants a voice, visualize their contributions, combine different actors’ perspectives, and anchor...

  7. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    Science.gov (United States)

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network for geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard, for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate further such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.

  8. DataCollection Prototyping

    CERN Multimedia

    Beck, H.P.

    DataCollection is a subsystem of the Trigger, DAQ & DCS project responsible for the movement of event data from the ROS to the High Level Triggers. This includes data from Regions of Interest (RoIs) for Level 2, building complete events for the Event Filter and finally transferring accepted events to Mass Storage. It also handles passing the LVL1 RoI pointers and the allocation of Level 2 processors and load balancing of Event Building. During the last 18 months DataCollection has developed a common architecture for the hardware and software required. This involved a radical redesign integrating ideas from separate parts of earlier TDAQ work. An important milestone for this work, now achieved, has been to demonstrate this subsystem in the so-called Phase 2A Integrated Prototype. This prototype comprises the various TDAQ hardware and software components (ROSs, LVL2, etc.) under the control of the TDAQ Online software. The basic functionality has been demonstrated on small testbeds (~8-10 processing nodes)...

  9. Practical automatic Arabic license plate recognition system

    Science.gov (United States)

    Mohammad, Khader; Agaian, Sos; Saleh, Hani

    2011-02-01

    Since the 1970s, the need for automatic license plate recognition (ALPR) systems has been increasing. A license plate recognition system is an automatic system that is able to recognize a license plate number extracted from image sensors. In particular, ALPR systems are used in conjunction with various transportation systems in application areas such as law enforcement (e.g. speed limit enforcement) and commercial uses such as parking enforcement, automatic toll payment, private and public entrances, border control, and theft and vandalism control. Vehicle license plate recognition has been studied intensively in many countries, and because different types of license plates are used, the requirements for an automatic license plate recognition system differ for each country [License plate detection using cluster run length smoothing algorithm]. Generally, an automatic license plate localization and recognition system is made up of three modules: license plate localization, character segmentation and optical character recognition. This paper presents an Arabic license plate recognition system that is insensitive to character size, font, shape and orientation and has an extremely high accuracy rate. The proposed system is based on a combination of enhancement, license plate localization, morphological processing, and feature vector extraction using the Haar transform. The performance of the system is fast because the classification of alphabetic and numeric characters is based on the license plate organization. Experimental results for license plates from two different Arab countries show an average of 99% successful license plate localization and recognition on a total of more than 20 different images captured from a complex outdoor environment, with shorter run times than conventional and many state-of-the-art methods.

  10. Steam System Balancing and Tuning for Multifamily Residential Buildings in Chicagoland - Second Year of Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.; Ludwig, P.; Brand, L.

    2013-08-01

    Steam heated buildings often suffer from uneven heating as a result of poor control of the amount of steam entering each radiator. In order to satisfy the heating load to the coldest units, other units are overheated. As a result, some tenants complain of being too hot and open their windows in the middle of winter, while others complain of being too cold and are compelled to use supplemental heat sources. Building on previous research, CNT Energy identified 10 test buildings in Chicago and conducted a study to identify best practices for the methodology, typical costs, and energy savings associated with steam system balancing. A package of common steam balancing measures was assembled and data were collected on the buildings before and after these retrofits were installed to investigate the process, challenges, and the cost effectiveness of improving steam systems through improved venting and control systems. The test buildings that received venting upgrades and new control systems showed 10.2% savings on their natural gas heating load, with a simple payback of 5.1 years. The methodologies for and findings from this study are presented in detail in this report. This report has been updated from a version published in August 2012 to include natural gas usage information from the 2012 heating season and updated natural gas savings calculations.

  11. Monitoring fish communities at drifting FADs: an autonomous system for data collection in an ecosystems approach

    OpenAIRE

    Brehmer, Patrice; Sancho, Gorka; Josse, Erwan; Taquet, Marc; Georgakarakos, Stratis; Itano, David; Moreno, Gala; Palud, Pierre; Trygonis, Vasilis; Aumeeruddy, Riaz; Girard, Charlotte; Peignon, Christophe; Dalen, John; Dagorn, Laurent

    2009-01-01

    An increasing proportion of landings by tuna purse seine fishing vessels are taken around drifting Fish Aggregating Devices (FADs). Although these FADs and their use by the fishing industry to capture tropical tuna have been well documented, operative tools to collect data around them are now required. Acoustic, video, photographic and visual data were collected on fish aggregations around drifting FADs in offshore waters of the western Indian Ocean. Multibeam sonars, multifreq...

  12. Automatic program debugging for intelligent tutoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Murray, W.R.

    1986-01-01

    This thesis explores the process by which student programs can be automatically debugged in order to increase the instructional capabilities of these systems. This research presents a methodology and implementation for the diagnosis and correction of nontrivial recursive programs. In this approach, recursive programs are debugged by repairing induction proofs in the Boyer-Moore Logic. The potential of a program debugger to automatically debug widely varying novice programs in a nontrivial domain is proportional to its capabilities to reason about computational semantics. By increasing these reasoning capabilities a more powerful and robust system can result. This thesis supports these claims by examining related work in automated program debugging and by discussing the design, implementation, and evaluation of Talus, an automatic debugger for LISP programs. Talus relies on its abilities to reason about computational semantics to perform algorithm recognition, infer code teleology, and to automatically detect and correct nonsyntactic errors in student programs written in a restricted, but nontrivial, subset of LISP.

  13. An observing system for the collection of fishery and oceanographic data

    Directory of Open Access Journals (Sweden)

    P. Falco

    2007-05-01

    Full Text Available Fishery Observing System (FOS) was developed as a first and basic step towards fish stock abundance nowcasting/forecasting within the framework of the EU research program Mediterranean Forecasting System: Toward an Environmental Prediction (MFSTEP). The study of the relationship between abundance and environmental parameters also represents a crucial point towards forecasting. Eight fishing vessels were progressively equipped with FOS instrumentation to collect fishery and oceanographic data. The vessels belonged to different harbours of the Central and Northern Adriatic Sea. For this pilot application, anchovy (Engraulis encrasicolus, L.) was chosen as the target species. Geo-referenced catch data, associated with in-situ temperature and depth, were the FOS products but other parameters were associated with catch data as well. MFSTEP numerical circulation models provide many of these data. In particular, salinity was extracted from re-analysis data of numerical circulation models. Satellite-derived sea surface temperature (SST) and chlorophyll were also used as independent variables. Catch and effort data were used to estimate an abundance index (CPUE – Catch per Unit of Effort). Considering that catch records were gathered by different fishing vessels with different technical characteristics and operating on different fish densities, a standardized value of CPUE was calculated. A spatial and temporal average CPUE map was obtained together with a monthly mean time series in order to characterise the variability of anchovy abundance during the period of observation (October 2003–August 2005). In order to study the relationship between abundance and oceanographic parameters, Generalized Additive Models (GAM) were used. Preliminary results revealed a complex scenario: the southern sector of the domain is characterised by a stronger relationship than the central and northern sector where the interactions between the environment and the anchovy
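
    As a simple illustration of the abundance index described above, the sketch below computes CPUE (catch per unit of effort) and a crude vessel standardization with pandas. The column names are hypothetical and the standardization is deliberately simpler than the one used in the paper.

      import pandas as pd

      def monthly_cpue(records: pd.DataFrame) -> pd.Series:
          """CPUE index per month from columns 'vessel', 'month', 'catch_kg', 'effort_hours'."""
          records = records.copy()
          records["cpue"] = records["catch_kg"] / records["effort_hours"]
          # Crude standardization: scale each vessel by its own mean CPUE so that
          # vessels with different technical characteristics become comparable.
          vessel_mean = records.groupby("vessel")["cpue"].transform("mean")
          records["cpue_std"] = records["cpue"] / vessel_mean
          return records.groupby("month")["cpue_std"].mean()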

  14. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  15. Automatic dam concrete placing system; Dam concrete dasetsu sagyo no jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Yoneda, Y.; Hori, Y.; Nakayama, T.; Yoshihara, K.; Hironaka, T. [Okumura Corp., Osaka (Japan)

    1994-11-15

    An automatic concrete placing system was developed for concrete dam construction. This system consists of the following five subsystems: a wireless data transmission system, an automatic dam concrete mixing system, a consistency determination system, an automatic dam concrete loading and transporting system, and a remote concrete bucket opening and closing system. The system includes the following features: the mixing amount for each mixing ratio and the mixing intervals can be specified from the concrete placing site using a wireless handy terminal; concrete is mixed automatically in a batcher plant; a transfer car is started, and concrete is charged into a bucket automatically; the properties of the mixed concrete are determined automatically; labor cost can be reduced, work efficiency improved, and safety enhanced; and the system introduction has resulted in unattended operation from the aggregate draw-out to a bunker line, a manpower saving of five persons, and a reduction in cycle time by 10%. 11 figs., 2 tabs.

  16. Robust indexing for automatic data collection

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2003-12-09

    We present improved methods for indexing diffraction patterns from macromolecular crystals. The novel procedures include a more robust way to verify the position of the incident X-ray beam on the detector, an algorithm to verify that the deduced lattice basis is consistent with the observations, and an alternative approach to identify the metric symmetry of the lattice. These methods help to correct failures commonly experienced during indexing, and increase the overall success rate of the process. Rapid indexing, without the need for visual inspection, will play an important role as beamlines at synchrotron sources prepare for high-throughput automation.

  17. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available The equipment manufacturing industry is a strategic industry for a country, and its core component is the CNC machine tool. Therefore, enhancing independent research on relevant CNC machine technology, especially the open CNC system, is of great significance. This paper presents some key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First of all, the integrated system architecture and the flexible and efficient workflow for performing the intelligent automatic turning process are illustrated. Secondly, innovative methods for workpiece feature recognition and expression and for process planning of the NC machining are put forward. Thirdly, the cutting tool auto-selection and cutting parameter optimization solutions are generated with an integrated inference combining rule-based reasoning and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proved that the presented solutions are valid, practical and efficient.

  18. Automatic molecular collection and detection by using fuel-powered microengines

    Science.gov (United States)

    Han, Di; Fang, Yangfu; Du, Deyang; Huang, Gaoshan; Qiu, Teng; Mei, Yongfeng

    2016-04-01

    We design and fabricate a simple self-powered system to collect analyte molecules in fluids for surface-enhanced Raman scattering (SERS) detection. The system is based on catalytic Au/SiO/Ti/Ag-layered microengines by employing rolled-up nanotechnology. Pronounced SERS signals are observed on microengines with more carrier molecules compared with the same structure without automatic motions. Electronic supplementary information (ESI) available: Experimental procedures, characterization, SERS enhancement factor calculation and videos. See DOI: 10.1039/c6nr00117c

  19. Temperature Profile Data Collected by Participating Ships in NOAA's Shipboard Environmental Data Acquisition System Program from 17 June 2000 to 23 February 2001 (NODC Accession 0000417)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — XBT and other data were collected from the COLUMBUS COROMANDEL and other platforms participating in NOAA's Shipboard Environmental Data Acquisition System (SEAS)...

  20. 49 CFR Appendix H to Part 40 - DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form

    Science.gov (United States)

    2010-10-01

    ..., App. H Appendix H to Part 40—DOT Drug and Alcohol Testing Management Information System (MIS) Data... 49 Transportation 1 2010-10-01 2010-10-01 false DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form H Appendix H to Part 40 Transportation Office of the...

  1. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and...

  2. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Lasers are a new production tool for high-speed and low-distortion welding, and applications to automatic welding lines are increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, welding of airplane engines and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  3. Software design for an automatic detection system for flight test data

    Institute of Scientific and Technical Information of China (English)

    许应康; 彭国金; 刘威

    2015-01-01

    In flight testing, the difficulty of data analysis rises sharply because of the lack of quick checking and fault positioning for the huge volumes of raw and preprocessed flight test data, which prevents flight test missions from being carried out efficiently. To solve this problem, software for an automatic detection system for original data and preprocessed result data, based on an expert database of parameter information, was designed. The software can check the original data automatically and, according to the self-defined criteria stored in the expert database, can also automatically check and process the preprocessed result data. Software testing and application show that the software effectively handles data exceptions and errors in the original data and the preprocessed result data, and improves the efficiency with which flight test engineers analyze huge volumes of flight test data.
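
    The idea of checking data against self-defined criteria stored in an expert database can be illustrated with a few lines of code. The parameter names and limits below are hypothetical; they only show the shape of such a check, not the actual criteria used by the software.

      # Hypothetical criteria table standing in for the expert database.
      CRITERIA = {
          "engine_rpm": {"min": 0.0, "max": 110.0},   # percent of rated speed
          "fuel_flow":  {"min": 0.0, "max": 5000.0},  # kg/h
      }

      def check_samples(parameter, samples, criteria=CRITERIA):
          """Return the indices of samples that violate the stored criterion."""
          rule = criteria.get(parameter)
          if rule is None:
              return []
          return [i for i, value in enumerate(samples)
                  if not (rule["min"] <= value <= rule["max"])]

      # Example: flag the out-of-range readings in a short engine-speed record.
      print(check_samples("engine_rpm", [52.0, 98.5, 131.2, 77.0]))  # -> [2]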

  4. Intelligent Data Collection System Based on SCM (Single-Chip Microcomputer)

    Institute of Scientific and Technical Information of China (English)

    徐淑彦; 李世雄; 苏亦白

    2011-01-01

    With the rapid development of the economy and the continuous improvement of science and technology, intelligent data collection systems are widely used in industrial production and scientific research. In the information age, data and information have undoubtedly become important resources, and the emergence of data collection systems has further promoted human-machine interaction and the automatic detection and control of equipment, providing convenience for modern industrial production. This article examines the necessity of research on intelligent data collection systems based on SCM (single-chip microcomputer), and expounds the key design points and concrete methods, in order to contribute to the reform and innovation of SCM-based intelligent data collection systems.

  5. Feedback Improvement in Automatic Program Evaluation Systems

    Science.gov (United States)

    Skupas, Bronius

    2010-01-01

    Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams and contest systems. However, use of automated program evaluation encounters problems: some evaluations are not clear for the students and the system messages do not show reasons for lost points.…

  6. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    2004-01-01

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitte

  7. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  8. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Full Text Available Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths and melodic characteristics as well. Due to the development of speech technology, there is now an increased interest in expert speaker verification systems which have high reliability and low labour intensiveness because of the automation of data processing for the expert analysis. System Description. We present a description of a novel system analyzing similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system based on fusion of methods is a weak correlation between the analyzed features, which leads to a decrease in the error rate of speaker recognition. The system advantage is the possibility to carry out rapid analysis of recordings since the processes of data preprocessing and decision making are automated. We describe the functioning methods as well as the fusion of methods used to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including the Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.
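
    The fusion of the three methods can be pictured as a weighted combination of their scores. The sketch below is only a schematic of that idea; the weights and threshold are illustrative and are not the values used in the published system.

      def fuse_scores(formant_score, duration_score, melodic_score,
                      weights=(0.5, 0.25, 0.25), threshold=0.6):
          """Combine per-method similarity scores into one same-speaker decision."""
          scores = (formant_score, duration_score, melodic_score)
          combined = sum(w * s for w, s in zip(weights, scores))
          return combined, combined >= threshold

      # Example: strong formant agreement, weaker prosodic agreement.
      score, same_speaker = fuse_scores(0.9, 0.5, 0.4)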

  9. A Bottom-Up Approach for Automatically Grouping Sensor Data Layers by their Observed Property

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-01-01

    Full Text Available The Sensor Web is a growing phenomenon where an increasing number of sensors are collecting data in the physical world, to be made available over the Internet. To help realize the Sensor Web, the Open Geospatial Consortium (OGC) has developed open standards to standardize the communication protocols for sharing sensor data. Spatial Data Infrastructures (SDIs) are systems that have been developed to access, process, and visualize geospatial data from heterogeneous sources, and SDIs can be designed specifically for the Sensor Web. However, there are problems with interoperability associated with a lack of standardized naming, even with data collected using the same open standard. The objective of this research is to automatically group similar sensor data layers. We propose a methodology to automatically group similar sensor data layers based on the phenomenon they measure. Our methodology is based on a unique bottom-up approach that uses text processing, approximate string matching, and semantic string matching of data layers. We use WordNet as a lexical database to compute word pair similarities and derive a set-based dissimilarity function using those scores. Two approaches are taken to group data layers: mapping is defined between all the data layers, and clustering is performed to group similar data layers. We evaluate the results of our methodology.
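
    The word-pair similarity step can be sketched with NLTK's WordNet interface. The set-based dissimilarity below is a simplified stand-in for the function derived in the paper, and it assumes the WordNet corpus has been downloaded.

      from nltk.corpus import wordnet as wn

      def word_similarity(w1, w2):
          """Best Wu-Palmer similarity over all synset pairs (0 if either word is unknown)."""
          scores = [s1.wup_similarity(s2) or 0.0
                    for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
          return max(scores, default=0.0)

      def layer_dissimilarity(tokens_a, tokens_b):
          """1 minus the mean best-match similarity between two sets of layer-name tokens."""
          if not tokens_a or not tokens_b:
              return 1.0
          best = [max(word_similarity(a, b) for b in tokens_b) for a in tokens_a]
          return 1.0 - sum(best) / len(best)

      # Example: two layer names that describe the same observed property.
      print(layer_dissimilarity(["air", "temperature"], ["atmospheric", "temperature"]))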

  10. Multiple on-line data collection and processing for radioimmunoassay using a micro-computer system.

    OpenAIRE

    Carter, N. W.; Davidson, D.; Lucas, D F; Griffiths, P.D.

    1980-01-01

    A micro-computer system is described which has been designed to perform on-line data capture from up to seven radioisotope counters of different types in parallel with interactive results processing and subsequent transmission to a laboratory computer-based data management system.

  11. Lightweight Vertical Take-Off & Landing Unmanned Aerial Systems For Local-Scale Forestry and Agriculture Remote Sensing Data Collection

    Science.gov (United States)

    Putman, E.; Sheridan, R.; Popescu, S. C.

    2015-12-01

    The evolution of lightweight Vertical Take-Off and Landing (VTOL) rotary Unmanned Aerial Vehicles (UAVs) and remote sensor technologies have provided researchers with the ability to integrate compact remote sensing systems with UAVs to create Unmanned Aerial Systems (UASs) capable of collecting high-resolution airborne remote sensing data. UASs offer a myriad of benefits. Some of the most notable include: (1) reduced operational cost; (2) reduced lead-time for mission planning; (3) high-resolution and high-density data collection; and (4) customization of data collection intervals to fit the needs of a specific project (i.e. acquiring data at hourly, daily, or weekly intervals). Such benefits allow researchers and natural resource managers to acquire airborne remote sensing data on local-scale phenomenon in ways that were previously cost-prohibitive. VTOL UASs also offer a stable platform capable of low speed low altitude flight over small spatial scales that do not require a dedicated runway. Such flight characteristics allow VTOL UASs to collect high-resolution data at very high densities, enabling the use of structure from motion (SFM) techniques to generate three-dimensional datasets from photographs. When combined, these characteristics make VTOL UASs ideal for collecting data over agricultural or forested research areas. The goal of this study is to provide an overview of several lightweight eight-rotor VTOL UASs designed for small-scale forest remote sensing data collection. Specific objectives include: (1) the independent integration of a lightweight multispectral camera, a lightweight scanning lidar sensor, with required components (i.e. IMU, GPS, data logger) and the UAV; (2) comparison of UAS-collected data to terrestrial lidar data and airborne multispectral and lidar data; (3) comparison of UAS SFM techniques to terrestrial lidar data; and (4) multi-temporal assessment of tree decay using terrestrial lidar and UAS SfM techniques.

  12. Tightly integrated single- and multi-crystal data collection strategy calculation and parallelized data processing in JBluIce beamline control system.

    Science.gov (United States)

    Pothineni, Sudhir Babu; Venugopalan, Nagarajan; Ogata, Craig M; Hilgart, Mark C; Stepanov, Sergey; Sanishvili, Ruslan; Becker, Michael; Winter, Graeme; Sauter, Nicholas K; Smith, Janet L; Fischetti, Robert F

    2014-12-01

    The calculation of single- and multi-crystal data collection strategies and a data processing pipeline have been tightly integrated into the macromolecular crystallographic data acquisition and beamline control software JBluIce. Both tasks employ wrapper scripts around existing crystallographic software. JBluIce executes scripts through a distributed resource management system to make efficient use of all available computing resources through parallel processing. The JBluIce single-crystal data collection strategy feature uses a choice of strategy programs to help users rank sample crystals and collect data. The strategy results can be conveniently exported to a data collection run. The JBluIce multi-crystal strategy feature calculates a collection strategy to optimize coverage of reciprocal space in cases where incomplete data are available from previous samples. The JBluIce data processing runs simultaneously with data collection using a choice of data reduction wrappers for integration and scaling of newly collected data, with an option for merging with pre-existing data. Data are processed separately if collected from multiple sites on a crystal or from multiple crystals, then scaled and merged. Results from all strategy and processing calculations are displayed in relevant tabs of JBluIce.

  13. From Automatic to Adaptive Data Acquisition

    DEFF Research Database (Denmark)

    Chang, Marcus

    2009-01-01

    Sensornets have been used for ecological monitoring for the past decade, yet the main driving force behind these deployments is still computer scientists. The denser sampling and added modalities offered by sensornets could drive these fields in new directions, but not until the domain scientists become familiar with sensornets and use them as any other instrument in their toolbox. We explore three different directions in which sensornets can become easier to deploy, collect data of higher quality, and offer more flexibility, and we postulate that sensornets should be instruments for domain scientists... To increase the flexibility of sensornets and reduce the complexity for the domain scientist, we developed an AI-based controller to act as a proxy between the scientist and the sensornet. This controller is driven by the scientist's requirements for the collected data, and uses adaptive sampling in order to reach these goals.
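
    The requirement-driven adaptive sampling idea can be illustrated with a toy controller that shortens the sampling interval when the signal changes quickly and relaxes it when the signal is quiet. The intervals and scaling rule below are illustrative, not those of the deployed controller.

      def next_interval(recent_values, base_interval_s=300, min_interval_s=30):
          """Choose the next sampling interval from the most recent measurements."""
          if len(recent_values) < 2:
              return base_interval_s
          change = abs(recent_values[-1] - recent_values[-2])
          spread = (max(recent_values) - min(recent_values)) or 1.0
          # Shrink the interval in proportion to how quickly the signal is moving.
          factor = max(0.1, 1.0 - change / spread)
          return max(min_interval_s, int(base_interval_s * factor))

      # Example: a sudden jump in the last reading triggers denser sampling.
      print(next_interval([12.1, 12.0, 12.2, 14.8]))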

  14. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems of agricultural robots used in field work. The existing solutions in this area have been analyzed. The article proposes an original solution, which is easy to implement and is characterized by high-accuracy positioning.

  15. Precision laser automatic tracking system.

    Science.gov (United States)

    Lucy, R F; Peters, C J; McGann, E J; Lang, K T

    1966-04-01

    A precision laser tracker has been constructed and tested that is capable of tracking a low-acceleration target to an accuracy of about 25 microrad root mean square. In tracking high-acceleration targets, the error is directly proportional to the angular acceleration. For an angular acceleration of 0.6 rad/s², the measured tracking error was about 0.1 mrad. The basic components in this tracker, similar in configuration to a heliostat, are a laser and an image dissector, which are mounted on a stationary frame, and a servocontrolled tracking mirror. The daytime sensitivity of this system is approximately 3 × 10⁻¹⁰ W/m²; the ultimate nighttime sensitivity is approximately 3 × 10⁻¹⁴ W/m². Experimental tests were performed to evaluate both dynamic characteristics of this system and the system sensitivity. Dynamic performance of the system was obtained, using a small rocket covered with retroreflective material launched at an acceleration of about 13 g at a point 204 m from the tracker. The daytime sensitivity of the system was checked, using an efficient retroreflector mounted on a light aircraft. This aircraft was tracked out to a maximum range of 15 km, which checked the daytime sensitivity of the system measured by other means. The system also has been used to passively track stars and the Echo I satellite. Also, the system passively tracked a +7.5 magnitude star, and the signal-to-noise ratio in this experiment indicates that it should be possible to track a +12.5 magnitude star.

  16. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850...

  17. Automatic systems win; Siegeszug der Automaten

    Energy Technology Data Exchange (ETDEWEB)

    Sorg, M

    2001-07-01

    This short article presents figures on the increasing use of modern, automatic wood-fired heating systems in Switzerland that are not only replacing older installations but also starting to replace other forms of heating. The increase of the number of wood-based heating systems installed and the amount of wood used in them is discussed, as are developments in the market for large-scale wood-based heating systems.

  18. Automatic Road Sign Inventory Using Mobile Mapping Systems

    Science.gov (United States)

    Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.

    2016-06-01

    The periodic inspection of certain infrastructure features plays a key role for road network safety and preservation, and for developing optimal maintenance planning that minimizes the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology in order to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As the point cloud resolution is insufficient, RGB imagery is used by projecting the 3D points into the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95%, and F-score greater than 90%. In this way, inventory data is obtained in a fast, reliable manner, and it can be applied to improve the maintenance planning of the road network, or to feed a Spatial Information System (SIS), so that road sign information is available for use in a Smart City context.
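
    The projection of detected 3D points into the synchronized RGB images can be written down with a standard pinhole camera model. In the sketch below the intrinsic matrix K and the pose (R, t) are assumed to come from the MMS calibration; they are placeholders here, not values from the paper.

      import numpy as np

      def project_points(points_xyz, K, R, t):
          """Project Nx3 world points to Nx2 pixel coordinates with a pinhole model."""
          cam = R @ points_xyz.T + t.reshape(3, 1)   # world -> camera frame
          pix = K @ cam                              # camera -> image plane
          return (pix[:2] / pix[2]).T                # perspective division

      def bounding_box(pixels):
          """Axis-aligned box around the projected sign points, for cropping the RGB data."""
          x_min, y_min = pixels.min(axis=0)
          x_max, y_max = pixels.max(axis=0)
          return x_min, y_min, x_max, y_max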

  19. Recent developments in the Los Alamos National Laboratory Plutonium Facility Waste Tracking System-automated data collection pilot project

    International Nuclear Information System (INIS)

    The waste management and environmental compliance group (NMT-7) at the Los Alamos National Laboratory has initiated a pilot project for demonstrating the feasibility and utility of automated data collection as a solution for tracking waste containers at the Los Alamos National Laboratory Plutonium Facility. This project, the Los Alamos Waste Tracking System (LAWTS), tracks waste containers during their lifecycle at the facility. LAWTS is a two-tiered system consisting of a server/workstation database and reporting engine and a hand-held data terminal-based client program for collecting data directly from tracked containers. New containers may be added to the system from either the client unit or from the server database. Once containers are in the system, they can be tracked through one of three primary transactions: Move, Inventory, and Shipment. Because LAWTS is a pilot project, it also serves as a learning experience for all parties involved. This paper will discuss many of the lessons learned in implementing a data collection system in the restricted environment. Specifically, the authors will discuss issues related to working with the PPT 4640 terminal system as the data collection unit. They will discuss problems with form factor (size, usability, etc.) as well as technical problems with wireless radio frequency functions. They will also discuss complications that arose from outdoor use of the terminal (barcode scanning failures, screen readability problems). The paper will conclude with a series of recommendations for proceeding with LAWTS based on experience to date

  20. Automatic Irrigation System using WSNs

    OpenAIRE

    Ravinder Singh Dhanoa1; Ravinder Singh

    2014-01-01

    During the entire project, I went through various electronic equipment. I learned about the 8051 controller, contact-type sensors, comparators, and a little about other electrical equipment. Irrigation systems are as old as mankind itself, since agriculture is the foremost occupation of civilized humanity. To irrigate large areas of plants is an onerous job. In order to overcome this problem, many irrigation scheduling techniques have been developed which are mainly based on mon...

  1. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides a basis for a pre-operational automatic pollen season monitoring performed in parallel to manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method regarding total pollen, and the automatic device provides real-time data reliably (one week of interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling rate of the automatic device makes it possible to go beyond the commonly presented daily values, and we obtain statistically significant hourly concentrations. Finally, we discuss remaining challenges for obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.

  2. 29 CFR 42.21 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Data collection. 42.21 Section 42.21 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.21 Data collection. (a) For each protective statute, ESA... completed based on complaints. (g) The National Committee shall review the data collection systems of...

  3. Hindi Digits Recognition System on Speech Data Collected in Different Natural Noise Environments

    Directory of Open Access Journals (Sweden)

    Babita Saxena

    2015-02-01

    Full Text Available This paper presents a baseline digits speech recognizer for Hindi language. The recording environment is different for all speakers, since the data is collected in their respective homes. The different environment refers to vehicle horn noises in some road-facing rooms, internal background noises in some rooms like opening doors, silence in some rooms, etc. All these recordings are used for training the acoustic model. The acoustic model is trained on 8 speakers' audio data. The vocabulary size of the recognizer is 10 words. HTK toolkit is used for building the acoustic model and evaluating the recognition rate of the recognizer. The efficiency of the recognizer developed on recorded data is shown at the end of the paper and possible directions for future research work are suggested.

  4. Automatic remote correcting system for MOOCS

    OpenAIRE

    Rochat, Pierre-Yves

    2014-01-01

    An automatic correcting system was designed to be able to correct the programming exercises during a Massive Open Online Course (MOOC) about microcontrollers, followed by thousands of students. Built around the MSP430G Launchpad, it has corrected more than 30'000 submissions in 7 weeks. This document provides general information about the system, the results obtained during a MOOC on the Coursera.org platform, extensions made for remote experiments, and future projects.

  5. Visual Algorithm Simulation Exercise System with Automatic Assessment: TRAKLA2

    Directory of Open Access Journals (Sweden)

    Lauri MALMI

    2004-10-01

    Full Text Available Interaction and feedback are key factors supporting the learning process. Therefore many automatic assessment and feedback systems have been developed for computer science courses during the past decade. In this paper we present a new framework, TRAKLA2, for building interactive algorithm simulation exercises. Exercises constructed in TRAKLA2 are viewed as learning objects in which students manipulate conceptual visualizations of data structures in order to simulate the working of given algorithms. The framework supports randomized input values for the assignments, as well as automatic feedback and grading of students' simulation sequences. Moreover, it supports automatic generation of model solutions as algorithm animations and the logging of statistical data about the interaction process as students solve exercises. The system has been used in two universities in Finland for several courses involving over 1000 students. Student response has been very positive.

  6. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic pipeline inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which controls the robot to navigate along the inspection path exactly. We expect our system can contribute to reduction of inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be practically used for inspection works after field tests. (author)

  7. [Automatic analysis pipeline of next-generation sequencing data].

    Science.gov (United States)

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although there are many software tools for analyzing next-generation sequencing data, most of them are designed for one specific function (e.g., alignment, variant calling or annotation). Therefore, it is necessary to combine them for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (fastq format) as input, calls the standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can further analyze. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.
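
    The pipeline itself is written in Perl and driven by SGE, so the fragment below only sketches the chaining idea in Python with placeholder file names; the exact tool options differ per project and per tool version.

      import subprocess

      def run(cmd):
          """Run one external pipeline step and fail loudly if it returns an error."""
          print("running:", " ".join(cmd))
          subprocess.run(cmd, check=True)

      def align_and_sort(reference, fastq1, fastq2, sample):
          """Map reads with BWA, then sort and index the alignment with samtools."""
          sam = f"{sample}.sam"
          bam = f"{sample}.sorted.bam"
          with open(sam, "w") as out:
              subprocess.run(["bwa", "mem", reference, fastq1, fastq2],
                             stdout=out, check=True)
          run(["samtools", "sort", "-o", bam, sam])
          run(["samtools", "index", bam])
          return bam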

  8. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has a capacity to recognize music genre.

  9. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through a search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different
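
    The association step the tool automates can be pictured as a 2x2 contingency test of SNP presence against drug response. The sketch below uses Fisher's exact test on made-up counts; it illustrates the kind of statistic involved, not the tool's internal implementation.

      from scipy.stats import fisher_exact

      #                 responders   non-responders
      table = [[12, 3],    # SNP present
               [5, 20]]    # SNP absent

      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")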

  10. Gamma-ray spectrometry data collection and reduction by simple computing systems.

    Science.gov (United States)

    Op de Beeck, J

    1975-12-01

    The review summarizes the present state of the involvement of relatively small computing devices in the collection and processing of gamma-ray spectrum data. An economic and utilitarian point of view has been chosen with regard to data collection in order to arrive at practically valuable conclusions in terms of feasibility of possible configurations with respect to their eventual application. A unified point of view has been adopted with regard to data processing by developing an information theoretical approach on a more or less intuitive level in an attempt to remove the largest part of the virtual disparity between the several processing methods described in the literature. A synoptical introduction to the most important mathematical methods has been incorporated, together with a detailed theoretical description of the concept gamma-ray spectrum. In accordance with modern requirements, the discussions are mainly oriented towards high-resolution semiconductor detector-type spectra. The critical evaluation of the processing methods reviewed is done with respect to a set of predefined criteria. Smoothing, peak detection, peak intensity determination, overlapping peak resolving and detection and upper limits are discussed in great detail. A preferred spectrum analysis method combining powerful data reduction properties with extreme simplicity and speed of operation is suggested. The general discussion is heavily oriented towards activation analysis application, but other disciplines making use of gamma-ray spectrometry will find the material presented equally useful. Final conclusions are given pointing to future developments and shifting their centre of gravity towards improving the quality of the measurements rather than expanding the use of tedious and sophisticated mathematical techniques requiring the limits of available computational power.
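
    The smoothing and peak-detection steps discussed above can be illustrated with generic SciPy routines on a synthetic spectrum; a real analysis would tune the filter width and prominence threshold to the detector resolution.

      import numpy as np
      from scipy.signal import savgol_filter, find_peaks

      channels = np.arange(4096)
      spectrum = np.random.poisson(10, channels.size).astype(float)      # flat background
      spectrum += 500 * np.exp(-((channels - 1332) / 3.0) ** 2)          # one synthetic peak

      smoothed = savgol_filter(spectrum, window_length=9, polyorder=3)   # noise smoothing
      peaks, properties = find_peaks(smoothed, prominence=50)            # peak detection
      print("peak channels:", peaks)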

  11. Automatic acquisition and classification system for agricultural network information based on Web data

    Institute of Scientific and Technical Information of China (English)

    段青玲; 魏芳芳; 张磊; 肖晓琰

    2016-01-01

    The purpose of this study is to obtain agricultural web information efficiently, and to provide users with personalized service through the integration of agricultural resources scattered in different sites and the fusion of heterogeneous environmental data. The research in this paper has improved some key information technologies, which are agricultural web data acquisition and extraction technologies, text classification based on support vector machine (SVM) and heterogeneous data collection based on the Internet of things (IOT). We first add quality target seed site into the system, and get website URL (uniform resource locator) and category information. The web crawler program can save original pages. The de-noised web page can be obtained through HTML parser and regular expressions, which create custom Node Filter objects. Therefore, the system builds a document object model (DOM) tree before digging out data area. According to filtering rules, the target data area can be identified from a plurality of data regions with repeated patterns. Next, the structured data can be extracted after property segmentation. Secondly, we construct linear SVM classification model, and realize agricultural text classification automatically. The procedures of our model include 4 steps. First of all, we use segment tool ICTCLAS to carry out the word segment and part-of-speech (POS) tagging, followed by combining agricultural key dictionary and document frequency adjustment rule to choose feature words, and building a feature vector and calculating inverse document frequency (IDF) weight value for feature words; lastly we design adaptive classifier of SVM algorithm. Finally, the perception data of different format collected by the sensor are transmitted to the designated server as the source data through the wireless sensor network. Relational database in accordance with specified acquisition frequency can be achieved through data conversion and data filtering. The key step of

  12. Solar energy collection system

    Science.gov (United States)

    Selcuk, M. K. (Inventor)

    1977-01-01

    An improved solar energy collection system, having enhanced energy collection and conversion capabilities, is delineated. The system is characterized by a plurality of receivers suspended above a heliostat field comprising a multiplicity of reflector surfaces, each being adapted to direct a concentrated beam of solar energy to illuminate a target surface for a given receiver. A magnitude of efficiency, suitable for effectively competing with systems employed in collecting and converting energy extracted from fossil fuels, is indicated.

  13. Automatic programmers for solid set sprinkler irrigation systems

    OpenAIRE

    Zapata Ruiz, Nery; Salvador Esteban, Raquel; Cavero Campo, José; Lecina Brau, Sergio; Playán Jubillar, Enrique

    2012-01-01

    Introduction: The application of new technologies to the control and automation of irrigation processes has become very important in the last decade. Although automation of irrigation execution (irrigation programmers) is now widespread, the automatic generation and execution of irrigation schedules is receiving growing attention due to the possibilities offered by the telemetry/remote control systems currently being installed in collective pressurized networks. In this paper, a protot...

  14. Automatic data processing of nondestructive testing results

    International Nuclear Information System (INIS)

    The ADP system for the documentation of inservice inspection results of nuclear power plants is described. The same system can be used during the whole operational life time of the plant. To make this possible the ADP system has to be independent of the type of hardware, data recording and software. The computer programs are made using Fortran IV programming language. The results of nondestructive testing are recorded in an inspection register by ADP methods. Different outputs can be utilized for planning, performance and reporting of inservice inspections. (author)

  15. Automatic TLI recognition system, user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report describes how to use an automatic target recognition system (version 14). In separate volumes are a general description of the ATR system, Automatic TLI Recognition System, General Description, and a programmer's manual, Automatic TLI Recognition System, Programmer's Guide.

  16. Challenges in Developing Data Collection Systems in a Rapidly Evolving Higher Education Environment

    Science.gov (United States)

    Borden, Victor M. H.; Calderon, Angel; Fourie, Neels; Lepori, Benedetto; Bonaccorsi, Andrea

    2013-01-01

    Efforts to develop common higher education data standards are expanding both within and across countries. The institutional research (IR) community plays a critical role in assuring that these efforts capture the diverse manifestations of the postsecondary and tertiary education systems and promote responsible comparisons. This chapter provides…

  17. Reliability data collection in the UK: the NCSR scheme

    International Nuclear Information System (INIS)

    Some of the general aspects of reliability data collection are discussed and illustrated with an example of a recent Safety and Reliability Systems (SRS) project concerned with high economic risk in the field of advanced manufacturing technology. Any data collection scheme must aim to provide the information required to enable the correct decisions to be taken in order to reach specified objectives. These objectives should be well defined and documented at the outset. In the current context they are improvements in the reliability of complex technical systems to improve economy of operation and/or safety. It may be for existing plant or for proposed systems in the design stage. The case study chosen contained examples of plant equipment in fuel reprocessing and active handling lines, for example robots, computer controlled cutting tools and automatic guided vehicles. New classification methods and the integration of several different computer applications packages were needed. (author)

  18. Ground-penetrating radar and differential global positioning system data collected from Long Beach Island, New Jersey, April 2015

    Science.gov (United States)

    Zaremba, Nicholas J.; Smith, Kathryn E.L.; Bishop, James M.; Smith, Christopher G.

    2016-08-04

    Scientists from the United States Geological Survey, St. Petersburg Coastal and Marine Science Center, U.S. Geological Survey Pacific Coastal and Marine Science Center, and students from the University of Hawaii at Manoa collected sediment cores, sediment surface grab samples, ground-penetrating radar (GPR) and Differential Global Positioning System (DGPS) data from within the Edwin B. Forsythe National Wildlife Refuge–Holgate Unit located on the southern end of Long Beach Island, New Jersey, in April 2015 (FAN 2015-611-FA). The study’s objective was to identify washover deposits in the stratigraphic record to aid in understanding barrier island evolution. This report is an archive of GPR and DGPS data collected from Long Beach Island in 2015. Data products, including raw GPR and processed DGPS data, elevation corrected GPR profiles, and accompanying Federal Geographic Data Committee metadata can be downloaded from the Data Downloads page.

  19. Should mortality data for the elderly be collected routinely in emergencies? The practical challenges of age-disaggregated surveillance systems.

    Science.gov (United States)

    du Cros, Philipp; Venis, Sarah; Karunakara, Unni

    2013-11-01

    Data on the elderly are rarely collected in humanitarian emergencies. During a refugee crisis in South Sudan, Médecins Sans Frontières developed a prospective mortality surveillance system collecting data for those aged ≥50 years and found that the elderly were dying at five times the rate of those aged 5-49 years. Practical and ethical issues arose. Were reported ages accurate? Since no baseline exists, what does the mortality rate mean? Should programmatic changes be made without evidence that these would reduce the elderly mortality rate? We outline issues to be addressed to enable informed decisions on response to elderly populations in emergency settings. PMID:24114674

  20. A Framework for Detecting Fraudulent Activities in EDO State Tax Collection System Using Investigative Data Mining

    Directory of Open Access Journals (Sweden)

    Okoro F. M

    2016-05-01

    Full Text Available Edo State Inland Revenue Services is overwhelmed with gigabytes of disk capacity containing data about taxpayers in the state. The data stored on the database increases in size at an alarming rate. This has resulted in a data-rich but information-poor situation where there is a widening gap between the explosive growth of data and its types, and the ability to analyze and interpret it effectively; hence the need for a new generation of automated and intelligent tools and techniques, known as investigative data mining, to look for patterns in data. These patterns can lead to new insights, competitive advantages for business, and tangible benefits for the State Revenue services. This research work focuses on designing an effective fraud detection and deterrence architecture using investigative data mining techniques. The proposed system architecture is designed to reason using an Artificial Neural Network and machine learning algorithms in order to detect and deter fraudulent activities. We recommend that the architectural framework be developed using Object-Oriented Programming and Agent-Oriented Programming languages.
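
    The neural-network classification idea can be illustrated with a generic scikit-learn model on made-up features; the actual architecture, features and training data of the proposed framework are not specified here in enough detail to reproduce.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 4))               # hypothetical taxpayer features
      y = (X[:, 0] + X[:, 3] > 1).astype(int)     # stand-in "fraudulent" labels

      model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
      model.fit(X, y)
      print("training accuracy:", model.score(X, y))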

  1. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    Science.gov (United States)

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.

  2. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system

  3. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 29.1329 Section 29... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and...

  4. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 27.1329 Section 27... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and...

  5. Kerman Photovoltaic Power Plant R&D data collection computer system operations and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, P.B.

    1994-06-01

    The Supervisory Control and Data Acquisition (SCADA) system at the Kerman PV Plant monitors 52 analog, 44 status, 13 control, and 4 accumulator data points in real-time. A Remote Terminal Unit (RTU) polls 7 peripheral data acquisition units that are distributed throughout the plant once every second, and stores all analog, status, and accumulator points that have changed since the last scan. The R&D Computer, which is connected to the SCADA RTU via a RS-232 serial link, polls the RTU once every 5-7 seconds and records any values that have changed since the last scan. A SCADA software package called RealFlex runs on the R&D computer and stores all updated data values taken from the RTU, along with a time-stamp for each, in a historical real-time database. From this database, averages of all analog data points and snapshots of all status points are generated every 10 minutes and appended to a daily file. These files are downloaded via modem by PVUSA/Davis staff every day, and the data is placed into the PVUSA database.
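
    The 10-minute averaging step that feeds the daily files can be sketched independently of RealFlex; the sample timestamps, values and file naming below are assumptions for illustration only.

```python
# Minimal sketch: average analog samples into 10-minute bins and append them
# to a daily file, mimicking the post-processing described above.
from collections import defaultdict
from datetime import datetime

# (timestamp, value) pairs as they might come out of the historical database
samples = [
    ("1994-06-01 10:00:03", 412.7),
    ("1994-06-01 10:04:55", 415.2),
    ("1994-06-01 10:11:40", 401.9),
]

bins = defaultdict(list)
for stamp, value in samples:
    t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
    bin_start = t.replace(minute=(t.minute // 10) * 10, second=0)
    bins[bin_start].append(value)

with open("1994-06-01.dat", "a") as daily_file:
    for bin_start in sorted(bins):
        avg = sum(bins[bin_start]) / len(bins[bin_start])
        daily_file.write(f"{bin_start.isoformat()} {avg:.2f}\n")
```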

  6. An Automatic Indirect Immunofluorescence Cell Segmentation System

    Directory of Open Access Journals (Sweden)

    Yung-Kuan Chan

    2014-01-01

    Full Text Available Indirect immunofluorescence (IIF) with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA) in systemic autoimmune diseases. The ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection for fluorescence patterns in an IIF image can assist physicians, without relevant experience, in making correct diagnosis. How to segment the cells from an IIF image is essential in developing an automatic inspection system for ANA testing. This paper focuses on the cell detection and segmentation; an efficient method is proposed for automatically detecting the cells with fluorescence pattern in an IIF image. Cell culture is a process in which cells grow under control. Cell counting technology plays an important role in measuring the cell density in a culture tank. Moreover, assessing medium suitability, determining population doubling times, and monitoring cell growth in cultures all require a means of quantifying cell population. The proposed method also can be used to count the cells from an image taken under a fluorescence microscope.

  7. Recent advances in automatic alignment system for the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, K; Awwal, A; Kalantar, D; Leach, R; Lowe-Webb, R; McGuigan, D; Kamm, V

    2010-12-08

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at the target chamber in less than 50 minutes. The system automatically commands 9,000 stepping motors to adjust mirrors and other optics based upon images acquired from high-resolution digital cameras viewing beams at various locations. Forty-five control loops per beamline request image processing services running on a LINUX cluster to analyze these images of the beams and references, and automatically steer the beams toward the target. This paper discusses the upgrades to the NIF automatic alignment system to handle new alignment needs and evolving requirements as related to the various types of experiments performed. As NIF becomes a continuously operated system and more experiments are performed, performance monitoring is increasingly important for maintenance and commissioning work. Data collected during operations are analyzed for tuning of the laser and for targeting maintenance work. Handling evolving alignment and maintenance needs is expected for the planned 30-year operational life of NIF.

  8. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
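
    The calculation behind the first exercise, a linear calibration followed by simple quality parameters of the method, can be reproduced with a few lines of numpy; the calibration data and the 3·s/slope detection-limit convention below are illustrative assumptions, not the values used in the Goodle GMS exercises.

```python
# Minimal sketch of a calibration-line fit and simple quality parameters.
# The calibration data are synthetic.
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])         # standard concentrations
signal = np.array([0.02, 0.21, 0.40, 0.82, 1.61])  # instrument response

slope, intercept = np.polyfit(conc, signal, 1)
predicted = slope * conc + intercept
residuals = signal - predicted

# Correlation coefficient and residual standard deviation
r = np.corrcoef(conc, signal)[0, 1]
s_yx = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))

# Detection limit estimated as 3*s_yx / slope (one common convention)
lod = 3 * s_yx / slope

print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.5f}, LOD={lod:.3f}")
```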

  9. Towards Automatic Capturing of Manual Data Processing Provenance

    OpenAIRE

    Wombacher, Andreas; Huq, Mohammad R.

    2011-01-01

    Often data processing is not implemented by a workflow system or an integration application but is performed manually by humans along the lines of a more or less specified procedure. Collecting provenance information during manual data processing cannot be automated. Further, manual collection of provenance information is error prone and time consuming. Therefore, we propose to infer provenance information based on the read and write access of users. The derived provenance information is comp...

  10. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    ... events. Due to great variation in events, this method often fails to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  11. Study of a method for automatic abnormal-value discrimination and error correction in real-time data acquisition systems

    Institute of Scientific and Technical Information of China (English)

    王永国; 胡娇娇; 季学枫

    2013-01-01

    With the progress of science and technology, the methods used to study the growth process of farmed pigs are becoming increasingly modern. Traditionally, pig growth research has mostly relied on manual data collection, which is not only laborious but also easily provokes a stress reaction in the pigs and thus affects their growth. With the application of various sensors and modern communication technology in the pig industry, data collection has become more scientific and convenient. However, because of the particular nature of the animals being measured, crowding, arching and colliding inevitably occur during data acquisition, so the collected data contain deviations that affect the subsequent analysis of pig growth traits; the data must therefore be corrected. This paper proposes combining a classical algorithm with a neural network method to automatically identify and correct the collected data. Matlab simulation and the practical application of the "9SC-05 automatic breeding and feed selection measuring equipment system for pigs" developed by Anhui Bodhi Fruit Company show that the method offers high error-correction accuracy, high speed and good adaptability.

  12. Intelligent information retrieval system using automatic thesaurus construction

    Science.gov (United States)

    Song, Wei; Yang, Jucheng; Li, Chenghua; Park, Sooncheol

    2011-05-01

    This paper presents an intelligent information retrieval (IR) system based on automatic thesaurus construction for its applications of document clustering and classification. These two applications are the most influential and widely used fields amongst the IR research community. We apply two biologically inspired algorithms, i.e. genetic algorithm (GA) and neural network (NN), to these two fields. A fuzzy logic controller GA and an adaptive back-propagation NN are proposed in our study, which can validly overcome the problems existing in their archetypes, e.g. slow evolution and being prone to trap into a local optimum. Furthermore, a well-constructed thesaurus has been recognised as a valuable tool in the effective operation of clustering and classification. It solves the problem in document representation organised by a bag of words, where some important relationships between words, e.g. synonymy and polysemy, are ignored. To investigate how our IR system could be used effectively, we conduct experiments on four data sets from the benchmark Reuter-21578 document collection and 20-newsgroup corpus. The results reveal that our IR system enhances the performance in comparison with k-means, common GA, and conventional back-propagation NN.

  13. Introduction to monitoring dynamic environmental phenomena of the world using satellite data collection systems, 1978

    Science.gov (United States)

    Carter, William Douglas; Paulson, R.W.

    1979-01-01

    The rapid development of satellite technology, especially in the area of radio transmission and imaging systems, makes it possible to monitor dynamic surface phenomena of the Earth in considerable detail. The monitoring systems that have been developed are compatible with standard monitoring systems such as snow, stream, and rain gages; wind, temperature and humidity measuring instruments; tiltmeters and seismic event counters. Supported by appropriate power, radios and antennae, remote stations can be left unattended for at least 1 year and consistently relay local information via polar orbiting or geostationary satellites. These data, in conjunction with timely Landsat images, can provide a basis for more accurate estimates on snowfall, water runoff, reservoir level changes, flooding, drought effects, and vegetation trends and may be of help in forecasting volcanic eruptions. These types of information are critical for resource inventory and development, especially in developing countries where remote regions are commonly difficult to access. This paper introduces the reader to the systems available, describes their features and limitations, and provides suggestions on how to employ them. An extensive bibliography is provided for those who wish more information.

  14. The house keeping data acquisition system in space detection

    International Nuclear Information System (INIS)

    The house keeping data acquisition system in space detection is introduced. It is based on the 80C196 microcontroller. The system can automatically collect data such as temperature, high-voltage power supply readings and event counts, and communicates the data over a 1553B bus interface.

  15. Automatic focusing system of BSST in Antarctic

    Science.gov (United States)

    Tang, Peng-Yi; Liu, Jia-Jing; Zhang, Guang-yu; Wang, Jian

    2015-10-01

    Automatic focusing (AF) technology plays an important role in modern astronomical telescopes. Based on the focusing requirements of the BSST (Bright Star Survey Telescope) in Antarctica, an AF system was set up. In this design, OpenCV functions are used to find stars, and a selectable focus metric based on star area, HFD or FWHM is computed. A curve-fitting method, applied as the camera is moved through focus, is used to find the best focus position. The design is suitable for unattended small telescopes.
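
    The curve-fitting step mentioned above can be illustrated simply: sweep the focuser, compute a star-size metric (here standing in for HFD or FWHM) at each position, and take the vertex of a fitted parabola as best focus. The positions and metric values below are invented for the sketch.

```python
# Minimal sketch of focus-curve fitting: best focus is taken as the vertex of
# a parabola fitted to (focuser position, star-size metric) samples.
# The sample values are synthetic.
import numpy as np

positions = np.array([1000, 1100, 1200, 1300, 1400, 1500], dtype=float)
hfd = np.array([6.1, 4.3, 3.0, 2.8, 3.9, 5.7])  # half-flux diameter in pixels

a, b, c = np.polyfit(positions, hfd, 2)  # hfd ~ a*p**2 + b*p + c
best_focus = -b / (2 * a)                # vertex of the parabola

print(f"estimated best focus position: {best_focus:.0f} steps")
```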

  16. Automatic Battery Swap System for Home Robots

    OpenAIRE

    Juan Wu; Guifang Qiao; Jian Ge; Hongtao Sun; Guangming Song

    2012-01-01

    This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off‐line recharging and on‐line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm‐sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap stati...

  17. Automatic system for detecting pornographic images

    Science.gov (United States)

    Ho, Kevin I. C.; Chen, Tung-Shou; Ho, Jun-Der

    2002-09-01

    Due to the dramatic growth of network and multimedia technology, people can more easily obtain a wide variety of information over the Internet. Unfortunately, it also makes the diffusion of illegal and harmful content much easier. It has therefore become an important topic for the Internet community to protect and safeguard Internet users, especially children, from content that may be encountered while surfing the Net. Among such content, pornographic images cause the most serious harm. Therefore, in this study, we propose an automatic system to detect still colour pornographic images. Starting from this result, we plan to develop an automatic system to search for or filter pornographic images. Almost all pornographic images share one common characteristic: the ratio of the size of the skin region to the non-skin region is high. Based on this characteristic, our system first converts the colour space from RGB to HSV so as to segment all the possible skin-colour regions from the scene background. We also apply texture analysis to the selected skin-colour regions to separate skin regions from non-skin regions. Then, we group the adjacent pixels located in skin regions. If the ratio is over a given threshold, the given image is classified as a possible pornographic image. Based on our experiments, less than 10% of non-pornographic images are classified as pornography, and over 80% of the most harmful pornographic images are classified correctly.
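
    The skin-ratio test at the core of the system can be sketched with OpenCV: convert the image to HSV, mask pixels falling inside a skin-colour range, and compare the skin fraction against a threshold. The HSV bounds and the decision threshold below are illustrative assumptions, and the sketch omits the texture analysis and region grouping described above.

```python
# Minimal sketch of the skin-region ratio test described above.
# HSV bounds and the decision threshold are assumptions for illustration.
import cv2
import numpy as np

def skin_ratio(image_bgr):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed skin-colour bounds
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    return np.count_nonzero(mask) / mask.size

def is_suspect(image_bgr, threshold=0.4):
    return skin_ratio(image_bgr) > threshold

image = cv2.imread("example.jpg")  # hypothetical input image
if image is not None:
    print("possible pornographic image" if is_suspect(image) else "clean")
```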

  18. Automatic Railway Power Line Extraction Using Mobile Laser Scanning Data

    Science.gov (United States)

    Zhang, Shanxin; Wang, Cheng; Yang, Zhuang; Chen, Yiping; Li, Jonathan

    2016-06-01

    Research on power line extraction technology using mobile laser point clouds has important practical significance for railway power line patrol work. In this paper, we present a new method for automatically extracting railway power lines from MLS (Mobile Laser Scanning) data. Firstly, the significant data are segmented piecewise according to the spatial structure characteristics of the power line and the trajectory. Then, a self-adaptive spatial region-growing method is used to extract power lines parallel to the rails. Finally, PCA (Principal Components Analysis) combined with information entropy theory is used to judge whether a section of the power line is a junction and, if so, which type of junction it belongs to. The least-squares fitting algorithm is introduced to model the power line. An evaluation of the proposed method over a complicated railway point cloud acquired by a RIEGL VMX450 MLS system shows that the proposed method is promising.
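
    The PCA-plus-entropy judgement on a power-line section can be sketched as follows: compute the covariance eigenvalues of the section's points, normalize them, and use their entropy as a linearity measure (low for a single straight wire, higher where wires branch at a junction). The entropy threshold below is an illustrative assumption, not the authors' value.

```python
# Minimal sketch of using PCA eigenvalues and information entropy to decide
# whether a section of power-line points looks like one straight wire or a junction.
import numpy as np

def eigen_entropy(points):
    """points: (N, 3) array of power-line points in one section."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    p = eigvals / eigvals.sum()
    p = np.clip(p, 1e-12, None)  # avoid log(0)
    return -np.sum(p * np.log(p)), p

# A nearly straight section: points scattered along one axis (synthetic data).
rng = np.random.default_rng(0)
line = np.column_stack([np.linspace(0, 10, 200),
                        rng.normal(0, 0.02, 200),
                        rng.normal(0, 0.02, 200)])

entropy, shares = eigen_entropy(line)
is_junction = entropy > 0.3  # assumed threshold
print(f"entropy={entropy:.3f}, largest eigenvalue share={shares[0]:.3f}, "
      f"junction={is_junction}")
```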

  19. Web Service Interface for Data Collection

    Directory of Open Access Journals (Sweden)

    Ruchika

    2012-05-01

    Full Text Available Data collection is a key component of an information system. The widespread penetration of ICT tools in organizations and institutions has resulted in a shift in the way the data is collected. Data may be collected in printed-form, by e-mails, on a compact disk, or, by direct upload on the management information system. Since web services are platform-independent, it can access data stored in the XML format from any platform. In this paper, we present an interface which uses web services for data collection. It requires interaction between a web service deployed for the purposes of data collection, and the web address where the data is stored. Our interface requires that the web service has pre-knowledge of the address from where the data is to be collected. Also, the data to be accessed must be stored in XML format. Since our interface uses computer-supported interaction on both sides, it eases the task of regular and ongoing data collection. We apply our framework to the Education Management Information System, which collects data from schools spread across the country.

  20. Development of optical automatic positioning and wafer defect detection system

    International Nuclear Information System (INIS)

    The data of a wafer with defects can provide engineers with very important information and clues to improve the yield rate and quality in manufacturing. This paper presents a microscope automatic positioning and wafer detection system with human-machine interface based on image processing and fuzzy inference algorithms. In the proposed system, a XY table is used to move the position of each die on 6 inch or 8 inch wafers. Then, a high-resolution CCD and one set of two-axis optical linear encoder are used to accurately measure the position on the wafer. Finally, the developed human-machine interface is used to display the current position of an actual wafer in order to complete automatic positioning, and a wafer map database can be created. In the process of defect detection, CCD is used for image processing, and during preprocessing, it is required to filter noise, acquire the defect characteristics, define the defective template, and then take the characteristic points of the defective template as the reference input for fuzzy inference. A high-accuracy optical automatic positioning and wafer defect detection system is thus constructed. This study focused on automatic detection of spots, scratches, and bruises, and attempted to reduce the time to detect defective die and improve the accuracy of determining the defects of semiconductor devices. (paper)

  1. Application of Data Transmission Technology in the Automatic Fire Alarm System

    Institute of Scientific and Technical Information of China (English)

    梁鸿

    2011-01-01

    At present, the degree of damage and the scope of harm caused by fires are on the rise. In order to reduce fire losses, countries around the world are actively developing automatic fire alarm systems. Data transmission technology is a very important technology in automatic fire alarm systems. This article introduces the technology in the context of automatic fire alarm systems, covering data transmission modes, serial/parallel communication, simplex/duplex data transmission, data synchronization methods and digital coding methods.

  2. Water Column Sonar Data Collection

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The collection and analysis of water column sonar data is a relatively new avenue of research into the marine environment. Primary uses include assessing biological...

  3. Distributed privacy preserving data collection

    KAUST Repository

    Xue, Mingqiang

    2011-01-01

    We study the distributed privacy preserving data collection problem: an untrusted data collector (e.g., a medical research institute) wishes to collect data (e.g., medical records) from a group of respondents (e.g., patients). Each respondent owns a multi-attributed record which contains both non-sensitive (e.g., quasi-identifiers) and sensitive information (e.g., a particular disease), and submits it to the data collector. Assuming T is the table formed by all the respondent data records, we say that the data collection process is privacy preserving if it allows the data collector to obtain a k-anonymized or l-diversified version of T without revealing the original records to the adversary. We propose a distributed data collection protocol that outputs an anonymized table by generalization of quasi-identifier attributes. The protocol employs cryptographic techniques such as homomorphic encryption, private information retrieval and secure multiparty computation to ensure the privacy goal in the process of data collection. Meanwhile, the protocol is designed to leak limited but non-critical information to achieve practicability and efficiency. Experiments show that the utility of the anonymized table derived by our protocol is in par with the utility achieved by traditional anonymization techniques. © 2011 Springer-Verlag.
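
    The end product of the protocol, a table whose quasi-identifier attributes have been generalized until every combination appears at least k times, can be illustrated without the cryptographic machinery; the attributes, generalization rules and records below are synthetic, and the sketch deliberately omits the privacy-preserving protocol itself.

```python
# Minimal sketch of quasi-identifier generalization toward k-anonymity.
# Attributes and data are synthetic; the cryptographic collection protocol is omitted.
from collections import Counter

def generalize(record):
    age, zipcode, disease = record
    age_band = f"{(age // 10) * 10}-{(age // 10) * 10 + 9}"  # coarsen age
    zip_prefix = zipcode[:3] + "**"                          # coarsen zipcode
    return (age_band, zip_prefix, disease)

records = [
    (34, "10001", "flu"),
    (37, "10002", "diabetes"),
    (35, "10008", "flu"),
    (52, "20005", "asthma"),
    (58, "20007", "flu"),
]

anonymized = [generalize(r) for r in records]
counts = Counter((age, z) for age, z, _ in anonymized)
k = min(counts.values())
print(f"table is {k}-anonymous on (age band, zip prefix)")
for row in anonymized:
    print(row)
```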

  4. 24 CFR 902.60 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Data collection. 902.60 Section 902.60 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.60 Data collection. (a) Fiscal Year reporting...

  5. 20 CFR 653.109 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Data collection. 653.109 Section 653.109 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR SERVICES OF THE EMPLOYMENT SERVICE SYSTEM Services for Migrant and Seasonal Farmworkers (MSFWs) § 653.109 Data collection....

  6. 34 CFR 303.540 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.540 Section 303.540 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... DISABILITIES State Administration Reporting Requirements § 303.540 Data collection. (a) Each system...

  7. Modeling of a data exchange process in the Automatic Process Control System on the base of the universal SCADA-system

    Science.gov (United States)

    Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.

    2016-04-01

    In the present paper the authors discuss some ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one of the ways of solving this problem is the integrated modernization of power engineering objects at mechanical engineering companies, which should be intended to increase the efficiency of energy supply control and to improve the commercial accounting of electric energy. The authors have proposed the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we have built a mathematical model of the data exchange process between measuring transformers and a universal SCADA-system. The results of modeling show that the discussed equipment meets the requirements of the Standard and that the use of the universal SCADA-system for these purposes is preferable and economically reasonable. In the modeling the authors have used the following software: MasterScada, Master OPC_DI_61850, OPNET.

  8. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision-making about the presence of hidden explosives. Innovative algorithms for likely-background calculation, a neural-network-based system for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data elaboration.

  9. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

    A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a comput...

  10. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    In the scope of fabrication and in-service inspection of the PWR steam generator tubing bundle, FRAMATOME developed an automatic eddy current testing system: TS600. Based on a mini-computer, TS600 allows data to be digitized, stored and processed in various ways, so it is possible to perform several kinds of inspection: conventional in-service inspection, roll area profilometry... TS600 can also be used to develop new methods of examination

  11. Time Synchronization Module for Automatic Identification System

    Institute of Scientific and Technical Information of China (English)

    Choi Il-heung; Oh Sang-heon; Choi Dae-soo; Park Chan-sik; Hwang Dong-hwan; Lee Sang-jeong

    2003-01-01

    This paper proposes a design and implementation procedure for the Time Synchronization Module (TSM) of the Automatic Identification System (AIS). The proposed TSM uses a Temperature Compensated Crystal Oscillator (TCXO) as a local reference clock, and consists of a Digitally Controlled Oscillator (DCO), a divider, a phase discriminator, and register blocks. The TSM measures the time difference between the 1 PPS from the Global Navigation Satellite System (GNSS) receiver and the generated transmitter clock. The measured time difference is compensated by controlling the DCO, and the transmit clock is synchronized to Coordinated Universal Time (UTC). The designed TSM can also be synchronized to the reference time derived from the received message. The proposed module is tested using an experimental AIS transponder set. The experimental results show that the proposed module satisfies the functional and timing specifications of the AIS technical standard, ITU-R M.1371.
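
    The compensation loop described above, measuring the offset between the locally generated clock and the GNSS 1 PPS and trimming the DCO to drive it to zero, behaves like a small discrete feedback controller; the proportional and integral gains and the simulated TCXO drift below are made-up values for illustration.

```python
# Minimal sketch of the TSM idea: measure the phase error against the 1 PPS
# reference once per second and steer the DCO with a PI correction.
# Gains and drift are assumptions; the error peaks and then decays as the
# integral term learns the oscillator drift.
def simulate_sync(steps=12, kp=0.5, ki=0.1):
    oscillator_drift_ppm = 3.0  # assumed uncorrected TCXO frequency error
    phase_error_us = 0.0        # measured offset vs. the GNSS 1 PPS, in microseconds
    integral_ppm = 0.0          # integral term of the DCO correction

    for second in range(steps):
        correction_ppm = kp * phase_error_us + integral_ppm
        integral_ppm += ki * phase_error_us
        # Over one second, 1 ppm of residual frequency error adds 1 us of phase.
        phase_error_us += oscillator_drift_ppm - correction_ppm
        print(f"t={second + 1:2d}s  phase error = {phase_error_us:+7.3f} us")

simulate_sync()
```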

  12. Automatic TLI recognition system, programmer's guide

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report describes the software of an automatic target recognition system (version 14), from a programmer's point of view. The intent is to provide information that will help people who wish to modify the software. In separate volumes are a general description of the ATR system, Automatic TLI Recognition System, General Description, and a user's manual, Automatic TLI Recognition System, User's Guide. 2 refs.

  13. Wastewater Collection Systems.

    Science.gov (United States)

    Vallabhaneni, Srinivas

    2016-10-01

    This chapter presents a review of the literature published in 2015 on topics relating to wastewater collection systems. It presents noteworthy advances in research and industry experiences selected from major literature sources. This review is divided into the following sections: sewer system planning; sewer condition assessment/rehabilitation; pump stations/force mains/system design; operation and maintenance; asset management; and regulatory issues/integrated planning. PMID:27620080

  14. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    Full Text Available The watershed, as the basic territorial unit for the planning and management of water resources, requires proper delimitation of its catchment or drainage area. Faced with this situation, the lack of geographic information on the micro-watersheds of the Casacay river hydrographic unit needed to be resolved; for this purpose, the research was aimed at the automatic delimitation of micro-watersheds using Geographic Information System (GIS) techniques and 30-metre spatial resolution data from the Shuttle Radar Topography Mission (SRTM) project. The selected methodology was the Pfafstetter one, with which nine micro-watersheds were obtained with their respective codification, allowing continuation of the watershed standardization adopted by Ecuador's Water Secretariat. With the research results, the watersheds will be updated with more detailed information, promoting the execution of tasks or activities related to the integrated management of the hydrographic unit studied

  15. Quality assurance for screening mammography data collection systems in 22 countries.

    NARCIS (Netherlands)

    Klabunde, C.N.; Sancho-Garnier, H.; Broeders, M.E.A.C.; Thoresen, S.; Rodrigues, V.J.; Ballard-Barbash, R.

    2001-01-01

    OBJECTIVES: To document the mammography data that are gathered by the organized screening programs participating in the International Breast Cancer Screening Network (IBSN), the nature of their procedures for data quality assurance, and the measures used to assess program performance and impact. MET

  16. 76 FR 58301 - Proposed Extension of Existing Information Collection; Automatic Fire Sensor and Warning Device...

    Science.gov (United States)

    2011-09-20

    ... Sensor and Warning Device Systems; Examination and Test Requirements ACTION: Notice of request for public... Coal Mining. OMB 1219-0145 has been renamed Automatic Fire Sensor and Warning Device Systems... to a task in July 2011; OMB 1219-0073 subsumed Sec. 75.1103-5(a)(2)(ii) Automatic fire sensor...

  17. Automatic Laser Interferometer And Vision Measurement System For Stripe Rod Calibration

    Directory of Open Access Journals (Sweden)

    Zhao Min

    2015-12-01

    Full Text Available In order to calibrate the stripe precision of a leveling rod, an automatic laser interferometer and vision measurement system was designed by Xi'an University of Technology in China. The rod was driven under closed-loop control and the data were collected in the stopped state to ensure precision. The laser interferometer provided not only the long-distance data but also a position feedback signal in the automatic control loop. A CCD camera and a vision measurement method were used to inspect the stripe edge position and defects. A pixel-equivalent self-calibration method was designed to improve precision. An ROI (region of interest) method and an outline tracing method were designed to quickly extract multiple stripe edges. Combining the image data with the interferometer data reduces control difficulty and ensures the measurement accuracy. The vision measurement method reached sub-pixel precision and defective edges were reported. The system can automatically calibrate a stripe leveling rod with a high degree of efficiency and precision.

  18. AUTOMATICALLY CONVERTING TABULAR DATA TO RDF: AN ONTOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Kumar Sharma

    2015-07-01

    Full Text Available Information residing in relational databases and delimited file systems is inadequate for reuse and sharing over the web. These file systems do not adhere to commonly accepted principles for maintaining data harmony. For these reasons, such resources suffer from a lack of uniformity, from heterogeneity, and from redundancy throughout the web. Ontologies have been widely used for solving this type of problem, as they help in extracting knowledge out of any information system. In this article, we focus on extracting concepts and their relations from a set of CSV files. These files serve as individual concepts and are grouped into a particular domain, called the domain ontology. Furthermore, this domain ontology is used for capturing CSV data, which are represented in RDF format while retaining the links among files or concepts. Datatype and object properties are automatically detected from header fields. This reduces the amount of user involvement in generating mapping files. A detailed analysis has been performed on baseball tabular data, and the result shows a rich set of semantic information
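
    The header-driven mapping, deciding whether a column becomes a datatype property (a literal) or an object property (a link to another concept), can be sketched without any ontology library; the namespace, the sample rows and the convention that a column ending in "_id" denotes an object property are illustrative assumptions rather than the authors' exact rules.

```python
# Minimal sketch of converting one CSV file into RDF triples (Turtle syntax).
# Namespace, sample rows and the "_id means object property" rule are assumptions.
import csv
import io

EX = "http://example.org/baseball/"  # hypothetical ontology namespace

csv_text = """player,team_id,batting_average
Babe Ruth,NYY,0.342
Hank Aaron,ATL,0.305
"""

def to_turtle(text, concept):
    lines = [f"@prefix ex: <{EX}> ."]
    reader = csv.DictReader(io.StringIO(text))
    for i, row in enumerate(reader):
        subject = f"ex:{concept}_{i}"
        for header, value in row.items():
            if header.endswith("_id"):  # object property -> link to another resource
                obj = f"ex:{value}"
            else:                       # datatype property -> literal (typed if numeric)
                try:
                    float(value)
                    obj = f'"{value}"^^<http://www.w3.org/2001/XMLSchema#decimal>'
                except ValueError:
                    obj = f'"{value}"'
            lines.append(f"{subject} ex:{header} {obj} .")
    return "\n".join(lines)

print(to_turtle(csv_text, "Player"))
```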

  19. Data Collection and New Technology

    Directory of Open Access Journals (Sweden)

    Olubunmi Philip Aborisade

    2013-05-01

    Full Text Available The interview has become a popular method of data collection in qualitative research. This article examines the different interview methods for collecting data (e.g., structured interviews, group interviews, unstructured interviews), as well as the various methods for analyzing interview data (e.g., interpretivism, social anthropology, collaborative social research). It also evaluates the interview types and analysis methods in qualitative research and the new technologies for conducting interviews, such as e-mail, telephone, Skype, webcam and Facebook chat, to ascertain how they limit interviewees from giving a full picture of moral and ethical issues.

  20. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
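
    The general idea, deriving data structures and access code from one abstract model description so that parsing and validity-checking code is never written by hand, can be shown in miniature. The sketch below is not the Memops framework or its API; it is a toy generator over a hypothetical attribute specification.

```python
# Toy illustration of metadata-driven code generation (not the Memops API):
# a class with validating setters is generated from an abstract attribute
# specification, so no access code is written by hand.
MODEL = {
    "Peak": {"position": float, "height": float, "assignment": str},
}

def generate_class(name, attrs):
    lines = [f"class {name}:"]
    lines.append("    def __init__(self):")
    for attr in attrs:
        lines.append(f"        self._{attr} = None")
    for attr, typ in attrs.items():
        lines.append(f"    def set_{attr}(self, value):")
        lines.append(f"        if not isinstance(value, {typ.__name__}):")
        lines.append(f"            raise TypeError('{attr} must be {typ.__name__}')")
        lines.append(f"        self._{attr} = value")
        lines.append(f"    def get_{attr}(self):")
        lines.append(f"        return self._{attr}")
    return "\n".join(lines)

source = generate_class("Peak", MODEL["Peak"])
namespace = {}
exec(source, namespace)  # compile the generated accessor API on the fly
peak = namespace["Peak"]()
peak.set_position(7.25)
print(peak.get_position())
```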

  1. Data collection system. Volume 1, Overview and operators manual; Volume 2, Maintenance manual; Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, R.B.; Bauder, M.E.; Boyer, W.B.; French, R.E.; Isidoro, R.J.; Kaestner, P.C.; Perkins, W.G.

    1993-09-01

    Sandia National Laboratories (SNL) Instrumentation Development Department was tasked by the Defense Nuclear Agency (DNA) to record data on Tektronix RTD720 Digitizers on the HUNTERS TROPHY field test conducted at the Nevada Test Site (NTS) on September 18, 1992. This report contains an overview and description of the computer hardware and software that were used to acquire, reduce, and display the data. The document is divided into two volumes: an overview and operators manual (Volume 1) and a maintenance manual (Volume 2).

  2. Automatic data distribution for massively parallel processors

    OpenAIRE

    García Almiñana, Jordi

    1997-01-01

    Massively Parallel Processor systems provide the required computational power to solve most large-scale High Performance Computing applications. Machines with physically distributed memory offer a cost-effective way to achieve this performance; however, these systems are very difficult to program and tune. In a distributed-memory organization each processor has direct access to its local memory, and indirect access to the remote memories of other processors. But the cost of accessing a local m...

  3. Evaluation of a Teleform-based data collection system: A multi-center obesity research case study

    Science.gov (United States)

    Jenkins, Todd M.; Boyce, Tawny Wilson; Akers, Rachel; Andringa, Jennifer; Liu, Yanhong; Miller, Rosemary; Powers, Carolyn; Buncher, C. Ralph

    2016-01-01

    Utilizing electronic data capture (EDC) systems in data collection and management allows automated validation programs to preemptively identify and correct data errors. For our multi-center, prospective study we chose to use TeleForm, a paper-based data capture software that uses recognition technology to create case report forms (CRFs) with similar functionality to EDC, including custom scripts to identify entry errors. We quantified the accuracy of the optimized system through a data audit of CRFs and the study database, examining selected critical variables for all subjects in the study, as well as an audit of all variables for 25 randomly selected subjects. Overall we found 6.7 errors per 10,000 fields, with similar estimates for critical (6.9/10,000) and non-critical (6.5/10,000) variables – values that fall below the acceptable quality threshold of 50 errors per 10,000 established by the Society for Clinical Data Management. However, error rates were found to widely vary by type of data field, with the highest rate observed with open text fields. PMID:24709056

  4. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    Science.gov (United States)

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  5. An efficient automatic firearm identification system

    Science.gov (United States)

    Chuan, Zun Liang; Liong, Choong-Yeun; Jemain, Abdul Aziz; Ghani, Nor Azura Md.

    2014-06-01

    Automatic firearm identification system (AFIS) is highly demanded in forensic ballistics to replace the traditional approach which uses comparison microscope and is relatively complex and time consuming. Thus, several AFIS have been developed for commercial and testing purposes. However, those AFIS are still unable to overcome some of the drawbacks of the traditional firearm identification approach. The goal of this study is to introduce another efficient and effective AFIS. A total of 747 firing pin impression images captured from five different pistols of same make and model are used to evaluate the proposed AFIS. It was demonstrated that the proposed AFIS is capable of producing firearm identification accuracy rate of over 95.0% with an execution time of less than 0.35 seconds per image.

  6. Platform attitude data acquisition system

    Digital Repository Service at National Institute of Oceanography (India)

    Afzulpurkar, S.

    A system for automatic acquisition of underwater platform attitude data has been designed, developed and tested in the laboratory. This is a microcontroller-based system interfacing a dual-axis inclinometer, a high-resolution digital compass...

  7. 40 CFR 141.533 - What data must my system collect to calculate a disinfection profile?

    Science.gov (United States)

    2010-07-01

    ... system uses chlorine, the pH of the disinfected water at each residual disinfectant concentration... calculate a disinfection profile? 141.533 Section 141.533 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS...

  8. Advanced Data Collection for Inventory Management

    Science.gov (United States)

    Opresko, G. A.; Leet, J. H.; Mcgrath, D. F.; Eidson, J.

    1987-01-01

    Bar-coding, radio-frequency, and voice-operated systems selected. Report discusses study of state-of-the-art in automated collection of data for management of large inventories. Study included comprehensive search of literature on data collection and inventory management, visits to existing automated inventory systems, and tours of selected supply and transportation facilities at Kennedy Space Center. Information collected analyzed in view of needs of conceptual inventory-management systems for Kennedy Space Center and for manned space station and other future space projects.

  9. Human-system Interfaces for Automatic Systems

    Energy Technology Data Exchange (ETDEWEB)

    OHara, J.M.; Higgins,J. (BNL); Fleger, S.; Barnes V. (NRC)

    2010-11-07

    Automation is ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Automation is applied to a wide range of functions including monitoring and detection, situation assessment, response planning, and response implementation. Automation has become a 'team player' supporting personnel in nearly all aspects of system operation. In light of its increasing use and importance in new and future plants, guidance is needed to conduct safety reviews of the operator's interface with automation. The objective of this research was to develop such guidance. We first characterized the important HFE aspects of automation, including six dimensions: levels, functions, processes, modes, flexibility, and reliability. Next, we reviewed literature on the effects of all of these aspects of automation on human performance, and on the design of human-system interfaces (HSIs). Then, we used this technical basis established from the literature to identify general principles for human-automation interaction and to develop review guidelines. The guidelines consist of the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, our study identified several topics for additional research.

  10. Automatic Generation of Symbolic Model for Parameterized Synchronous Systems

    Institute of Scientific and Technical Information of China (English)

    Wei-Wen Xu

    2004-01-01

    With the purpose of making the verification of parameterized system more general and easier, in this paper, a new and intuitive language PSL (Parameterized-system Specification Language) is proposed to specify a class of parameterized synchronous systems. From a PSL script, an automatic method is proposed to generate a constraint-based symbolic model. The model can concisely symbolically represent the collections of global states by counting the number of processes in a given state. Moreover, a theorem has been proved that there is a simulation relation between the original system and its symbolic model. Since the abstract and symbolic techniques are exploited in the symbolic model, state-explosion problem in traditional verification methods is efficiently avoided. Based on the proposed symbolic model, a reachability analysis procedure is implemented using ANSI C++ on UNIX platform. Thus, a complete tool for verifying the parameterized synchronous systems is obtained and tested for some cases. The experimental results show that the method is satisfactory.

  11. Automation of plasma-process fultext bibliography databases. An on-line data-collection, data-mining and data-input system

    International Nuclear Information System (INIS)

    Searching for relevant data, information retrieval, data extraction and data input are time- and resource-consuming activities in most data centers. Here we develop a Linux system automating the process in case of bibliography, abstract and fulltext databases. The present system is an open-source free-software low-cost solution that connects the target and provider databases in cyberspace through various web publishing formats. The abstract/fulltext relevance assessment is interfaced to external software modules. (author)

  12. Automatic Data for Applied Railway Management

    OpenAIRE

    Frumin, Michael; Zhao, Jinhua; Zhao, Zhan; Wilson, Nigel H. M.

    2013-01-01

    In 2009, London Overground management implemented a new tactical plan for a.m. and p.m. peak service on the North London Line (NLL). This paper documents that tactical planning intervention and evaluates its outcomes in terms of certain aspects of service delivery (the operator's perspective on system performance) and service quality (the passenger's perspective). Analyses of service delivery and quality and of passenger demand contributed to the development, proposal, and implementation of t...

  13. Reliability of the TJ-II Power Supply System: collection and analysis of the operational experience data

    International Nuclear Information System (INIS)

    An effort to develop a fusion specific component reliability database is being carried out by international organizations such as EURATOM and the International Energy Agency (IAE). Moreover, several fusion related devices are involved in the collection of operational experience. In this frame, TJ-II is performing a reliability assessment of its main systems initiating the evaluation process with the Power Supply System (PSS). The PSS has been chosen because it is one of the most critical systems in TJ-II. During a TJ-II pulse, the provision of magnetic fields requires a total amount of power of almost 50 MW. Such amount of power is supplied by the national grid and a 140 MVA flywheel generator (100 MJ, 15 kV 100 Hz). Since the TJ-II loads require direct current, a set of thyristor converters, transformers and circuit breakers for each coil system are being used. Power requirements range from 5 kA (100 V) to 32 kA (1000 V). Also the heating systems (ECRH and NBI) load the PSS with additional 15 MVA. Failure data of these main components and components from auxiliary systems (rectifiers cooling, uninterrupted power supply system, control system ...) have been collected from PSS operation notes and personnel interviewing. Data related to general operation, campaign schedules, and maintenance have been collected from TJ-II engineering annotations, TJ-II web based electronic-board and TJ-II campaign archives. A database with date, time, pulse number, failure description, failure mode and other related information has been implemented. Failures and malfunctions in the PSS have been identified and processed, including information on failure modes and, where possible, causes of the failures. About 1700 failures and malfunctions have been identified in the period May 1998 - Dec 2004 (1309 of them in operational days and 381 during tests or maintenance task). Most malfunctions come from spurious signals over the circuit breaker and from the generation system. Main

  14. QuaDoSta - a freely configurable system which facilitates multi-centric data collection for healthcare and medical research

    Directory of Open Access Journals (Sweden)

    Albrecht, Ulrike

    2007-07-01

    Full Text Available This article describes QuaDoSta (quality assurance, documentation and statistics), a flexible documentation system as well as a data collection and networking platform for medical facilities. The user can freely define the required documentation masks, which are easily expandable and can be adapted to individual requirements without the need for additional programming. To avoid duplication, data transfer interfaces can be configured flexibly to external sources such as patient management systems used in surgeries or hospital information systems. The projects EvaMed (Evaluation Anthroposophical Medicine) and the Network Oncology are two scientific research projects which have been successfully established as nationally active networks on the basis of QuaDoSta. The EvaMed-Network serves as a modern pharmacovigilance project for the documentation of adverse drug events. All prescription data are electronically recorded to assess the relative risk of drugs. The Network Oncology was set up as a documentation system in four hospitals and seven specialist oncology practices where a complete record of all oncological therapies is being carried out to uniform standards on the basis of the ‘basic documentation for tumour patients’ (BDT) developed by the German Cancer Society. The QuaDoSta solution system made it possible to cater for the specific requirements of the presented projects. The following features of the system proved to be highly advantageous: flexible setup of catalogues and user friendly customisation and extensions, complete dissociation of system setup and documentation content, multi-centre networkability, and configurable data transfer interfaces.

  15. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  16. 14 CFR 171.267 - Glide path automatic monitor system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Glide path automatic monitor system. 171... Landing System (ISMLS) § 171.267 Glide path automatic monitor system. (a) The ISMLS glide path equipment... control points when any of the following occurs: (1) A shift of the mean ISMLS glide path angle...

  17. System for Automatic Generation of Examination Papers in Discrete Mathematics

    Science.gov (United States)

    Fridenfalk, Mikael

    2013-01-01

    A system was developed for automatic generation of problems and solutions for examinations in a university distance course in discrete mathematics and tested in a pilot experiment involving 200 students. Considering the success of such systems in the past, particularly including automatic assessment, it should not take long before such systems are…

  18. 2013 International Conference on Mechatronics and Automatic Control Systems

    CERN Document Server

    2014-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanical manufacturing systems and automation. The book presents papers from the 2013 International Conference on Mechatronics and Automatic Control Systems held in Hangzhou, China on August 10-11, 2013.

  19. Global synthesis and critical evaluation of pharmaceutical data sets collected from river systems.

    Science.gov (United States)

    Hughes, Stephen R; Kay, Paul; Brown, Lee E

    2013-01-15

    Pharmaceuticals have emerged as a major group of environmental contaminants over the past decade but relatively little is known about their occurrence in freshwaters compared to other pollutants. We present a global-scale analysis of the presence of 203 pharmaceuticals across 41 countries and show that contamination is extensive due to widespread consumption and subsequent disposal to rivers. There are clear regional biases in current understanding with little work outside North America, Europe, and China, and no work within Africa. Within individual countries, research is biased around a small number of populated provinces/states and the majority of research effort has focused upon just 14 compounds. Most research has adopted sampling techniques that are unlikely to provide reliable and representative data. This analysis highlights locations where concentrations of antibiotics, cardiovascular drugs, painkillers, contrast media, and antiepileptic drugs have been recorded well above thresholds known to cause toxic effects in aquatic biota. Studies of pharmaceutical occurrence and effects need to be seen as a global research priority due to increasing consumption, particularly among societies with aging populations. Researchers in all fields of environmental management need to work together more effectively to identify high risk compounds, improve the reliability and coverage of future monitoring studies, and develop new mitigation measures.

  20. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Science.gov (United States)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information on the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the Earth's surface. Classification of LiDAR data to extract ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of different derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because manual processing is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to segment the entire point cloud preliminarily into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed to segment the uniform surfaces into building roofs, roads and ground; this second phase of classification is also based on topological relationships and height variation analysis. The proposed approach has been tested using two areas: the first is a housing complex and the second is a primary school. The proposed approach led to successful classification results for the building, vegetation and road classes.
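
    The height-variation analysis driving the first classification pass can be sketched with a simple horizontal grid: in each cell, the spread between the lowest and highest return separates flat uniform surfaces (ground, roads, roofs) from rough non-uniform ones (vegetation). The cell size and threshold below are illustrative assumptions, and the real method adds topological analysis on top.

```python
# Minimal sketch of height-variation analysis on a LiDAR point cloud:
# points are binned into a horizontal grid and each cell is labelled
# 'uniform surface' or 'non-uniform' from its height spread.
import numpy as np

def label_cells(points, cell=2.0, max_spread=0.5):
    """points: (N, 3) array of x, y, z coordinates in metres."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    labels = {}
    for row in np.unique(keys, axis=0):
        key = (int(row[0]), int(row[1]))
        in_cell = points[(keys[:, 0] == row[0]) & (keys[:, 1] == row[1])]
        spread = in_cell[:, 2].max() - in_cell[:, 2].min()
        labels[key] = "uniform" if spread < max_spread else "non-uniform"
    return labels

# Synthetic test data: a flat road patch and a rough tree canopy patch.
rng = np.random.default_rng(1)
road = np.column_stack([rng.uniform(0, 2, 50), rng.uniform(0, 2, 50),
                        rng.normal(100.0, 0.02, 50)])
tree = np.column_stack([rng.uniform(10, 12, 50), rng.uniform(10, 12, 50),
                        rng.uniform(100.0, 108.0, 50)])
print(label_cells(np.vstack([road, tree])))
```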

  1. The BENTO Box: Development and field-testing of a new satellite-linked data collection system for multiparameter volcano monitoring

    Science.gov (United States)

    Roman, D. C.; Behar, A.; Elkins-Tanton, L. T.

    2014-12-01

    Predicting volcanic activity requires continuous monitoring for signals of magmatic unrest in harsh, often remote environments. BENTO is a next-generation monitoring system, currently in prototype testing, that is highly portable, low-cost, rapidly deployable, and entirely autonomous. Such a system could be used to provide critical monitoring and data collection capabilities during rapid-onset eruptions, or to provide a crude baseline monitor at large numbers of remote volcanoes to 'flag' the onset of unrest so that costlier resources such as specialized instrumentation can be deployed in the appropriate place at the appropriate time. The BENTO 1 (low-rate data) prototype currently comprises off-the-shelf volcanic gas sensors (SO2, CO2, F, Cl, and Br), a weather station (temperature, wind speed, wind direction, rainfall, humidity, pressure), and telemetry via an Iridium modem. In baseline mode, BENTO 1 takes a measurement from all of its sensors every two hours and automatically sends the measurements through Iridium to a server that posts them to a dedicated and customizable web page. The measurement interval and other sensor parameters (pumping time, sensor constants) can be adjusted directly or remotely (through the Iridium network) as needed. Currently, BENTO 1 is deployed at Mt. Etna, Italy; Telica Volcano, Nicaragua; Hengill Volcano, Iceland; and Hekla Volcano, Iceland. The BENTO 2 (high-rate) system is motivated by a need to avoid telemetering raw seismic data, which at 20-100 Hz/channel is far too voluminous for cost- and power-effective transmission through satellite networks such as Iridium. Our solution is to regularly transmit only state-of-health information and descriptions of the seismic data (e.g., 'triggered' seismic event rates and amplitudes), rather than the data itself. The latter can be accomplished through on-board data analysis and reduction at the installation site. Currently, it is possible to request specific time segments of raw
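
    The on-board reduction idea behind BENTO 2 can be sketched in Python as follows; the STA/LTA-style trigger, sample rate and thresholds are assumptions used only to illustrate how a raw trace might be reduced to an event count and peak amplitude before satellite transmission, and they do not describe the actual BENTO firmware.

        # Illustrative sketch (assumptions, not the BENTO firmware): reduce a raw
        # seismic trace to a short summary suitable for low-bandwidth satellite links.
        import numpy as np

        def summarize_trace(samples, fs=50.0, sta_s=1.0, lta_s=30.0, ratio=3.0):
            """Return a triggered event count and the peak amplitude using a simple
            STA/LTA-style detector. fs is the sample rate in Hz."""
            x = np.abs(samples - np.mean(samples))
            sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
            kernel = lambda n: np.ones(n) / n
            sta = np.convolve(x, kernel(sta_n), mode="same")
            lta = np.convolve(x, kernel(lta_n), mode="same") + 1e-12
            triggered = sta / lta > ratio
            # Count rising edges of the trigger as discrete events
            events = int(np.sum(np.diff(triggered.astype(int)) == 1))
            return {"events": events, "peak_amplitude": float(np.max(np.abs(samples)))}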

  2. Automatic Battery Swap System for Home Robots

    Directory of Open Access Journals (Sweden)

    Juan Wu

    2012-12-01

    This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off‐line recharging and on‐line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm‐sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap station wirelessly through ZigBee. The influences of battery case deflection and robot docking deflection on the battery swap operations have been investigated. The experimental results show that it takes an average time of 84.2s to complete the battery swap operations. The home robot does not have to wait several hours for the batteries to be fully charged. The proposed battery swap system is proved to be efficient in home robot applications that need the robots to work continuously over a long period.

  3. PLC Based Automatic Multistoried Car Parking System

    Directory of Open Access Journals (Sweden)

    Swanand S. Vaze

    2014-12-01

    This project work presents the study and design of a PLC-based automatic multistoried car parking system. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Implementing such an arrangement at full scale requires highly sophisticated instruments; in this project a prototype of such a model is made, accommodating twelve cars at a time. Availability of parking space is detected by an optical proximity sensor placed on the pallet. A motor-controlled elevator is used to lift the cars, and the elevator status is indicated by an LED placed on the ground floor. Control of the platforms and checking of vacancies are done by the PLC. For unparking a car, a keyboard is interfaced with the model for selection of the required platform. Automation reduces the space requirement and also reduces human errors, which in turn results in higher security and greater flexibility. Due to these advantages, this system can be used in hotels, railway stations, and airports where car crowding is high.

  4. Meteorological observatory for Antarctic data collection

    International Nuclear Information System (INIS)

    In recent years, a great number of automatic weather stations have been installed in Antarctica, with the aim of closely examining the weather and climate of this region and of improving the coverage of measuring points on the Antarctic surface. In 1987 the Italian Antarctic Project started to set up a meteorological network in an area not completely covered by other countries. Some of the activities performed by the meteorological observatory are described, concerning technical functions such as maintenance of the AWS's and the execution of radio soundings, or relating to scientific purposes such as validation and elaboration of collected data. Finally, some climatological considerations on the thermal behaviour of the Antarctic troposphere, such as the 'coreless winter', and on the wind field, including katabatic flows in North Victoria Land, are presented

  5. TFTR data management system

    Energy Technology Data Exchange (ETDEWEB)

    Randerson, L.; Chu, J.; Ludescher, C.; Malsbury, J.; Stark, W.

    1986-08-01

    Developments in the tokamak fusion test reactor (TFTR) data-management system supporting data acquisition and off-line physics data reduction are described. Data from monitor points, timing channels, transient recorder channels, and other devices are acquired and stored for use by on-line tasks. Files are transferred off line automatically. A configuration utility determines data acquired and files transferred. An event system driven by file arrival activates off-line reduction processes. A post-run process transfers files not shipped during runs. Files are archived to tape and are retrievable by digraph and shot number. Automatic skimming based on most recent access, file type, shot numbers, and user-set protections maintains the files required for post-run data reduction.
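
    A skimming policy of the kind described above can be illustrated with a short Python sketch; the file extensions, retention window and protected shot numbers below are hypothetical, chosen only to show how last-access time, file type and shot-number protections might be combined.

        # Illustrative sketch (hypothetical policy, not the TFTR code): decide whether
        # an archived file should be kept on disk for post-run data reduction.
        import os
        import time

        KEEP_TYPES = {".raw", ".cfg"}          # assumed file types needed post-run
        RECENT_DAYS = 14                        # keep anything accessed recently
        PROTECTED_SHOTS = {61243, 61244}        # user-set protections (example values)

        def keep_file(path, shot_number):
            recently_used = (time.time() - os.path.getatime(path)) < RECENT_DAYS * 86400
            protected = shot_number in PROTECTED_SHOTS
            needed_type = os.path.splitext(path)[1] in KEEP_TYPES
            return recently_used or protected or needed_type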

  6. Students Collecting Real time Data

    Science.gov (United States)

    Miller, P.

    2006-05-01

    Students Collecting Real-Time Data The Hawaiian Islands Humpback Whale National Marine Sanctuary has created opportunities for middle and high school students to become Student Researchers and to be involved in real-time marine data collection. It is important that we expose students to different fields of science and encourage them to enter scientific fields of study. The Humpback Whale Sanctuary has an education visitor center in Kihei, Maui. Located right on the beach, the site has become a living classroom facility. There is a traditional Hawaiian fishpond fronting the property. The fishpond wall is being restored, using traditional methods. The site has the incredible opportunity of incorporating Hawaiian cultural practices with scientific studies. The Sanctuary offers opportunities for students to get involved in monitoring and data collection studies. Invasive Seaweed Study: Students are collecting data on invasive seaweed for the University of Hawaii. They pull a large net through the shallow waters. Seaweed is sorted, identified and weighed. The invasive seaweeds are removed. The data is recorded and sent to UH. Remote controlled monitoring boats: The sanctuary has 6 boogie board sized remote controlled boats used to monitor reefs. Boats have a camera with lights on the underside. The boats have water quality monitoring devices and GPS units. The video from the underwater camera is transmitted via a wireless transmission. Students are able to monitor the fish, limu and invertebrate populations on the reef and collect water quality data via television monitors or computers. The boat can also pull a small plankton tow net. Data is being compiled into data bases. Artificial Reef Modules: The Sanctuary has a scientific permit from the state to build and deploy artificial reef modules. High school students are designing and building modules. These are deployed out in the Fishpond fronting the Sanctuary site and students are monitoring them on a weekly basis

  7. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical and experimental study of the effects of an automatic shading device on the energy performance of a dimmable lighting system and cooling equipment. Some equations related to fenestration optical and thermal properties are rederived, while others are newly developed, under a theoretical approach. In order to collect field data, the energy demand, among other variables, was measured in two distinct stories of the Test Tower having the same fenestration features. New data were gathered after adding an automatic shading device to the window of one story. Comparison of the collected data allows evaluation of the energy performance of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  8. The automatic calibration of Korean VLBI Network data

    CERN Document Server

    Hodgson, Jeffrey A; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-01-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time-consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence and hence sensitivity at higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited for automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against VLBI data that have been manually reduced. We find that the pipelined data using phase transfer produce better results than a manually reduced dataset that does not use the phase transfer. Additionally we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.

  9. The Automatic Calibration of Korean VLBI Network Data

    Science.gov (United States)

    Hodgson, Jeffrey A.; Lee, Sang-Sung; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-08-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time-consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence and hence sensitivity at higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited for automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against VLBI data that have been manually reduced. We find that the pipelined data using phase transfer produce better results than a manually reduced dataset that does not use the phase transfer. Additionally we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.
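
    The frequency phase transfer exploited by the pipeline can be illustrated with a minimal Python sketch, assuming the non-dispersive (tropospheric) phase scales linearly with frequency; this is a generic illustration, not code from the KVN Pipeline.

        # Illustrative sketch of frequency phase transfer (not the KVN Pipeline):
        # phases solved at a low frequency are scaled by the frequency ratio and
        # applied to the simultaneously observed high-frequency data.
        import numpy as np

        def transfer_phase(phase_low_rad, freq_low_ghz, freq_high_ghz):
            """Scale low-frequency phase solutions to a higher observing frequency,
            since the non-dispersive phase scales linearly with frequency.
            Returns phases wrapped to (-pi, pi]."""
            scaled = phase_low_rad * (freq_high_ghz / freq_low_ghz)
            return np.angle(np.exp(1j * scaled))

        # e.g. transfer 22 GHz phase solutions to 86 GHz
        phases_22 = np.array([0.1, -0.4, 1.2])
        phases_86 = transfer_phase(phases_22, 22.0, 86.0)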

  10. Automatic Arabic Hand Written Text Recognition System

    Directory of Open Access Journals (Sweden)

    I. A. Jannoud

    2007-01-01

    Despite the considerable development of pattern recognition applications in the last decade of the twentieth century and in this century, text recognition remains one of the most important problems in pattern recognition. To the best of our knowledge, little work has been done in the area of Arabic text recognition compared with that for Latin, Chinese, and Japanese text. The main difficulty encountered when dealing with Arabic text is the cursive nature of Arabic writing in both printed and handwritten forms. An Automatic Arabic Hand-Written Text Recognition (AHTR) System is proposed. An efficient segmentation stage is required in order to divide a cursive word or sub-word into its constituent characters. After a word has been extracted from the scanned image, it is thinned and its baseline is calculated by analysis of the horizontal density histogram. The pattern is then followed along the baseline and the segmentation points are detected. Thus, after the segmentation stage, the cursive word is represented by a sequence of isolated characters, and the recognition problem reduces to classifying each character. A set of features is extracted from each individual character and a minimum distance classifier is used. Additional character-processing and post-processing steps are added to enhance the results. Recognized characters are appended directly to a word file in editable form.
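
    The minimum distance classification stage can be sketched in a few lines of Python; the Euclidean metric and class prototypes below are assumptions for illustration, and the feature extraction itself is not shown.

        # Illustrative sketch of a minimum distance classifier (assumed Euclidean
        # distance to per-class mean prototypes; features are taken as given).
        import numpy as np

        def train_prototypes(features, labels):
            """Average the feature vectors of each character class into a prototype."""
            return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

        def classify(feature_vector, prototypes):
            """Assign the character class whose prototype is nearest."""
            return min(prototypes, key=lambda c: np.linalg.norm(feature_vector - prototypes[c]))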

  11. Measuring Service Reliability Using Automatic Vehicle Location Data

    Directory of Open Access Journals (Sweden)

    Zhenliang Ma

    2014-01-01

    Bus service reliability has become a major concern for both operators and passengers. Buffer time measures are believed to be appropriate for approximating the reliability passengers experience in the context of departure planning. Two issues in buffer time estimation are addressed, namely performance disaggregation and capturing passengers' perspectives on reliability. A Gaussian mixture model based method is applied to disaggregate the performance data. Based on the mixture distribution, a reliability buffer time (RBT) measure is proposed from the passengers' perspective. A set of expected reliability buffer time measures is developed for operators by combining RBTs at different spatial-temporal levels. Average and latest trip duration measures are proposed for passengers, which can be used to choose a service mode and determine the departure time. Using empirical data from the automatic vehicle location system in Brisbane, Australia, the existence of mixture service states is verified and the advantage of the mixture distribution model in fitting travel time profiles is demonstrated. Numerical experiments confirm that the proposed reliability measure quantifies service reliability consistently, whereas conventional measures may provide inconsistent results. Potential applications for operators and passengers are also illustrated, including reliability improvement and trip planning.
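
    A percentile-based reliability buffer time can be sketched as follows in Python; this simplified form (95th percentile minus median of observed travel times) is a common formulation and does not reproduce the Gaussian mixture disaggregation described above.

        # Illustrative sketch: a percentile-based reliability buffer time (RBT) from
        # AVL travel-time observations; the mixture-model refinement is not shown.
        import numpy as np

        def reliability_buffer_time(travel_times_min, upper_pct=95):
            """RBT = upper percentile minus median of observed travel times, i.e. the
            extra time a passenger should budget to arrive on time on most days."""
            t = np.asarray(travel_times_min, dtype=float)
            return np.percentile(t, upper_pct) - np.percentile(t, 50)

        print(reliability_buffer_time([22, 24, 23, 25, 31, 22, 40, 24]))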

  12. Data mining based study on quality of water level data of Three Gorges Reservoir Automatic Dispatching System%基于数据挖掘的三峡水库调度自动化系统水位数据质量研究

    Institute of Scientific and Technical Information of China (English)

    杨旭; 刘宇

    2011-01-01

    The Three Gorges Reservoir Automatic Dispatching System is responsible for collecting and analyzing nearly 200 telemetered water level measurements, and data collection and transmission currently run at intervals of 30 s to 10 min. However, abnormal values occur in the system and data are sometimes missing, due to equipment and communication problems, which affects data quality. Manually searching for erroneous values in such a mass of data is not realistic. To address this problem, the concepts of completeness and validity are introduced to measure data quality, and a feasibility analysis based on data mining techniques is carried out, with the aim of providing a reference for solving similar problems.
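
    The completeness and validity metrics can be sketched in Python as follows, assuming 15-minute telemetry and an illustrative plausible range for the water level; the thresholds are examples, not values from the study.

        # Illustrative sketch (assumed definitions): completeness and validity rates
        # for telemetered water-level records sampled every 15 minutes.
        import pandas as pd

        def data_quality(series, start, end, lo=60.0, hi=180.0):
            """series: pandas Series of water levels indexed by timestamp.
            Completeness = received / expected records; validity = fraction of the
            received records inside a plausible range. lo/hi are example bounds."""
            expected = pd.date_range(start, end, freq="15min")
            received = series.reindex(expected)
            completeness = received.notna().mean()
            validity = received.dropna().between(lo, hi).mean()
            return completeness, validity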

  13. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing more and more complex avionics systems. In this study, we propose a general method for automatic module testing on an avionics test platform based on the PXI bus. We apply virtual instrument technology to realize automatic testing and fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. Comprehensive experiments show that the proposed method can effectively accomplish automatic testing and fault reporting of signal performance, greatly improving the generality and reliability of ATE in avionics systems.

  14. Simple Approaches to Improve the Automatic Inventory of ZEBRA Crossing from Mls Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    City management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling becomes highly relevant when the city has to be managed using geospatial databases or Geographic Information Systems. At the same time, laser scanning technology has grown significantly in recent years, and terrestrial mobile laser scanning platforms in particular are increasingly used for inventory purposes in both city and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, these data are not directly exploitable by management systems, which constrains the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossings. The algorithm is divided into three main steps: road segmentation (based on a PCA analysis of the points contained in each scan cycle collected by the mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough Transform and logical constraints for line classification). After evaluating different datasets collected in three cities in Northwest Spain (comprising 25 strips with 30 visible zebra crossings), a completeness of 83% was achieved.
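
    The rasterization step can be sketched in Python as a simple intensity-gridding operation; the cell size is an assumed value, and stripe detection with the Hough Transform would then operate on the resulting image.

        # Illustrative sketch of the rasterization step (not the authors' code):
        # convert road points to an intensity image on which stripe candidates can
        # later be found with a line detector such as the Hough transform.
        import numpy as np

        def rasterize_intensity(points, intensity, cell=0.05):
            """points: (N, 2) road-surface x, y in metres; intensity: (N,) values.
            Returns a 2D array holding the mean intensity per grid cell."""
            ij = np.floor((points - points.min(axis=0)) / cell).astype(np.int64)
            shape = ij.max(axis=0) + 1
            acc = np.zeros(shape)
            cnt = np.zeros(shape)
            np.add.at(acc, (ij[:, 0], ij[:, 1]), intensity)
            np.add.at(cnt, (ij[:, 0], ij[:, 1]), 1)
            return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)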

  15. Automatic data generation scheme for finite-element method /FEDGE/ - Computer program

    Science.gov (United States)

    Akyuz, F.

    1970-01-01

    Algorithm provides for automatic input data preparation for the analysis of continuous domains in the fields of structural analysis, heat transfer, and fluid mechanics. The computer program utilizes the natural coordinate systems concept and the finite element method for data generation.

  16. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common in laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0-2.5 micron or 5000-4000 cm-1) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.
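
    A minimal curve-fitting example in Python shows the general idea of fitting a single symmetric absorption band to a continuum-removed spectrum; the Gaussian band shape and the 2.2 micron feature are illustrative assumptions, not the paper's algorithm.

        # Illustrative sketch (generic, not the paper's method): fit a single
        # Gaussian absorption band to a continuum-removed reflectance spectrum.
        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian_band(wl, depth, center, width):
            """Continuum-removed reflectance: 1 minus a Gaussian absorption feature."""
            return 1.0 - depth * np.exp(-0.5 * ((wl - center) / width) ** 2)

        wl = np.linspace(2.0, 2.5, 200)                      # wavelength in microns
        spec = gaussian_band(wl, 0.2, 2.2, 0.02) + np.random.normal(0, 0.005, wl.size)
        popt, _ = curve_fit(gaussian_band, wl, spec, p0=[0.1, 2.21, 0.03])
        print("fitted band centre:", popt[1])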

  17. Automatic neutron PSD transmission from a process computer to a timeshare system

    International Nuclear Information System (INIS)

    A method for automatically telephoning, connecting, and transmitting neutron power-spectral density data from a CDC-1700 process control computer to a PDP-10 time-share system is described. Detailed program listings and block diagrams are included

  18. SABER-School Finance: Data Collection Instrument

    Science.gov (United States)

    King, Elizabeth; Patrinos, Harry; Rogers, Halsey

    2015-01-01

    The aim of the SABER-school finance initiative is to collect, analyze and disseminate comparable data about education finance systems across countries. SABER-school finance assesses education finance systems along six policy goals: (i) ensuring basic conditions for learning; (ii) monitoring learning conditions and outcomes; (iii) overseeing…

  19. Automatic reference level control for an antenna pattern recording system

    Science.gov (United States)

    Lipin, R., Jr.

    1971-01-01

    Automatic gain control system keeps recorder reference levels within 0.2 decibels during operation. System reduces recorder drift during antenna radiation distribution determinations over an eight hour period.

  20. Development and field-testing of the BENTO box: A new satellite-linked data collection system for volcano monitoring

    Science.gov (United States)

    Roman, D. C.; Behar, A.; Elkins-Tanton, L. T.; Fouch, M. J.

    2013-12-01

    Currently it is impossible to monitor all of Earth's hazardous volcanoes for precursory eruption signals, and it is particularly difficult to monitor volcanoes in remote regions. The primary constraint is the high cost of deploying monitoring instrumentation (e.g., seismometers, gas sensors), which includes the cost of reliable, high-resolution sensors, the cost of maintenance (including periodic travel to remote areas), and the cost/difficulty of developing remote data telemetry. We are developing an integrated monitoring system, the BENTO (Behar's ENvironmental Telemetry and Observation) box, that will allow identification of restless volcanoes through widespread deployment of robust, lightweight, low-cost, easily deployable monitoring/telemetry systems. Ultimately, we expect that this strategy will lead to more efficient allocation of instrumentation and associated costs. BENTO boxes are portable, autonomous, self-contained data collection systems designed for long-term operation (up to ~12 months) in remote environments. They use low-cost two-way communication through the commercial Iridium satellite network, and, depending on data types, can pre-process raw data onboard to obtain useful summary statistics for transmission through Iridium. BENTO boxes also have the ability to receive commands through Iridium, allowing, for example, remote adjustment of sampling rates, or requests for segments of raw data in cases where only summary statistics are routinely transmitted. Currently, BENTO boxes can measure weather parameters (e.g., windspeed, wind direction, rainfall, humidity, atmospheric pressure), volcanic gas (CO2, SO2, and halogens) concentrations, and seismicity. In the future, we plan to interface BENTO boxes with additional sensors such as atmospheric pressure/infrasound, tilt, GPS and temperature. We are currently field-testing 'BENTO 1' boxes equipped with gas and meteorological sensors at Telica Volcano, Nicaragua; Kilauea Volcano, Hawai

  1. Monitoring, analysis and classification of vegetation and soil data collected by a small and lightweight hyperspectral imaging system

    Science.gov (United States)

    Mönnig, Carsten

    2014-05-01

    The increasing precision of modern farming systems requires near-real-time monitoring of agricultural crops in order to estimate soil condition, plant health and potential crop yield. For large agricultural plots, satellite imagery or aerial surveys can be used, at considerable cost and with possible time delays of days or even weeks. For small to medium sized plots, however, these monitoring approaches are cost-prohibitive and difficult to access. Therefore, within the INTERREG IV A-Project SMART INSPECTORS (Smart Aerial Test Rigs with Infrared Spectrometers and Radar), we propose a cost-effective, comparably simple approach to support farmers with a small and lightweight hyperspectral imaging system that collects remotely sensed data in spectral bands between 400 and 1700 nm. SMART INSPECTORS covers the whole processing chain of small-scale remote sensing, from sensor construction and data processing to ground truthing for analysis of the results. The sensors are mounted on a remotely controlled (RC) Octocopter, a fixed-wing RC airplane, and a two-seated Autogyro for larger plots. The high-resolution images, with up to 5 cm ground resolution, include visible light, near and thermal infrared, as well as hyperspectral imagery. The data will be analyzed using remote sensing software and a Geographic Information System (GIS). The soil condition analysis includes soil humidity, temperature and roughness. Furthermore, a radar sensor is envisaged for geomorphologic, drainage and soil-plant roughness investigations. Plant health control includes drought stress, vegetation health, pest control, growth condition and canopy temperature. Different vegetation and soil indices will help to determine and understand soil conditions and plant traits. Additional investigations might include crop yield estimation for certain crops like apples, strawberries, pasture land, etc. The quality of remotely sensed vegetation data will be tested with
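
    As one example of the vegetation indices mentioned above, a basic NDVI computation from red and near-infrared bands can be sketched in Python; band selection from the hyperspectral cube is assumed to have been done beforehand.

        # Illustrative sketch: a basic NDVI map from red and near-infrared bands of a
        # hyperspectral cube; one of many indices usable for plant-health screening.
        import numpy as np

        def ndvi(red, nir):
            """red, nir: 2D arrays of reflectance. Returns NDVI in [-1, 1]."""
            red = red.astype(float)
            nir = nir.astype(float)
            return (nir - red) / np.clip(nir + red, 1e-6, None)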

  2. Review of Developments in Electronic, Clinical Data Collection, and Documentation Systems over the Last Decade - Are We Ready for Big Data in Routine Health Care?

    Science.gov (United States)

    Kessel, Kerstin A; Combs, Stephanie E

    2016-01-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors, including imaging, molecular or pathological markers, surgical results, and patient's preference. In this context, the term "Big Data" evolved also in health care. The "hype" is heavily discussed in literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, but it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the "three V's": volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or postprocessing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important, and economically viable field of application. PMID:27066456

  3. Review of developments in electronic, clinical data collection and documentation systems over the last decade – Are we ready for Big Data in routine health care?

    Directory of Open Access Journals (Sweden)

    Kerstin Anne Kessel

    2016-03-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors including imaging, molecular or pathological markers, surgical results and patient’s preference. In this context the term Big Data evolved also in health care. The hype is heavily discussed in literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, but it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the three V’s: volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or post-processing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important and economically viable field of application.

  5. Automatic control system design of laser interferometer

    Science.gov (United States)

    Lu, Qingjie; Li, Chunjie; Sun, Hao; Ren, Shaohua; Han, Sen

    2015-10-01

    There are many shortcomings in the traditional optical adjustment used in interferometry, such as low accuracy, time consumption, labor intensity, lack of controllability, and poor repeatability, so we address the problem by using a wireless remote control system. Compared to the traditional method, the effects of vibration and air turbulence are avoided. In addition, the system has the advantages of low cost, high reliability and easy operation. Furthermore, switching between two charge coupled devices (CCDs), which are used to collect different images, can easily be achieved with this wireless remote control system. The wireless transmission is achieved by using a Radio Frequency (RF) module and programming the controller; pulse width modulation (PWM) of the direct current (DC) motor, real-time switching of the relay, and high-accuracy displacement control of the FAULHABER motor are available. The results of verification tests show that the control system has good stability with a packet loss rate below 5%, high control accuracy and millisecond response speed.

  6. Automatic early warning systems for the environment

    International Nuclear Information System (INIS)

    Computerized, continuous monitoring environmental early warning systems are complex networks that merge measurements with the information technology. Accuracy, consistency, reliability and data quality are their most important features. Several effects may disturb their characteristics: hostile environment, unreliable communications, poor quality of equipment, non qualified users or service personnel. According to our experiences, a number of measures should be taken to enhance system performances and to maintain them at the desired level. In the paper, we are presenting an analysis of system requirements, possible disturbances and corrective measures that give the main directives for the design, construction and exploitation of the environmental early warning systems. Procedures which ensure data integrity and quality are mentioned. Finally, the contemporary system approach based on the LAN/WAN network topology with Intranet/Internet software is proposed, together with case descriptions of two already operating systems, based on computer-network principle. (author)

  7. Automatic Discovery of Non-Compositional Compounds in Parallel Data

    CERN Document Server

    Melamed, I D

    1997-01-01

    Automatic segmentation of text into minimal content-bearing units is an unsolved problem even for languages like English. Spaces between words offer an easy first approximation, but this approximation is not good enough for machine translation (MT), where many word sequences are not translated word-for-word. This paper presents an efficient automatic method for discovering sequences of words that are translated as a unit. The method proceeds by comparing pairs of statistical translation models induced from parallel texts in two languages. It can discover hundreds of non-compositional compounds on each iteration, and constructs longer compounds out of shorter ones. Objective evaluation on a simple machine translation task has shown the method's potential to improve the quality of MT output. The method makes few assumptions about the data, so it can be applied to parallel data other than parallel texts, such as word spellings and pronunciations.

  8. Automatic removal of outliers in hydrologic time series and quality control of rainfall data: processing a real-time database of the Local System for Flood Monitoring in Klodzko County, Poland

    Science.gov (United States)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kryza, Maciej; Szymanowski, Mariusz

    2013-04-01

    Real-time hydrological forecasting requires the highest quality of both hydrologic and meteorological data collected in a given river basin. Large outliers may lead to inaccurate predictions, with substantial departures between observations and prognoses considered even in short term. Although we need the correctness of both riverflow and rainfall data, they cannot be processed in the same way to produce a filtered output. Indeed, hydrologic time series at a given gauge can be interpolated in time domain after having detected suspicious values, however if no outlier has been detected at the upstream sites. In the case of rainfall data, interpolation is not suitable as we cannot verify the potential outliers at a given site against data from other sites especially in the complex terrain. This is due to the fact that very local convective events may occur, leading to large rainfall peaks at a limited space. Hence, instead of interpolating data, we rather perform a flagging procedure that only ranks outliers according to the likelihood of occurrence. Following the aforementioned assumptions, we have developed a few modules that serve a purpose of a fully automated correction of a database that is updated in real-time every 15 minutes, and the main objective of the work was to produce a high-quality database for a purpose of hydrologic rainfall-runoff modeling and ensemble prediction. The database in question is available courtesy of the County Office in Kłodzko (SW Poland), the institution which owns and maintains the Local System for Flood Monitoring in Kłodzko County. The dedicated prediction system, known as HydroProg, is now being built at the University of Wrocław (Poland). As the entire prediction system, the correction modules work automatically in real time and are developed in R language. They are plugged in to a larger IT infrastructure. Hydrologic time series, which are water levels recorded every 15 minutes at 22 gauges located in Kłodzko County, are
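
    The contrasting treatment of water-level and rainfall series can be sketched in Python (the operational modules are written in R); the rolling-median rule, window length and thresholds below are illustrative assumptions, not the HydroProg implementation.

        # Illustrative sketch (not the HydroProg modules): flag water-level outliers
        # against a rolling median and interpolate them, while rainfall spikes are
        # only flagged, never interpolated.
        import pandas as pd

        def clean_water_level(series, window=9, k=4.0):
            """series: 15-min water levels with a DatetimeIndex. Points deviating from
            the rolling median by more than k times the rolling MAD are replaced by
            time interpolation; the outlier mask is returned alongside."""
            med = series.rolling(window, center=True, min_periods=1).median()
            mad = (series - med).abs().rolling(window, center=True, min_periods=1).median()
            outlier = (series - med).abs() > k * (mad + 1e-9)
            return series.mask(outlier).interpolate(method="time"), outlier

        def flag_rainfall(series, window=9, k=6.0):
            """Rank-only flagging: return a boolean Series marking suspicious peaks."""
            med = series.rolling(window, center=True, min_periods=1).median()
            return (series - med).abs() > k * (series.std() + 1e-9)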

  9. Automatic Multimedia Creation Enriched with Dynamic Conceptual Data

    Directory of Open Access Journals (Sweden)

    Angel Martín

    2012-12-01

    There is a growing gap between multimedia production and context-centric multimedia services. The main problem is the under-exploitation of the content creation design. The idea is to support dynamic content generation adapted to the user or display profile. Our work is an implementation of a web platform for automatic generation of multimedia presentations based on the SMIL (Synchronized Multimedia Integration Language) standard. The system is able to produce rich media with dynamic multimedia content retrieved automatically from different content databases matching the semantic context. For this purpose, we extend the standard interpretation of SMIL tags in order to accomplish a semantic translation of multimedia objects into database queries. This allows services to take advantage of the production process to create customized content enhanced with real-time information fed from databases. The described system has been successfully deployed to create advanced context-centric weather forecasts.

  10. Automatic Road Centerline Extraction from Imagery Using Road GPS Data

    OpenAIRE

    Chuqing Cao; Ying Sun

    2014-01-01

    Road centerline extraction from imagery constitutes a key element in numerous geospatial applications, which has been addressed through a variety of approaches. However, most of the existing methods are not capable of dealing with challenges such as different road shapes, complex scenes, and variable resolutions. This paper presents a novel method for road centerline extraction from imagery in a fully automatic approach that addresses the aforementioned challenges by exploiting road GPS data....

  12. Automatic diagnostic methods of nuclear reactor collected signals

    International Nuclear Information System (INIS)

    This work is the first phase of an overall study of diagnosis, limited to problems of monitoring the operating state; this makes it possible to show what pattern recognition methods contribute at the processing level. The present problem is the identification of control operations. Analysis of the reactor state yields a decision which is compared with the history of the control operations; if there is no correspondence, the analysed state is declared 'abnormal'. The system under analysis is described and the problem to solve is defined. The Gaussian parametric approach is then treated, together with methods for evaluating the error probability. Non-parametric methods are addressed next, and an on-line detection scheme has been tested experimentally. Finally, a non-linear transformation has been studied to reduce the error probability obtained previously. All the methods presented have been tested and compared against a quality index: the error probability

  13. Solar energy collection system

    Science.gov (United States)

    Miller, C. G.; Stephens, J. B. (Inventor)

    1979-01-01

    A fixed, linear, ground-based primary reflector having an extended curved sawtooth-contoured surface covered with a metalized polymeric reflecting material reflects solar energy to a movably supported collector that is kept at the concentrated line focus of the primary reflector. The primary reflector may be constructed by a process utilizing well-known freeway paving machinery. The solar energy absorber is preferably a fluid-transporting pipe. Efficient utilization of the reflected solar energy, leading to high temperatures, is obtained by cylindrically shaped secondary reflectors that direct off-angle energy onto the absorber pipe. A seriatim arrangement of cylindrical secondary reflector stages and spot-forming reflector stages produces a high-temperature solar energy collection system of greater efficiency.

  14. Reconstruction of the sea surface elevation from the analysis of the data collected by a wave radar system

    Science.gov (United States)

    Ludeno, Giovanni; Soldovieri, Francesco; Serafino, Francesco; Lugni, Claudio; Fucile, Fabio; Bulian, Gabriele

    2016-04-01

    X-band radar systems are able to provide information about the direction and intensity of sea surface currents and dominant waves within a range of a few kilometres from the observation point (up to 3 nautical miles). This capability, together with their flexibility and low cost, makes these devices useful tools for sea monitoring in both coastal and off-shore areas. The data collected by a wave radar system can be analyzed using the inversion strategy presented in [1,2] to estimate the following sea parameters: peak wave direction; peak period; peak wavelength; significant wave height; sea surface current and bathymetry. Estimating the significant wave height is a limitation of wave radar systems, because the radar backscatter is not directly related to the sea surface elevation. In fact, substantial research has recently been carried out to estimate significant wave height from radar images, either with or without calibration against in-situ measurements. In this work, we present two alternative approaches for the reconstruction of the sea surface elevation from wave radar images. The first approach is based on an approximated version of the modulation transfer function (MTF), tuned from a series of numerical simulations, following the line of [3]. The second approach is based on the inversion of radar images using a direct regularised least squares technique. Assuming a linearised model for the tilt modulation, the sea elevation is reconstructed as a least squares fit of the radar imaging data [4]. References [1] F. Serafino, C. Lugni, and F. Soldovieri, "A novel strategy for the surface current determination from marine X-band radar data," IEEE Geosci. Remote Sens. Lett., vol. 7, no. 2, pp. 231-235, Apr. 2010. [2] Ludeno, G., Brandini, C., Lugni, C., Arturi, D., Natale, A., Soldovieri, F., Serafino, F. (2014). Remocean System for the Detection of the Reflected Waves from the Costa
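
    The second approach rests on a regularised least-squares inversion, whose generic closed form can be sketched in Python; the imaging operator A below is a stand-in for the paper's model and the regularisation weight is an assumed value.

        # Illustrative sketch of a Tikhonov-regularised least-squares inversion, the
        # generic form behind reconstructing sea-surface elevation from radar images.
        import numpy as np

        def regularized_least_squares(A, b, lam=1e-2):
            """Solve min ||A x - b||^2 + lam ||x||^2 in closed form."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)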

  15. Automatic digital photo-book making system

    Science.gov (United States)

    Wang, Wiley; Teo, Patrick; Muzzolini, Russ

    2010-02-01

    The diversity of photo products has grown more than ever before. A group of photos is not only printed individually, but can also be arranged in a specific order to tell a story, such as in a photo book, a calendar or a poster collage. Similar to making a traditional scrapbook, digital photo book tools allow the user to choose a book style/theme, page layouts, backgrounds and the way the pictures are arranged. This process is often time consuming for users, given the number of images and the choices of layout/background combinations. In this paper, we developed a system to automatically generate photo books with only a few initial selections required. The system utilizes time stamps, color indices, orientations and other image properties to best fit pictures into a final photo book. The common way of telling a story is to lay the pictures out in chronological order. If pictures are close in time, they tend to belong together and are often logically related, so the pictures naturally cluster along a time line. Breaks between clusters can be used as a guide to separate pages or spreads; thus, pictures that are logically related can stay close on the same page or spread. When people are making a photo book, it is helpful to start with chronologically grouped images, but time alone won't be enough to complete the process. Each page is limited by the number of layouts available. Many aesthetic rules also apply, such as emphasis of preferred pictures, consistency of local image density throughout the whole book, matching a background to the content of the images, and the variety of adjacent page layouts. We developed an algorithm to group images onto pages under the constraints of aesthetic rules. We also apply content analysis based on the color and blurriness of each picture, to match backgrounds and to adjust page layouts. Some of our aesthetic rules are fixed and given by designers. Other aesthetic rules are statistical models trained by using
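
    The chronological grouping step can be sketched in Python as a simple time-gap clustering; the two-hour gap threshold is an assumed value used only to illustrate how breaks between clusters could mark page or spread boundaries.

        # Illustrative sketch: group photos into page clusters wherever the gap
        # between consecutive time stamps exceeds a threshold (a simple assumed rule,
        # not the full layout engine described above).
        from datetime import timedelta

        def cluster_by_time(timestamps, gap=timedelta(hours=2)):
            """timestamps: sorted list of datetimes. Returns a list of index groups."""
            if not timestamps:
                return []
            groups, current = [], [0]
            for i in range(1, len(timestamps)):
                if timestamps[i] - timestamps[i - 1] > gap:
                    groups.append(current)
                    current = []
                current.append(i)
            groups.append(current)
            return groups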

  16. MAD data collection - current trends.

    Energy Technology Data Exchange (ETDEWEB)

    Dementieva, I.; Evans, G.; Joachimiak, A.; Sanishvili, R.; Walsh, M. A.

    1999-09-20

    The multi-wavelength anomalous diffraction, or MAD, method of determining protein structure is becoming routine in protein crystallography. An increase in the number of tuneable synchrotron beamlines, coupled with the widespread availability of position-sensitive X-ray detectors based on charge-coupled devices and having fast readout, has raised MAD structure determination to a new and exciting level. Ultra-fast MAD data collection is now possible. Recognition of the value of selenium for phasing protein structures and improvement of methods for incorporating selenium into proteins in the form of selenomethionine have attracted greater interest in the MAD method. Recent developments in crystallographic software are complementing the above advances, paving the way for rapid protein structure determination. An overview of a typical MAD experiment is described here, with emphasis on the rates and quality of data acquisition now achievable at beamlines developed at third-generation synchrotron sources.

  17. Automatic polarization control in optical sampling system

    Science.gov (United States)

    Zhao, Zhao; Yang, Aiying; Feng, Lihui

    2015-08-01

    In an optical sampling system for high-speed optical communications, polarization control is one of the most important parts of the system, for both nonlinear and linear optical sampling. A simple method based on the variance of the sampled data is proposed in this paper to tune the wave plates in a motor-driven polarization controller. In the experiment, an optical sampling system based on SFG in PPLN is applied to a 10 Gbit/s or faster optical data signal. The results demonstrate that, with the proposed method, the error of the Q factor estimated from the sampled data is minimized, and the tuning time to the optimized polarization state is less than 30 seconds with an accuracy of +/-1°.

  18. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same
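
    As a simplified stand-in for the Bayesian-network approach, the following Python sketch shows how a catalog with missing features can still feed a classifier by imputing values before classification; the tiny feature matrix and class labels are purely illustrative, and the imputer is not the paper's method.

        # Illustrative stand-in (not the paper's Bayesian-network method): handle
        # missing catalogue features with iterative imputation before classification.
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.pipeline import make_pipeline

        X = np.array([[1.2, np.nan, 0.3], [0.9, 5.1, np.nan], [1.1, 4.8, 0.4],
                      [2.5, np.nan, 1.9], [2.7, 9.0, 2.1], [2.4, 8.7, np.nan]])
        y = np.array([0, 0, 0, 1, 1, 1])      # e.g. 0 = periodic variable, 1 = quasar

        model = make_pipeline(IterativeImputer(random_state=0),
                              RandomForestClassifier(n_estimators=50, random_state=0))
        model.fit(X, y)
        print(model.predict([[2.6, np.nan, 2.0]]))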

  19. Data collection architecture for big data - A framework for a research agenda

    NARCIS (Netherlands)

    Hofman, W.J.

    2015-01-01

    As big data is expected to contribute largely to economic growth, scalability of solutions becomes apparent for deployment by organisations. It requires automatic collection and processing of large, heterogeneous data sets of a variety of resources, dealing with various aspects like improving qualit

  20. Development of a System for Automatic Facial Expression Analysis

    Science.gov (United States)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. While a large number of samples is desirable for accurately estimating a person's feelings (e.g. likeness) about a machine interface, in real-world situations only a small number of samples can be obtained, because of the high cost of collecting emotions from the observed person. This paper proposes a system that solves this problem while conforming to individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, compared with earlier HNN and Support Vector Machine (SVM) classifiers, the proposed method achieved the best generalization performance while using less learning time than the SVM classifiers.

  1. Design of Combat System Calibration Data Collection and Analysis System%作战系统标校数据采集与分析系统设计

    Institute of Scientific and Technical Information of China (English)

    许晓华; 李鹏

    2015-01-01

    The calibration data of a naval vessel combat system reflect the combat system's functions and the technical state of each system equipment. Given that combat system calibration data are currently not collected and analysed effectively, a system for calibration data collection and analysis is designed. This paper establishes a calibration data entry specification, designs data analysis applications, and analyses and processes the data. The results show that this system provides strong support for the management and application of the combat system's calibration data and provides the relevant decision makers with references regarding equipment maintenance and development.

  2. Generalisation and extension of a web-based data collection system for clinical studies using Java and CORBA.

    Science.gov (United States)

    Eich, H P; Ohmann, C

    1999-01-01

    Inadequate informatics support of multi-centre clinical trials leads to poor quality. In order to support a multi-centre clinical trial, a data collection system via WWW and Internet, based on Java, has been developed. In this study, a generalization and extension of this prototype has been performed. The prototype has been applied to another clinical trial, and a knowledge server based on C++ has been integrated via CORBA. The investigation and implementation of security aspects of web-based data collection is now under evaluation.

  3. Research on Automatic Target Tracking Based on PTZ System

    Directory of Open Access Journals (Sweden)

    Ni Zhang

    2012-11-01

    Full Text Available This paper studies an algorithm of automatic target tracking based on PTZ system. Select the tracking target and set up the target motion trajectory in the video screen. Along the motion trajectory, the system controls the PTZ rotation automatically to track the target real-timely. At the same time, it adjusts the zoom to enlarge or reduce to make sure the target can display on the video screen center clearly at the suitable size. By testing on groups of video, verify the effectiveness of the automatic target tracking algorithm.

  4. The Diagnostic System of A – 604 Automatic Transmission

    Directory of Open Access Journals (Sweden)

    Czaban Jaroslaw

    2014-09-01

    The automatic gearbox is gaining popularity in Europe. The limited interest in diagnosing this type of transmission in Poland results from its small share of the market of operated cars, so special diagnostic devices are not readily available. These factors lead to expensive repairs, often involving replacement of a subassembly with a new or aftermarket one. Prophylactic diagnostic tests, which could prevent future gearbox failures, are conducted only to a small extent. In the paper, a proposed diagnostic system for the popular A-604 gearbox is presented. The authors are exploring the possibility of using this type of device for the functional evaluation of gearboxes after renovation. The constructed system drives the tested object under a simulated load, with a special controller, replacing the original one, responsible for controlling gearbox operation. In this way, the state of the mechanical and hydraulic parts is evaluated. Analysis of the signal runs registered during the measurements allows conclusions about operational correctness, whereas comparison with stock data verifies the technical state of the automatic gearbox.

  5. 汽车燃油加热器数据自动采集系统%Automatic Data Acquisition System for Automotive Fuel Oil Heater

    Institute of Scientific and Technical Information of China (English)

    张铁壁; 孙士尉; 夏国明; 马晓辉; 张学军

    2013-01-01

    In order to solve the problems of existing data acquisition systems for automotive fuel oil heaters, a data acquisition system based on the RS-485 bus has been researched and developed. In this system, the personnel information, product serial number and various parameters are entered and set via a touch screen; the acquisition module then feeds all heater data to a PLC, and the temperature measurements are corrected with the least squares method. Operating results show that the system offers easy operation, accurate data and good applicability, and has considerable value for wider adoption.
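
    The least-squares correction of the temperature readings can be sketched in Python as a linear calibration against reference values; the raw and reference numbers below are invented for illustration only.

        # Illustrative sketch (assumed single-sensor case): least-squares calibration
        # of raw temperature readings against reference values, as a linear fit.
        import numpy as np

        raw = np.array([20.4, 40.9, 61.7, 82.1, 101.8])       # sensor readings, deg C
        ref = np.array([20.0, 40.0, 60.0, 80.0, 100.0])        # reference values, deg C

        slope, intercept = np.polyfit(raw, ref, 1)              # least-squares line
        corrected = slope * raw + intercept
        print(slope, intercept, corrected)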

  6. 15 CFR 990.43 - Data collection.

    Science.gov (United States)

    2010-01-01

    15 CFR § 990.43 (Damage Assessments, Preassessment Phase): Trustees may conduct data collection and analyses that are reasonably related to Preassessment Phase activities. Data collection and analysis...

  7. Design for Automatic Fire Alarm and Linkage Control System in a Data Center

    Institute of Scientific and Technical Information of China (English)

    王绍红; 陶悦

    2015-01-01

    A data center project is taken as an example to introduce the design of the automatic fire alarm system for a data center, focusing on the arrangement of detectors and the linked fire-extinguishing systems in the building: the data rooms adopt a gas extinguishing system linked to a combination of aspirating smoke detectors and point-type smoke detectors, and the shared atrium adopts an automatic water cannon extinguishing system linked to dual-band infrared flame detectors. The linkage control procedures for the gas extinguishing system, the high-pressure water mist extinguishing system, and the intelligent extinguishing system for large spaces are then introduced.

  8. An automatic system for acidity determination based on sequential injection titration and the monosegmented flow approach.

    Science.gov (United States)

    Kozak, Joanna; Wójtowicz, Marzena; Gawenda, Nadzieja; Kościelniak, Paweł

    2011-06-15

    An automatic sequential injection system, combining monosegmented flow analysis, sequential injection analysis and sequential injection titration, is proposed for acidity determination. The system enables controllable sample dilution and generation of standards of the required concentration in a monosegmented sequential injection manner, sequential injection titration of the prepared solutions, and data collection and handling. It has been tested on the spectrophotometric determination of acetic, citric and phosphoric acids, with sodium hydroxide used as a titrant and phenolphthalein or thymolphthalein (in the case of phosphoric acid determination) as indicators. Accuracy better than |4.4|% (RE) and repeatability better than 2.9% (RSD) have been obtained. It has been applied to the determination of total acidity in vinegars and various soft drinks. The system provides low sample consumption (less than 0.3 mL). On average, analysis of a sample takes several minutes. PMID:21641455

  9. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  10. SYSTEM FOR AUTOMATIC GENERALIZATION OF TOPOGRAPHIC MAPS

    Institute of Scientific and Technical Information of China (English)

    YAN Hao-wen; LI Zhi-lin; AI Ting-hua

    2006-01-01

    With the construction of spatial data infrastructure, automated topographic map generalization becomes an indispensable component in the community of cartography and geographic information science. This paper describes a topographic map generalization system recently developed by the authors. The system has the following characteristics: 1) it takes advantage of three levels of automation, i.e. fully automated generalization, batch generalization, and interactive generalization, to undertake the two types of processes in generalization, i.e. the intelligent inference process and the repetitive operation process; 2) it makes use of two kinds of sources for the generalization rule library, i.e. written specifications and cartographers' experience, and defines a six-element structure to describe the rules; 3) it employs a hierarchical structure for map databases, logically and physically; 4) it employs a grid indexing technique and undo/redo operations to improve database retrieval and object generalization efficiency. Two examples of topographic map generalization are given to demonstrate the system, and they show that the system works well. In fact, the system has been used for a number of projects, and a great improvement in efficiency compared with the traditional map generalization process has been achieved.

  11. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  12. Towards Automatic Music Transcription: Extraction of MIDI-Data out of Polyphonic Piano Music

    Directory of Open Access Journals (Sweden)

    Jens Wellhausen

    2005-06-01

    Full Text Available Driven by the increasing amount of music available electronically, the need for automatic search and retrieval systems for music becomes more and more important. In this paper an algorithm for the automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications and music analysis. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined to extract the notes played. An algorithm for chord separation based on Independent Subspace Analysis is presented. Finally, the results are used to build a MIDI file.
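
    As a rough illustration of the two stages described above (temporal segmentation followed by note extraction), the sketch below segments a mono signal by short-time energy and assigns each segment the MIDI note of its strongest spectral peak. The thresholds are assumptions, and the paper's chord separation via Independent Subspace Analysis is not reproduced.

```python
# Toy two-stage transcription sketch: energy-based segmentation, then a
# dominant-frequency -> MIDI note guess per segment. Thresholds are assumptions;
# the chord separation via Independent Subspace Analysis is omitted.
import numpy as np

def transcribe(signal, sr, frame=2048, hop=1024, energy_thresh=0.01):
    notes, onset = [], None
    for start in range(0, len(signal) - frame, hop):
        energy = float(np.mean(signal[start:start + frame] ** 2))
        if energy > energy_thresh and onset is None:
            onset = start                      # note onset detected
        elif energy <= energy_thresh and onset is not None:
            seg = signal[onset:start]          # note offset: analyse the segment
            spectrum = np.abs(np.fft.rfft(seg))
            freq = np.fft.rfftfreq(len(seg), 1.0 / sr)[int(np.argmax(spectrum))]
            if freq > 0:
                midi = int(round(69 + 12 * np.log2(freq / 440.0)))
                notes.append((onset / sr, (start - onset) / sr, midi))
            onset = None
    return notes  # list of (onset time [s], duration [s], MIDI note number)

sr = 44100
t = np.arange(sr) / sr
demo = np.concatenate([np.sin(2 * np.pi * 440.0 * t), np.zeros(sr // 4)])
print(transcribe(demo, sr))   # roughly [(0.0, ~1.0, 69)]
```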

  13. Automatic radiation measuring system connected with GPS

    International Nuclear Information System (INIS)

    The most serious nuclear disaster in Japan broke out at the Fukushima Daiichi Nuclear Power Plant as a result of the Great East Japan Earthquake. Prompt and exact mapping of the contamination is of great importance for radiation protection and for environmental restoration. We have developed the radiation survey systems KURAMA and KURAMA-2 for rapid and exact measurement of the radiation dose distribution. The system is composed of a mobile radiation monitor and a computer in the office for the storage and visualization of the data. They are connected via the Internet and operate for continuous radiation measurement while the monitor is moving. The mobile part consists of a survey meter, an interface to transform the output of the survey meter for the computer, a global positioning system, a computer to process the data for connecting to the network, and a mobile router. Thus the systems are effective for rapid mapping of surface contamination. The operation and the performance of the equipment at the site are presented. (J.P.N.)

  14. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of the cutting forces, in connection with programming of a CNC lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty, which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...
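
    The detection principle named in the abstract, frequency analysis of the cutting-force signal, can be sketched as below: regular chip breaking shows up as a strong periodic component in the force spectrum, whereas snarled chips do not. The sampling rate, frequency band and threshold are illustrative assumptions.

```python
# Sketch of chip-breaking detection by frequency analysis of a cutting-force
# signal. Band limits and the decision threshold are assumptions.
import numpy as np

def chips_break(force, sr, band=(50.0, 400.0), ratio_thresh=0.2):
    spectrum = np.abs(np.fft.rfft(force - np.mean(force))) ** 2
    freqs = np.fft.rfftfreq(len(force), 1.0 / sr)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    ratio = spectrum[in_band].sum() / spectrum.sum()
    return ratio > ratio_thresh   # strong periodic modulation -> chips break

sr = 5000
t = np.arange(sr) / sr
broken = 100 + 10 * np.sin(2 * np.pi * 120.0 * t) + np.random.normal(0, 1, sr)
snarled = 100 + np.random.normal(0, 1, sr)
print(chips_break(broken, sr), chips_break(snarled, sr))   # typically True, False
```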

  15. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas

    OpenAIRE

    Hsien-Tsung Chang; Yi-Ming Chang; Meng-Tze Tsai

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intention...

  16. Automatic graphene transfer system for improved material quality and efficiency

    OpenAIRE

    Alberto Boscá; Jorge Pedrós; Javier Martínez; Tomás Palacios; Fernando Calle

    2015-01-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The proce...

  17. Longline Observer Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — LODS, the Hawaii Longline Observer Data System, is a complete suite of tools designed to collect, process, and manage quality fisheries data and information. Guided...

  18. A Wireless Framework for Lecturers' Attendance System with Automatic Vehicle Identification (AVI Technology

    Directory of Open Access Journals (Sweden)

    Emammer Khamis Shafter

    2015-10-01

    Full Text Available Automatic Vehicle Identification (AVI) technology is one type of Radio Frequency Identification (RFID) method which can be used to significantly improve the efficiency of a lecturers' attendance system. It provides the capability of automatic data capture for attendance records using a mobile device equipped in the user's vehicle. The intent of this article is to propose a framework for an automatic lecturers' attendance system using AVI technology. The first objective of this work involves gathering the requirements for the Automatic Lecturers' Attendance System and representing them using UML diagrams. The second objective is to put forward a framework that will provide guidelines for developing the system. A prototype has also been created as a pilot project.

  19. Design and Implementation of Urban Planning and Mapping Results Data Automatic Generation System

    Institute of Scientific and Technical Information of China (English)

    吴凯华; 程相兵; 黄昀鹏; 谢武强

    2015-01-01

    With the development and popularization of computer technology, the informatization of surveying and mapping has become a trend. In view of the way urban planning surveying and mapping results are compiled, and according to the actual needs of production units, the software was coded in the C# language on the Visual Studio 2013 platform with the SQL Server 2008 database management platform, using .NET and Office components for the secondary development of several versions of Microsoft Office Word, to design and implement an automatic generation system for urban planning surveying and mapping results data. The software system can automatically generate urban planning surveying and mapping results documents; its advanced nature and practicability have been verified through practical application in many aspects of the engineering production of the unit's survey teams.

  20. Automatic Generation of OWL Ontology from XML Data Source

    CERN Document Server

    Yahia, Nora; Ahmed, AbdelWahab

    2012-01-01

    The eXtensible Markup Language (XML) can be used as data exchange format in different domains. It allows different parties to exchange data by providing common understanding of the basic concepts in the domain. XML covers the syntactic level, but lacks support for reasoning. Ontology can provide a semantic representation of domain knowledge which supports efficient reasoning and expressive power. One of the most popular ontology languages is the Web Ontology Language (OWL). It can represent domain knowledge using classes, properties, axioms and instances for the use in a distributed environment such as the World Wide Web. This paper presents a new method for automatic generation of OWL ontology from XML data sources.
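
    A drastically simplified illustration of the idea, deriving OWL classes and properties from the structure of an XML document, is given below. The mapping shown (elements to classes, attributes to datatype properties, nesting to object properties) is a common baseline and only an assumption about the paper's detailed generation rules.

```python
# Toy XML -> OWL sketch: elements become classes, attributes become datatype
# properties, and element nesting becomes object properties. This baseline
# mapping is an assumption; the paper's generation rules may be richer.
import xml.etree.ElementTree as ET

def xml_to_owl(xml_text, ns="http://example.org/onto#"):
    root = ET.fromstring(xml_text)
    classes, dprops, oprops = set(), set(), set()

    def visit(elem):
        classes.add(elem.tag)
        for attr in elem.attrib:
            dprops.add((elem.tag, attr))
        for child in elem:
            oprops.add((elem.tag, child.tag))
            visit(child)

    visit(root)
    lines = [f"<owl:Class rdf:about='{ns}{c}'/>" for c in sorted(classes)]
    lines += [f"<owl:DatatypeProperty rdf:about='{ns}{a}'/>  <!-- domain: {c} -->"
              for c, a in sorted(dprops)]
    lines += [f"<owl:ObjectProperty rdf:about='{ns}has_{t}'/>  <!-- {s} -> {t} -->"
              for s, t in sorted(oprops)]
    return "\n".join(lines)

print(xml_to_owl("<library><book isbn='1'><title>t</title></book></library>"))
```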

  1. Automatic feed system for ultrasonic machining

    Science.gov (United States)

    Calkins, Noel C.

    1994-01-01

    Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.

  2. Automatic Identification of Critical Data Items in a Database to Mitigate the Effects of Malicious Insiders

    Science.gov (United States)

    White, Jonathan; Panda, Brajendra

    A major concern for computer system security is the threat from malicious insiders who target and abuse critical data items in the system. In this paper, we propose a solution to enable automatic identification of critical data items in a database by way of data dependency relationships. This identification of critical data items is necessary because insider threats often target mission critical data in order to accomplish malicious tasks. Unfortunately, currently available systems fail to address this problem in a comprehensive manner. It is more difficult for non-experts to identify these critical data items because of their lack of familiarity and due to the fact that data systems are constantly changing. By identifying the critical data items automatically, security engineers will be better prepared to protect what is critical to the mission of the organization and also have the ability to focus their security efforts on these critical data items. We have developed an algorithm that scans the database logs and forms a directed graph showing which items influence a large number of other items and at what frequency this influence occurs. This graph is traversed to reveal the data items which have a large influence throughout the database system by using a novel metric based formula. These items are critical to the system because if they are maliciously altered or stolen, the malicious alterations will spread throughout the system, delaying recovery and causing a much more malignant effect. As these items have significant influence, they are deemed to be critical and worthy of extra security measures. Our proposal is not intended to replace existing intrusion detection systems, but rather is intended to complement current and future technologies. Our proposal has never been performed before, and our experimental results have shown that it is very effective in revealing critical data items automatically.
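
    The core idea, scanning the logs for data dependencies, building a directed influence graph, and ranking items by how widely and how often they influence others, can be sketched as follows. The log format and the scoring formula are illustrative assumptions, not the authors' exact metric.

```python
# Sketch of critical-data-item ranking: each log entry lists the items a
# transaction read and the items it then wrote, inducing directed edges
# read -> written. Items are scored by the number of items they can reach times
# their influence frequency. Log format and scoring formula are assumptions.
from collections import defaultdict

log = [  # hypothetical transaction log: (items read, items written)
    ({"price"}, {"tax", "total"}),
    ({"tax", "total"}, {"invoice"}),
    ({"price"}, {"total"}),
    ({"quantity"}, {"total"}),
]

edge_count = defaultdict(int)
adjacency = defaultdict(set)
for reads, writes in log:
    for r in reads:
        for w in writes:
            edge_count[(r, w)] += 1
            adjacency[r].add(w)

def influence(item):
    seen, stack = set(), [item]
    while stack:                      # items transitively reachable from `item`
        for nxt in adjacency[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    freq = sum(n for (r, _), n in edge_count.items() if r == item)
    return len(seen) * freq           # breadth of influence x frequency

for item in sorted(adjacency, key=influence, reverse=True):
    print(item, influence(item))      # highest score = most critical candidate
```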

  3. Remanufacturing system based on totally automatic MIG surfacing via robot

    Institute of Scientific and Technical Information of China (English)

    ZHU Sheng; GUO Ying-chun; YANG Pei

    2005-01-01

    A remanufacturing system is a green system engineering project which conforms to the national sustainable development strategy. It must offer high adaptability to the variety of waste machined parts, a short production cycle, low machining cost and high product quality. Each step of the remanufacturing system, from the initial scanning to the completion of the welding, was investigated. Aiming at building a remanufacturing system based on fully automatic MIG surfacing via robot, advanced information technology, remanufacturing technology and management were applied, with control of the pretreatment and optimization to minimize the remanufacturing time and realize the remanufacturing of a variety of end products. The steps mainly include: 1) using the visual sensor installed at the end of the robot to rapidly acquire the outline data of the machined part, and pretreating the data; 2) rebuilding the curved surface based on the outline data and the integrated CAD material object model; 3) building the remanufacturing model based on the CAD material object model and planning the remanufacturing process; and 4) accomplishing the remanufacture of the machined part by the technology of MIG surfacing.

  4. Automatic control system of the radiometric system for inspection of large-scale vehicles and cargoes

    International Nuclear Information System (INIS)

    The automatic control system (ACS) is intended to control the equipment of the radiometric inspection system in the normal operating modes as well as during the preventive maintenance, maintenance/repair and adjustment works; for acquisition of the data on the status of the equipment, reliable protection of the personnel and equipment, acquisition, storage and processing of the results of operation and to ensure service maintenance.

  5. Automatic classification of oranges using image processing and data mining techniques

    OpenAIRE

    Mercol, Juan Pablo; Gambini, María Juliana; Santos, Juan Miguel

    2008-01-01

    Data mining is the discovery of patterns and regularities from large amounts of data using machine learning algorithms. This can be applied to object recognition using image processing techniques. In fruits and vegetables production lines, the quality assurance is done by trained people who inspect the fruits while they move in a conveyor belt, and classify them in several categories based on visual features. In this paper we present an automatic orange’s classification system, which us...

  6. Channel Access Algorithm Design for Automatic Identification System

    Institute of Scientific and Technical Information of China (English)

    Oh Sang-heon; Kim Seung-pum; Hwang Dong-hwan; Park Chan-sik; Lee Sang-jeong

    2003-01-01

    The Automatic Identification System (AIS) is a maritime equipment to allow an efficient exchange of the navigational data between ships and between ships and shore stations. It utilizes a channel access algorithm which can quickly resolve conflicts without any intervention from control stations. In this paper, a design of channel access algorithm for the AIS is presented. The input/output relationship of each access algorithm module is defined by drawing the state transition diagram, dataflow diagram and flowchart based on the technical standard, ITU-R M.1371. In order to verify the designed channel access algorithm, the simulator was developed using the C/C++ programming language. The results show that the proposed channel access algorithm can properly allocate transmission slots and meet the operational performance requirements specified by the technical standard.

  7. ATLAS Offline Data Quality System Upgrade

    CERN Document Server

    Farrell, Steve

    2012-01-01

    The ATLAS data quality software infrastructure provides tools for prompt investigation of and feedback on collected data, and for propagation of these results to analysis users. Both manual and automatic inputs are used in this system. In 2011, we upgraded our framework to record all issues affecting the quality of the data in a manner which allows users to extract as much of the data as possible for their particular analyses. By improving the recording of issues, we gain the ability to reassess the impact of data quality on different physics measurements and to adapt accordingly. We have gained significant experience with collision data operations and analysis; we have used this experience to improve the data quality system, particularly in the areas of scaling and user interface. This document describes the experience gained in assessing and recording the data quality of ATLAS and the subsequent benefits to analysis users.

  8. 40 CFR 51.365 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Data collection. 51.365 Section 51.365 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR....365 Data collection. Accurate data collection is essential to the management, evaluation,...

  9. Innovative Data Collection Strategies in Qualitative Research

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Collins, Kathleen M. T.

    2010-01-01

    This article provides an innovative meta-framework comprising strategies designed to guide qualitative data collection in the 21st century. We present a meta-framework comprising strategies for collecting data from interviews, focus groups, observations, and documents/material culture. We present a template for collecting nonverbal data during…

  10. Neuro-fuzzy system modeling based on automatic fuzzy clustering

    Institute of Scientific and Technical Information of China (English)

    Yuangang TANG; Fuchun SUN; Zengqi SUN

    2005-01-01

    A neuro-fuzzy system model based on automatic fuzzy clustering is proposed. A hybrid model identification algorithm is also developed to decide the model structure and model parameters. The algorithm mainly includes three parts: 1) automatic fuzzy C-means (AFCM), which is applied to generate fuzzy rules automatically and then fix the size of the neuro-fuzzy network, by which the complexity of system design is reduced greatly at the price of the fitting capability; 2) recursive least squares estimation (RLSE), which is used to update the parameters of the Takagi-Sugeno model employed to describe the behavior of the system; 3) a gradient descent algorithm, proposed for the fuzzy values according to the back propagation algorithm of neural networks. Finally, modeling the dynamical equation of a two-link manipulator with the proposed approach is illustrated to validate the feasibility of the method.
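
    The first stage of the algorithm, fuzzy C-means clustering used to generate the rules, can be illustrated with a compact standard FCM sketch. The data, the fixed number of clusters and the fuzzifier value below are assumptions; the automatic selection of the cluster count described in the paper is not reproduced.

```python
# Compact standard fuzzy C-means sketch. The automatic choice of the number of
# clusters (AFCM) is not reproduced; data, c and the fuzzifier m are assumptions.
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                              # memberships, columns sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                 # standard FCM membership update
        U /= U.sum(axis=0)
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
centers, U = fcm(X)
print(centers)   # approximately the two cluster centres near (0,0) and (3,3)
```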

  11. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    Directory of Open Access Journals (Sweden)

    Wildeman Maarten A

    2011-08-01

    Full Text Available Background Data collection by Electronic Medical Record (EMR) systems has been proven to be helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands a web based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a Clinical Trial Data Management service (CTDMS) composed of electronic Case Report Forms (eCRF) can result in effective data collection and treatment monitoring. Methods Data items entered were checked for inconsistencies automatically when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both Primary and Secondary items, over the first five months of the trial. Results In the first five months 51 patients were entered. The Primary data error rate was 1.6%, whilst that for Secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5% respectively. Conclusion The presented analysis shows that five months after the introduction of the CTDMS the Primary and Secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of the data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols.

  12. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    Science.gov (United States)

    2011-01-01

    Background Data collection by Electronic Medical Record (EMR) systems has been proven to be helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands a web based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a Clinical Trial Data Management service (CTDMS) composed of electronic Case Report Forms (eCRF) can result in effective data collection and treatment monitoring. Methods Data items entered were checked for inconsistencies automatically when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both Primary and Secondary items, over the first five months of the trial. Results In the first five months 51 patients were entered. The Primary data error rate was 1.6%, whilst that for Secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5% respectively. Conclusion The presented analysis shows that five months after the introduction of the CTDMS the Primary and Secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of the data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols. PMID:21824421

  13. Truck Roll Stability Data Collection and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, SS

    2001-07-02

    The principal objective of this project was to collect and analyze vehicle and highway data that are relevant to the problem of truck rollover crashes, and in particular to the subset of rollover crashes that are caused by the driver error of entering a curve at a speed too great to allow safe completion of the turn. The data are of two sorts--vehicle dynamic performance data, and highway geometry data as revealed by vehicle behavior in normal driving. Vehicle dynamic performance data are relevant because the roll stability of a tractor trailer depends both on inherent physical characteristics of the vehicle and on the weight and distribution of the particular cargo that is being carried. Highway geometric data are relevant because the set of crashes of primary interest to this study are caused by lateral acceleration demand in a curve that exceeds the instantaneous roll stability of the vehicle. An analysis of data quality requires an evaluation of the equipment used to collect the data because the reliability and accuracy of both the equipment and the data could profoundly affect the safety of the driver and other highway users. Therefore, a concomitant objective was an evaluation of the performance of the set of data-collection equipment on the truck and trailer. The objective concerning evaluation of the equipment was accomplished, but the results were not entirely positive. Significant engineering apparently remains to be done before a reliable system can be fielded. Problems were identified with the trailer to tractor fiber optic connector used for this test. In an over-the-road environment, the communication between the trailer instrumentation and the tractor must be dependable. In addition, the computer in the truck must be able to withstand the rigors of the road. The major objective--data collection and analysis--was also accomplished. Using data collected by instruments on the truck, a ''bad-curve'' database can be generated. Using

  14. Automatic calibration system for VENUS lead glass counters

    International Nuclear Information System (INIS)

    Automatic calibration system for VENUS lead glass counters has been constructed. It consists of a moving table, position sensors, control electronics and a master minicomputer, (micro-11 of DEC). The system has been well operated for six months and one third of VENUS lead glass counters have been calibrated. (author)

  15. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary control equipment so as to bring the reactor to a hot stand-by condition automatically, and to energize the decay heat removal system. The performance of this system was evaluated by comparing the results obtained with the Nuclear Ship Engineering Simulation System (NESSY) against those measured in the scram test of the nuclear ship `Mutsu`. As a result, it was shown that this system has sufficient performance to bring a reactor to a hot stand-by condition quickly and safely. (author)

  16. Automatic power distribution system for Okinawa Electric Power Co.; Okinawa Denryoku (kabu) nonyu haiden jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-02-29

    The open distributed automatic power distribution systems were delivered to the Naha and Gushikawa branches of Okinawa Electric Power Co. This system adopts such latest technologies as object-oriented design. Its features are as follows: (1) Possible parallel operation by every branch and business office, by switching the operation priority between the branch and business office in the case of multiple accidents; (2) Possible free console operation for any business regardless of the other console conditions; (3) Automatic decision of power supply by a mobile power vehicle for precise power interruption control; (4) Immediate display of the work planning system, including data maintenance and operation procedures, until the prearranged working date; and (5) Possible manned backup operation at system down of the server. (translated by NEDO)

  17. Automatic Identification of Antibodies in the Protein Data Bank

    Institute of Scientific and Technical Information of China (English)

    LI Xun; WANG Renxiao

    2009-01-01

    An automatic method has been developed for identifying antibody entries in the Protein Data Bank (PDB). Our method, called KIAb (Keyword-based Identification of Antibodies), parses PDB-format files to search for particular keywords relevant to antibodies and makes a judgment accordingly. Our method identified 780 entries as antibodies in the entire PDB. Among them, 767 entries were confirmed by manual inspection, indicating a high success rate of 98.3%. Our method recovered essentially all of the entries compiled in the Summary of Antibody Crystal Structures (SACS) database. It also identified a number of entries missed by SACS. Our method thus provides a more complete mining of antibody entries in the PDB with a very low false positive rate.

  18. Evolutionary synthesis of automatic classification on astroinformatic big data

    Science.gov (United States)

    Kojecky, Lumir; Zelinka, Ivan; Saloun, Petr

    2016-06-01

    This article describes initial experiments using a new approach to the automatic identification of Be and B[e] star spectra in large archives. With the enormous amount of such data, it is no longer feasible to analyse them using classical approaches. We introduce an evolutionary synthesis of the classification by means of analytic programming, one of the methods of symbolic regression. By this method, we synthesize the mathematical formulas that best approximate chosen samples of the stellar spectra. A spectrum is then assigned to the category whose formula shows the lowest difference from that particular spectrum. The results show that classification of stellar spectra by means of analytic programming is able to identify different shapes of the spectra.
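
    The final classification step described, assigning a spectrum to the category whose synthesized formula approximates it best, can be sketched as below. The candidate formulas and the sample "spectrum" are placeholders, since the synthesis of the formulas by analytic programming is beyond this snippet.

```python
# Sketch of the classification step only: each category owns a formula (in the
# paper, synthesized by analytic programming); a spectrum is assigned to the
# category whose formula fits it with the lowest error. Formulas and the sample
# spectrum below are placeholders, not results from the paper.
import numpy as np

wavelengths = np.linspace(0.0, 1.0, 200)

category_formulas = {
    "emission-like":    lambda x: 1.0 + 0.8 * np.exp(-((x - 0.5) ** 2) / 0.002),
    "shallow-emission": lambda x: 1.0 + 0.4 * np.exp(-((x - 0.5) ** 2) / 0.01),
    "absorption-like":  lambda x: 1.0 - 0.6 * np.exp(-((x - 0.5) ** 2) / 0.002),
}

def classify(spectrum):
    errors = {name: float(np.mean((f(wavelengths) - spectrum) ** 2))
              for name, f in category_formulas.items()}
    return min(errors, key=errors.get)

observed = 1.0 + 0.75 * np.exp(-((wavelengths - 0.5) ** 2) / 0.002)
print(classify(observed))   # -> "emission-like"
```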

  19. Water quality, meteorological, and nutrient data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) from January 1, 1995 to August 1, 2011 (NCEI Accession 0052765)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 26...

  20. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  1. Information Collection System of Crop Growth Environment Based on the Internet of Things

    Institute of Scientific and Technical Information of China (English)

    Hua YU; Guangyu ZHANG; Ningbo LU

    2013-01-01

    Based on Internet of Things technology, and addressing the problems of large-volume data acquisition and difficult real-time transmission in monitoring the crop growth environment, this paper designs an information collection system for the crop growth environment. A range-free localization mechanism to determine node positions and the GEAR routing mechanism are used to solve the problems of node localization, routing protocol application and so on. The system can realize accurate, automatic, real-time collection, aggregation and transmission of crop growth environment information, and can automate agricultural production to the maximum extent.

  2. Science data collection with polarimetric SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Woelders, Kim; Madsen, Søren Nørvang

    1996-01-01

    Discusses examples of the use of polarimetric SAR in a number of Earth science studies. The studies are presently being conducted by the Danish Center for Remote Sensing. A few studies of the European Space Agency's EMAC programme are also discussed. The Earth science objectives are presented, and the potential of polarimetric SAR is discussed and illustrated with data collected by the Danish airborne EMISAR system during a number of experiments in 1994 and 1995. The presentation includes samples of data acquired for the different studies.

  3. CrespoDynCoopNet DATA Collections

    OpenAIRE

    Crespo Solana, Ana; Sánchez-Crespo Camacho, Juan Manuel; Maestre Martínez, Roberto

    2010-01-01

    The collected data are stored into a Microsoft Access® database that has been designed to be physically integrated into a GIS system. The main structure of this initial database is built around the main table, named ‘AGENTS’, in which all biographic data related to the individual agents are entered taking into account the various ‘worlds’ each agent belongs to – social, economic etc. An individual study and classification has been carried out for each agent; then an attempt has been made to u...

  4. The validity of a monitoring system based on routinely collected dairy cattle health data relative to a standardized herd check.

    Science.gov (United States)

    Brouwer, H; Stegeman, J A; Straatsma, J W; Hooijer, G A; Schaik, G van

    2015-11-01

    Dairy cattle health is often assessed during farm visits. However, farm visits are time consuming and cattle health is assessed at only one point in time. Moreover, farm visits are poorly comparable and/or repeatable when inspection is carried out by many different professionals. Many countries register cattle health parameters such as bulk milk somatic cell count (BMSCC) and mortality in central databases. A great advantage of such routinely available data is that they are uniformly gathered and registered throughout time. This makes comparison between dairy cattle herds possible and could result in opportunities to develop reliable tools for assessing cattle health based on routinely available data. In 2005, a monitoring system for the assessment of cattle health in Dutch dairy herds based on routinely available data was developed. This system had to serve as an alternative for the compulsory quarterly farm visits, which were implemented in 2002. However, before implementation of the alternative system for dairy cows, the validity of the data-based monitoring system and the compulsory quarterly visits relative to the real health status of the herd should be known. The aim of this study was to assess the validity of the data-based monitoring system and the compulsory quarterly visits relative to a standardized herd check for detecting dairy herds with health problems. The results showed that routinely available data can be used to develop an effective screening instrument for detecting herds with poor cattle health. Routinely available data such as cattle mortality and BMSCC that were used in this study had a significant association with animal-based measurements such as the general health impression of the dairy cows (including e.g. rumen fill and body condition). Our study supports the view that cattle health parameters based on routinely available data can serve as a tool for detecting herds with a poor cattle health status which can reduce the number of

  5. Design of automatic leveling and centering system of theodolite

    Science.gov (United States)

    Liu, Chun-tong; He, Zhen-Xin; Huang, Xian-xiang; Zhan, Ying

    2012-09-01

    To realize theodolite automation and improve azimuth angle measurement, an automatic leveling and centering system for the theodolite with a leveling error compensation function is designed, covering the system solution, the selection of key components, the mechanical structure for leveling and centering, and the system software. The redesigned leveling feet are driven by DC servo motors, and an electronically controlled centering device is installed. Using high-precision tilt sensors as horizontal tilt detection sensors ensures the effectiveness of the leveling error compensation. The centre of the aiming mark is located using digital image processing on a surface-array CCD, and the centering measurement precision can reach the pixel level, which makes accurate centering of the theodolite possible. Finally, experiments are conducted with the automatic leveling and centering system of the theodolite. The results show that the leveling and centering system can realize automatic operation with a high centering accuracy of 0.04 mm. The measurement precision of the orientation angle after leveling error compensation is improved compared with that of the traditional method. The automatic leveling and centering system of the theodolite can satisfy the requirements of measuring precision and automation.
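
    The centring step, locating the centre of the aiming mark in the CCD image, can be sketched with a simple intensity-weighted centroid; the synthetic image and the plain centroid method are stand-ins for the paper's actual image-processing pipeline, not taken from it.

```python
# Sketch: locate the aiming-mark centre in a CCD image with an intensity-weighted
# centroid. The synthetic image and the threshold are assumptions.
import numpy as np

def mark_center(image, threshold=0.5):
    mask = image > threshold * image.max()        # keep only the bright mark
    ys, xs = np.nonzero(mask)
    weights = image[ys, xs]
    cx = float((xs * weights).sum() / weights.sum())
    cy = float((ys * weights).sum() / weights.sum())
    return cx, cy                                 # sub-pixel centre estimate

yy, xx = np.mgrid[0:100, 0:100]
image = np.exp(-((xx - 37.3) ** 2 + (yy - 61.8) ** 2) / 20.0)   # synthetic mark
print(mark_center(image))   # close to (37.3, 61.8)
```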

  6. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kind of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase, that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error prone way using different software tools. Thus novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  8. A Study of Applications of Multiagent System Specificaitons and the Key Techniques in Automatic Abstracts System

    Institute of Scientific and Technical Information of China (English)

    HU Shun-geng; ZHONG Yi-xin

    2001-01-01

    In this thesis, multiagent system specifications, multiagent system architectures, agent communication languages and agent communication protocols, and automatic abstracting based on multiagent technologies are studied. Some problems concerning the design and realization of automatic abstracting systems based on multiagent technologies are studied as well. Chapter 1 shows the significance and objectives of the thesis, summarizes its main contents, and presents the innovations of the thesis. Some basic concepts of agents and multiagent systems are studied in Chapter 2. The definitions of agents and multiagent systems are given; the theory, technologies and applications of multiagent systems are summarized; and some important research trends of multiagent systems are set forward. Multiagent system specifications are studied in Chapter 3. MAS/KIB, a multiagent system specification, is built using mental states such as K (Know), B (Belief) and I (Intention); its grammar and semantics are discussed, axioms and inference rules are given, and some properties are researched. We also compare MAS/KIB with other existing specifications. MAS/KIB has the following characteristics: (1) each agent has its own world outlook; (2) there is no global data in the system; (3) processes of state changes are used as indexes to systems; (4) it has the characteristics of not only temporal logic but also dynamic logic; and (5) interactive actions are included. The architectures of multiagent systems are studied in Chapter 4. First, we review some typical architectures of multiagent systems: agent network architecture, agent federated architecture, agent blackboard architecture, and the Foundation for Intelligent Physical Agents (FIPA) architecture. For the first time, we set forward and study the layering and partitioning models of the architectures of multiagent systems, organizing architecture models, and the interoperability architecture model of multiagent systems. Chapter 5 studies agent communication lan

  9. EBT data acquisition and analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Burris, R.D.; Greenwood, D.E.; Stanton, J.S.; Geoffroy, K.A.

    1980-10-01

    This document describes the design and implementation of a data acquisition and analysis system for the EBT fusion experiment. The system includes data acquisition on five computers, automatic transmission of that data to a large, central data base, and a powerful data retrieval system. The system is flexible and easy to use, and it provides a fully documented record of the experiments.

  10. Automatic Extraction of Mangrove Vegetation from Optical Satellite Data

    Science.gov (United States)

    Agrawal, Mayank; Sushma Reddy, Devireddy; Prasad, Ram Chandra

    2016-06-01

    Mangroves, the intertidal halophytic vegetation, form one of the most significant and diverse ecosystems in the world. They protect the coast from sea erosion and other natural disasters such as tsunamis and cyclones. In view of their increasing destruction and degradation, mapping of this vegetation is a priority. Globally, researchers have mapped mangrove vegetation using visual interpretation, digital classification approaches, or a combination of both (hybrid approaches) with varied spatial and spectral data sets. In the recent past, techniques have been developed to extract this coastal vegetation automatically using various algorithms. In the current study we delineate mangrove vegetation using LISS III and Landsat 8 data sets for selected locations in the Andaman and Nicobar Islands. Towards this, we attempt both a segmentation method, which characterizes the mangrove vegetation by its tone and texture, and a pixel-based classification method, in which mangroves are identified by their pixel values. The results obtained from both approaches are validated using maps available for the selected region, and good accuracy with respect to delineation is obtained. The main focus of this paper is the simplicity of the methods and the availability of the data on which they are applied, as these (Landsat) data are readily available for many regions. Our methods are very flexible and can be applied to any region.
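
    The pixel-based part of the approach, identifying mangrove pixels from their spectral values, can be illustrated with a simple threshold sketch on vegetation and moisture indicators. The band layout and threshold values are assumptions, not the criteria used in the study.

```python
# Toy pixel-based classification sketch: flag pixels as candidate mangrove when
# they are strongly vegetated (high NDVI) over a wet, dark background (low SWIR).
# Band layout and thresholds are assumptions, not values from the study.
import numpy as np

def mangrove_mask(red, nir, swir, ndvi_min=0.4, swir_max=0.15):
    ndvi = (nir - red) / (nir + red + 1e-6)
    return (ndvi > ndvi_min) & (swir < swir_max)

# Hypothetical 2x2 reflectance arrays for three bands.
red  = np.array([[0.05, 0.20], [0.04, 0.06]])
nir  = np.array([[0.40, 0.30], [0.35, 0.10]])
swir = np.array([[0.10, 0.30], [0.12, 0.05]])
print(mangrove_mask(red, nir, swir))
```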

  11. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA)

    OpenAIRE

    Veena G.S; Chandrika Prasad; Khaleel K

    2013-01-01

    The proposed work aims to create a smart application camera, with the intention of eliminating the need for a human presence to detect any unwanted sinister activities, such as theft in this case. Spread across the campus are certain valuable biometric identification systems at arbitrary locations. The application monitors these systems (hereafter referred to as "object") using our smart camera system based on an OpenCV platform. By using OpenCV Haar Training, employing the Vio...

  12. Reliability of the TJ-II power supply system: Collection and analysis of the operational experience data

    International Nuclear Information System (INIS)

    During a TJ-II pulse, the provision of magnetic fields requires a total amount of power exceeding 80 MVA. This power is supplied by a 132 MVA flywheel generator (15 kV output voltage, 80-100 Hz output frequency) and the related motor, transformers, breakers, rectifiers, regulators, protections, busbars, connections, etc. Failure data of these main components have been collected, identified and processed, including information on failure modes and, where possible, the causes of the failures. The main statistical values of the failure rates for the period from May 1998 to December 2004 have been calculated and are ready to be compared with those of the International Fusion Component Failure Rate Database (FCFR-DB)

  13. Reliability of the TJ-II power supply system: Collection and analysis of the operational experience data

    Energy Technology Data Exchange (ETDEWEB)

    Izquierdo, Jesus [Fusion Energy Engineering Laboratory, Seccio d' Enginyeria Nuclear, Universitat Politecnica de Catalunya, Avda. Diagonal 647, 08028 Barcelona (Spain)], E-mail: jesus.izquierdo@upc.edu; Dies, Javier; Garcia, Jeronimo; Tapia, Carlos [Fusion Energy Engineering Laboratory, Seccio d' Enginyeria Nuclear, Universitat Politecnica de Catalunya, Avda. Diagonal 647, 08028 Barcelona (Spain); Alonso, Javier; Ascasibar, Enrique; Medrano, Mercedes; Mendez, Purificacion; Rodriguez, Lina [Asociacion EURATOM-CIEMAT para la Fusion, Avda. Complutense 22, Madrid (Spain)

    2007-10-15

    During a TJ-II pulse, the provision of magnetic fields requires a total amount of power exceeding 80 MVA. This power is supplied by a 132 MVA flywheel generator (15 kV output voltage, 80-100 Hz output frequency) and the related motor, transformers, breakers, rectifiers, regulators, protections, busbars, connections, etc. Failure data of these main components have been collected, identified and processed, including information on failure modes and, where possible, the causes of the failures. The main statistical values of the failure rates for the period from May 1998 to December 2004 have been calculated and are ready to be compared with those of the International Fusion Component Failure Rate Database (FCFR-DB)

  14. Analysis on the Influence of Automatic Station Temperature Data on the Sequence Continuity of Historical Meteorological Data

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data which were measured by the automatic meteorological station and the corresponding artificial observation data during January-December in 2001, the monthly average, maximum and minimum temperatures in the automatic station were compared with the corresponding artificial observation temperature data in the parallel observation peri...

  15. Midterm Report on Data Collection

    DEFF Research Database (Denmark)

    Gelsing, Lars; Linde, Lisbeth Tved

    In the MERIPA project this report concerns data availability in order to make future cluster and network analyses in the MERIPA regions. At the same time, discussions about methodology are being started.

  16. Design and development of real time data push in web-based automatic irrigation control system

    Institute of Scientific and Technical Information of China (English)

    李淑华; 郝星耀; 周清波; 潘瑜春

    2015-01-01

    Web-based automatic irrigation control is a main trend in current water-saving technology development. In order to provide personalized irrigation control schemes and precise water metering, the system needs high real-time data transmission performance. The real-time performance of web applications is currently poor and can hardly meet the needs of accurate irrigation control. Aiming at this problem, this paper analyses the structure and the bottleneck of real-time data transmission in a web-based automatic irrigation control system, and proposes a data push scheme to improve real-time performance. Based on the observer pattern, the data push between the data layer and the logic layer, and between the logic layer and the presentation layer, is designed in detail. In the former data transmission process, the observed object is a database table and the observer is a monitoring program on the web server. After data are inserted into the table, the database immediately triggers a stored procedure to notify the relevant web service program and update the data subject. In the latter data transmission process, the observed object is a program object running on the web server, and the observer is a client program object running in the browser. Because the web service program cannot directly initiate a data connection to the client program, in order to implement the observer pattern it is essential to establish a real-time two-way data connection while the client program is loading. By subscribing to a group of data subjects, the client program then receives real-time data pushes as soon as the data subjects are updated. The connection between client and web server is established and maintained by a client connection request, and a streaming connection is then invoked by the client, through which the web server streams data down to the client with no polling overhead. But these client-to-server messages are not sent over the streaming connection, instead an
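
    The observer pattern described above (a data subject notifies its subscribers the moment it is updated, instead of the subscribers polling for changes) can be sketched in plain code as follows. The class names and the console "push" are illustrative; the paper's implementation sits between database, web server and browser rather than inside one process.

```python
# Minimal observer-pattern sketch of the push scheme: a data subject notifies all
# subscribed observers immediately on update, with no polling. Names and the
# console "push" are illustrative only.

class DataSubject:
    def __init__(self, name):
        self.name = name
        self.value = None
        self._observers = []

    def subscribe(self, observer):
        self._observers.append(observer)

    def update(self, value):               # e.g. fired when a new row is inserted
        self.value = value
        for obs in self._observers:
            obs.notify(self, value)        # push to every subscriber at once

class ClientObserver:
    def __init__(self, client_id):
        self.client_id = client_id

    def notify(self, subject, value):
        print(f"push to {self.client_id}: {subject.name} = {value}")

soil_moisture = DataSubject("soil_moisture")
soil_moisture.subscribe(ClientObserver("browser-1"))
soil_moisture.subscribe(ClientObserver("browser-2"))
soil_moisture.update(23.5)   # both clients receive the new reading immediately
```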

  17. Automatic Registration of Multi-Source Data Using Mutual Information

    Science.gov (United States)

    Parmehr, E. G.; Zhang, C.; Fraser, C. S.

    2012-07-01

    Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods, a Gaussian Mixture Model (GMM) and Kernel Density Estimation, have been used to define the weight function of the joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate the parameters of the GMM, and in order to reduce the cost of computation, a multi-resolution strategy has been used. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated and the results obtained are presented.
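
    For readers unfamiliar with the underlying similarity measure, the sketch below computes plain (unweighted) mutual information from the joint grey-level histogram of two co-registered patches; it is a generic illustration with synthetic arrays, not the authors' GWMI code.

      # Generic mutual-information computation between two equally sized images.
      import numpy as np

      def mutual_information(img_a, img_b, bins=64):
          joint_hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
          pxy = joint_hist / joint_hist.sum()          # joint probability
          px = pxy.sum(axis=1, keepdims=True)          # marginal of img_a
          py = pxy.sum(axis=0, keepdims=True)          # marginal of img_b
          nz = pxy > 0                                 # avoid log(0)
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # A registration search would evaluate this over candidate transforms
      # (shifts, rotations) and keep the one that maximises the measure.
      optical = np.random.rand(128, 128)
      lidar_intensity = np.random.rand(128, 128)
      print(mutual_information(optical, lidar_intensity))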

  18. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
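
    The two-stage idea (per-timestep feature detection followed by integration across timesteps) can be caricatured as follows; the energy threshold, field names and tiny synthetic data are invented for illustration and do not come from the actual analysis tool.

      # Schematic per-timestep selection plus temporal integration of particle IDs.
      import numpy as np

      def select_beam_particles(timestep, energy_threshold):
          # Step 1: per-timestep feature detection (here simply an energy cut).
          mask = timestep["energy"] > energy_threshold
          return set(timestep["pid"][mask])

      def trace_beam(timesteps, energy_threshold=1.0e8):
          # Step 2: integrate across timesteps so the bunch can be followed
          # as a temporally dynamic feature.
          traced, counts = set(), []
          for ts in timesteps:
              traced |= select_beam_particles(ts, energy_threshold)
              counts.append(len(traced))
          return traced, counts

      steps = [
          {"pid": np.arange(5), "energy": np.array([1e6, 2e8, 5e7, 3e8, 1e5])},
          {"pid": np.arange(5), "energy": np.array([1e6, 2e8, 4e8, 3e8, 1e5])},
      ]
      ids, counts = trace_beam(steps)
      print(sorted(ids), counts)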

  19. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2012-07-01

    Full Text Available Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods, a Gaussian Mixture Model (GMM) and Kernel Density Estimation, have been used to define the weight function of the joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate the parameters of the GMM, and in order to reduce the cost of computation, a multi-resolution strategy has been used. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated and the results obtained are presented.

  20. Automatic beam path analysis of laser wakefield particle acceleration data

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Wu, Kesheng; Prabhat; Weber, Gunther H; Ushizima, Daniela M; Hamann, Bernd; Bethel, Wes [Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Geddes, Cameron G R; Cormier-Michel, Estelle [LOASIS program of Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Messmer, Peter [Tech-X Corporation, 5621 Arapahoe Avenue Suite A, Boulder, CO 80303 (United States); Hagen, Hans [International Research Training Group ' Visualization of Large and Unstructured Data Sets-Applications in Geospatial Planning, Modeling, and Engineering' , Technische Universitaet Kaiserslautern, Erwin-Schroedinger-Strasse, D-67653 Kaiserslautern (Germany)], E-mail: oruebel@lbl.gov

    2009-01-01

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  1. Simulation of the TREAT-Upgrade Automatic Reactor Control System

    International Nuclear Information System (INIS)

    This paper describes the design of the Automatic Reactor Control System (ARCS) for the Transient Reactor Test Facility (TREAT) Upgrade. A simulation was used to facilitate the ARCS design and to completely test and verify its operation before installation at the TREAT facility

  2. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method to select the most adaptive classifier based on the locations of the human candidates. Human regions are detected effectively from the fish-eye image and are corrected for perspective distortion. An experiment is performed on indoor video sequences with different illumination and crowded conditions, with results demonstrating the efficiency of our algorithm.

  3. Examination techniques of the automatics fire detection monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Yon Woo [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-04-01

    A variety of automatic fire detection monitoring systems have been developed as multistory buildings have come to be constructed with various structural materials. To stop the spread of a fire and minimize the damage to human life and to the property of the facility, the alarm should be communicated precisely to all members of the facility. (author). 12 refs., 28 figs.

  4. Liquid scintillation counting system with automatic gain correction

    International Nuclear Information System (INIS)

    An automatic liquid scintillation counting apparatus is described including a scintillating medium in the elevator ram of the sample changing apparatus. An appropriate source of radiation, which may be the external source for standardizing samples, produces reference scintillations in the scintillating medium which may be used for correction of the gain of the counting system

  5. Building an Image-Based System to automatically Score psoriasis

    DEFF Research Database (Denmark)

    Gómez, D. Delgado; Carstensen, Jens Michael; Ersbøll, Bjarne Kjær

    2003-01-01

    images. The system is tested on patients with the dermatological disease psoriasis. Temporal series of images are taken for each patient and the lesions are automatically extracted. Results indicate that the images obtained are a good source for deriving variables to track the lesion....

  6. Auditory signal design for automatic number plate recognition system

    NARCIS (Netherlands)

    Heydra, C.G.; Jansen, R.J.; Van Egmond, R.

    2014-01-01

    This paper focuses on the design of an auditory signal for the Automatic Number Plate Recognition system of Dutch national police. The auditory signal is designed to alert police officers of suspicious cars in their proximity, communicating priority level and location of the suspicious car and takin

  7. Automatic calorimetry system monitors RF power

    Science.gov (United States)

    Harness, B. W.; Heiberger, E. C.

    1969-01-01

    Calorimetry system monitors the average power dissipated in a high power RF transmitter. Sensors measure the change in temperature and the flow rate of the coolant, while a multiplier computes the power dissipated in the RF load.

  8. Automatic Tracking Evaluation and Development System (ATEDS)

    Data.gov (United States)

    Federal Laboratory Consortium — The heart of the ATEDS network consists of four SGI Octane computers running the IRIX operating system and equipped with V12 hardware graphics to support synthetic...

  9. Evaluation of automatic exposure control systems in computed tomography

    International Nuclear Information System (INIS)

    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method to see the human body in detail. As CT applications increase, so does concern about patient dose, because of the higher dose levels imparted compared to other diagnostic imaging modalities. The radiology community (radiologists, medical physicists and manufacturers) is working to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance for lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to modulate the dose distribution along the patient scan and between patients, taking into account their sizes and irradiated tissue densities. Because of the CT scanning geometry, AEC-systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC-systems and their susceptibilities, to assist in possible patient dose optimizations. The approach used to evaluate the AEC-systems of three of the leading CT manufacturers in Brazil, General Electric, Philips and Toshiba, was the extraction of tube current modulation data from the DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, characterizes the performance of the AEC-system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC-system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations of the AEC-system applications and, by consequence, decrease the patient dose without
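
    Tube current modulation of the kind mentioned above can be read from a CT series with standard DICOM tooling; the sketch below is a hedged, generic example (the directory name is hypothetical) that assumes the files carry the X-Ray Tube Current and Slice Location attributes.

      # Read per-slice tube current (mA) from a CT image series with pydicom.
      import glob
      import pydicom

      currents = []
      for path in sorted(glob.glob("ct_series/*.dcm")):
          ds = pydicom.dcmread(path, stop_before_pixels=True)
          # (0018,1151) X-Ray Tube Current; slice location helps ordering along z.
          currents.append((float(ds.get("SliceLocation", 0.0)),
                           float(ds.get("XRayTubeCurrent", 0.0))))

      for z, ma in sorted(currents):
          print(f"z = {z:7.1f} mm   tube current = {ma:5.0f} mA")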

  10. Requirements to a Norwegian National Automatic Gamma Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Lauritzen, B.; Hedemann Jensen, P.; Nielsen, F

    2005-04-01

    An assessment of the overall requirements to a Norwegian gamma-monitoring network is undertaken with special emphasis on the geographical distribution of automatic gamma monitoring stations, type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate increments above the natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume may not slip unnoticed in between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require protective intervention measures should be detected by the system. Action levels for possible introduction of sheltering and precautionary foodstuff restrictions are derived in terms of ambient dose equivalent rate. For emergency situations where particulates contribute with only a small fraction of the total ambient dose equivalent rate from the plume, it is concluded that measurements of dose rate are sufficient to determine the need for sheltering; simple dose rate measurements however, are inadequate to determine the need for foodstuff restrictions and spectral measurements are required. (au)

  11. Automatic Translation of Arabic Sign to Arabic Text (ATASAT) System

    OpenAIRE

    Abdelmoty M.Ahmed; Reda Abo Alez; Muhammad Taha; Gamal Tharwat

    2016-01-01

    Sign language continues to be the preferred tool of communication between the deaf and the hearing-impaired. It is a well-structured code of hand gestures, where every gesture has a specific meaning. This paper has the goal of developing a system for Automatic Translation of Arabic Sign Language to Arabic Text (ATASAT). This system acts as a translator between deaf and dumb people and normal people to enhance their communication, the...

  12. A ORACLE-based system for data collection, storage and analysis of main equipment load factors in NPPs and TPPs

    International Nuclear Information System (INIS)

    This data base is developed by the National Electricity Company, Sofia (BG) as an aid to supervision, analysis and administrative decision making in a variety of operational situations in NPPs and TPPs. As major indicators of the equipment condition the following primary data are stored: steam or electricity production per month; operation hours per month; equipment stand-by outages; planned outages; unplanned permitted maintenance outages; unplanned emergency maintenance outages; number of outages of the unit per month. These data cover the period from the putting of the corresponding equipment into operation until the present moment, i.e. about 32 years. The data up to 1990 are annual and for the last three years - monthly. Based on these primary data, the following quantities are calculated: average capacity; average load factors; operation time factors - total and accounting for the planned and the permitted unplanned outages; unpermitted outage factors - total and accounting for the planned and the permitted outages. All the factors are calculated at the user's request for a chosen time period, by summing up correspondingly the major indicators (production, operation hours and various outages) for the given period. The system operates on an IBM 4341 under VM/SP and DB ORACLE V.5. The input is entered directly from the TPPs and NPPs by telex lines from PCs, operating also as telex machines, into the mainframe of Energokibernetika Ltd. The data are available to all authorised users from local terminals or PCs connected to the computer by synchronous or asynchronous lines. A system for data transmission to remote users over switched telephone lines has also been developed. (R. Ts.)
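
    As a hedged illustration of how such derived factors can be computed from the stored monthly records, the sketch below uses assumed field names and simple textbook definitions of the load and operation time factors; the actual system's definitions may differ.

      # Illustrative derived-factor calculations from monthly primary data.
      def load_factor(energy_mwh, rated_mw, hours_in_period):
          # Average load factor: produced energy over the maximum possible energy.
          return energy_mwh / (rated_mw * hours_in_period)

      def operation_time_factor(operating_hours, hours_in_period):
          # Share of the calendar period during which the unit was on line.
          return operating_hours / hours_in_period

      # One month (720 h) for a hypothetical 1000 MW unit.
      print(load_factor(energy_mwh=612_000, rated_mw=1000, hours_in_period=720))   # 0.85
      print(operation_time_factor(operating_hours=680, hours_in_period=720))       # ~0.94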

  13. Automatic Discharge system from dewatering bin; Sekitantaki boira niokeru dassuiso karano kurinka haraidashi shisutemu no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Iwasaki, Atsushi; Kawakami, Masamichi; Ito, Takayoshi [Chugoku Electric Powers, Co., Inc., Hiroshima (Japan); Kinoshita, Tetsuhiro; Enomoto, Masayuki [Kawasaki Heavy Industries, Ltd., Hyogo (Japan)

    1999-03-15

    At present, discharge of clinker ash from dewatering bins is done by an operator near the mesh conveyor, based on past experience, as the operator observes the condition of the discharged ash on the mesh conveyor. We have valuable data relating to automatic operation from sensor signals (current of the mesh conveyor motor, moisture of the clinker ash, image processing data, open ratio of the ash discharge gate, etc.). We studied the relation between the clinker ash condition and actual operation. Using these data, we were able to construct the 'Automatic discharge system from dewatering bin'. (author)

  14. A rapid two-dimensional data collection system for the study of ferroelectric materials under external applied electric fields

    Science.gov (United States)

    Vergentev, Tikhon; Bronwald, Iurii; Chernyshov, Dmitry; Gorfman, Semen; Ryding, Stephanie H. M.; Thompson, Paul; Cernik, Robert J.

    2016-01-01

    Synchrotron X-rays on the Swiss Norwegian Beamline and BM28 (XMaS) at the ESRF have been used to record the diffraction response of the PMN–PT relaxor piezoelectric 67% Pb(Mg1/3Nb2/3)O3–33% PbTiO3 as a function of externally applied electric field. A DC field in the range 0–18 kV cm−1 was applied along the [001] pseudo-cubic direction using a specially designed sample cell for in situ single-crystal diffraction experiments. The cell allowed data to be collected on a Pilatus 2M area detector in a large volume of reciprocal space using transmission geometry. The data showed good agreement with a twinned single-phase monoclinic structure model. The results from the area detector were compared with previous Bragg peak mapping using variable electric fields and a single detector where the structural model was ambiguous. The coverage of a significantly larger section of reciprocal space facilitated by the area detector allowed precise phase analysis. PMID:27738414

  15. Study on traffic accidents mechanism with automatic recording systems. Part 2. Application of data from ADR and DMR for practical driver education; Jidosha kiroku sochi ni yoru kotsu jiko hassei mechanism no kenkyu. 2. Jiko data kirokukei (ADR) to unko kirokukei (DMR) no untensha kyoiku eno katsuyo

    Energy Technology Data Exchange (ETDEWEB)

    Ueyama, M.; Ogawa, S. [National Research Inst. of Police Science, Tokyo (Japan); Chikasue, H.; Muramatsu, K. [Yazaki Meter Co. Ltd., Tokyo (Japan)

    1997-10-01

    A field trial was carried out using automatic recording systems, an ADR (Accident Data Recorder) and a DMR (Driving Monitoring Recorder), installed on 20 commercial vehicles, in order to assess the implications for driver behavior and accidents. The data suggest that the accident mechanism can be explained in terms of situation-specific factors and the behavior of drivers just before the accident, that is, their attitude to the handling and control of their vehicles. The data might offer new information for practical driver education. 3 refs., 9 figs., 1 tab.

  16. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA

    Directory of Open Access Journals (Sweden)

    Veena G.S

    2013-12-01

    Full Text Available The proposed work aims to create a smart application camera, with the intention of eliminating the need for a human presence to detect any unwanted sinister activities, such as theft in this case. Spread across the campus, at arbitrary locations, are certain valuable biometric identification systems. The application monitors these systems (hereafter referred to as “objects”) using our smart camera system based on an OpenCV platform. By using OpenCV Haar training, employing the Viola-Jones algorithm implementation in OpenCV, we teach the machine to identify the object under environmental conditions. An added feature of face recognition is based on Principal Component Analysis (PCA) to generate eigenfaces, and the test images are verified against the eigenfaces by using a distance-based algorithm, such as the Euclidean distance or Mahalanobis distance algorithm. If the object is misplaced, or an unauthorized user is in the extreme vicinity of the object, an alarm signal is raised.
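
    The eigenface step described above can be sketched generically as follows; the array shapes, the number of components and the distance threshold are invented for illustration and are not the values used in the paper.

      # PCA 'eigenfaces' plus Euclidean-distance verification on synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      train = rng.random((20, 64 * 64))        # 20 enrolled face images, flattened
      test = rng.random(64 * 64)               # one probe image

      mean_face = train.mean(axis=0)
      centered = train - mean_face

      _, _, vt = np.linalg.svd(centered, full_matrices=False)
      eigenfaces = vt[:10]                     # keep 10 principal components

      train_proj = centered @ eigenfaces.T
      test_proj = (test - mean_face) @ eigenfaces.T

      distances = np.linalg.norm(train_proj - test_proj, axis=1)
      THRESHOLD = 50.0                         # would be tuned on validation data
      verdict = "authorized" if distances.min() < THRESHOLD else "unauthorized"
      print("best match:", int(np.argmin(distances)), verdict)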

  17. The analysis and expansion of regulatory binding site data in a wide range of bacteria through the use of a semi-automatic system - RegTransBase

    OpenAIRE

    Cipriano, Michael J.

    2014-01-01

    RegTransBase, a database describing regulatory interactions in prokaryotes, has been developed as a component of the MicrobesOnline/RegTransBase framework successfully used for interpretation of microbial stress response and metal reduction pathways. It is manually curated and based on published scientific literature. RegTransBase describes a large number of regulatory interactions and contains experimental data which investigates regulation with known elements. It is available at http://reg...

  18. Towards the development of Hyperspectral Images of trench walls. Robotrench: Automatic Data acquisition

    Science.gov (United States)

    Ragona, D. E.; Minster, B.; Rockwell, T. K.; Fialko, Y.; Bloom, R. G.; Hemlinger, M.

    2004-12-01

    Previous studies on imaging spectrometry of paleoseismological excavations (Ragona et al., 2003, 2004) showed that low-resolution hyperspectral imagery of a trench wall, processed with a supervised classification algorithm, provided more stratigraphic information than a high-resolution digital photograph of the same exposure. Although the low-resolution images depicted the most important variations, a higher resolution hyperspectral image is necessary to assist in the recognition and documentation of paleoseismic events. Because our spectroradiometer can only acquire one pixel at a time, creating a 25 psi image of a 1 x 1 m area of a trench wall requires 40000 individual measurements. To ease this extensive task we designed and built a device that can automatically position the spectroradiometer probe along the x-z plane of a trench wall. This device, informally named Robotrench, has two 7-foot-long axes of motion (horizontal and vertical) commanded by a stepper motor controller board and a laptop computer. A platform provides the setup for the spectroradiometer probe and for the calibrated illumination system. A small circuit provided the interface between the Robotrench motion and the spectroradiometer data collection. At its best, the Robotrench-spectroradiometer symbiotic pair can automatically record 1500-2000 pixels/hour, making the image acquisition process slow but feasible. At the time of this abstract submission, only a small calibration experiment had been completed. This experiment was designed to calibrate the X-Z axes and to test the instrument performance. We measured a 20 x 10 cm brick wall at a 25 psi resolution. Three reference marks were set up on the trench wall as control points for the image registration process. The experiment was conducted at night under artificial light (stabilized 2 x 50 W halogen lamps). The data obtained were processed with the Spectral Angle Mapper algorithm. The image recovered from the data showed an

  19. 06091 Abstracts Collection -- Data Structures

    OpenAIRE

    Arge, Lars; Sedgewick, Robert; Wagner, Dorothea

    2006-01-01

    From 26.02.06 to 03.03.06, the Dagstuhl Seminar 06091 ``Data Structures'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goa...

  20. Collective Classification in Network Data

    OpenAIRE

    Sen, Prithviraj; Namata, Galileo; Bilgic, Mustafa; Getoor, Lise; University of Maryland; Galligher, Brian; Eliassi-Rad, Tina

    2008-01-01

    Many real-world applications produce networked data such as the world-wide web (hypertext documents connected via hyperlinks), social networks (for example, people connected by friendship links), communication networks (computers connected via communication links) and biological networks (for example, protein interaction networks). A recent focus in machine learning research has been to extend traditional machine learning classification techniques to classify nodes in such networks. In this a...

  1. Automatic speed management systems : great safety potential ?

    NARCIS (Netherlands)

    Oei, H.-l.

    1992-01-01

    An account is given of speed management experiments carried out in The Netherlands on four 2-lane rural roads with a speed limit of 80 km/h. The experiment involved an information campaign, warning signs and a radar camera system. Fixed signs advised a speed of between 60 and 80 km/h and an automati

  2. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas.

    Science.gov (United States)

    Chang, Hsien-Tsung; Chang, Yi-Ming; Tsai, Meng-Tze

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning. PMID:26839529

  3. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas

    Directory of Open Access Journals (Sweden)

    Hsien-Tsung Chang

    2016-01-01

    Full Text Available Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning.

  4. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of the engineering control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, to analyze the possibility of their use in diagnosing the status of automatic blocking devices, and to set targets for the development of new diagnostic systems. Methodology. In order to achieve these objectives, the theoretical-analytical method and the method of functional analysis have been used. Findings. The analysis of existing and future facilities for the remote control and diagnostics of automatic blocking devices has shown that the existing diagnostic systems are not sufficiently informative, being designed primarily to control discrete parameters, which in turn does not allow a decision support subsystem to be constructed. For the development of new technical diagnostic systems it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision support subsystem in the diagnostic system; this will reduce the amount of work required to maintain the blocking devices and reduce the recovery time after a failure occurs. Originality. At present, the existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of the block signalling and locking devices. Criteria for the development of new technical diagnostic systems with increased amounts of diagnostic information and its automatic analysis were proposed. Practical value. These results of the analysis can be used in practice when selecting technical controls for automatic blocking devices, as well as for the further development of diagnostic systems for automatic blocking that allow a gradual transition from a planned preventive maintenance model to one based on the actual state of the monitored devices.

  5. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  6. NIDDK data repository: a central collection of clinical trial data

    Directory of Open Access Journals (Sweden)

    Hall R David

    2006-04-01

    Full Text Available Abstract Background The National Institute of Diabetes and Digestive and Kidney Diseases has established central repositories for the collection of DNA, biological samples, and clinical data to be catalogued at a single site. Here we present an overview of the site which stores the clinical data and links to biospecimens. Description The NIDDK Data Repository is a web-enabled resource cataloguing clinical trial data and supporting information from NIDDK-supported studies. The Data Repository allows for the co-location of multiple electronic datasets that were created as part of clinical investigations. The Data Repository does not serve the role of a Data Coordinating Center, but rather acts as a warehouse for the clinical findings once the trials have been completed. Because both biological and genetic samples are collected in many of the studies, a data management system for the cataloguing and retrieval of samples was developed. Conclusion The Data Repository provides a unique resource for researchers in the clinical areas supported by NIDDK. In addition to providing a warehouse of data, Data Repository staff work with the users to educate them on the datasets as well as assist them in the acquisition of multiple data sets for cross-study analysis. Unlike the majority of biological databases, the Data Repository acts both as a catalogue for data, biosamples, and genetic materials and as a central processing point for requests for all biospecimens. Due to regulations on the use of clinical data, the ultimate release of that data is governed under NIDDK data release policies. The Data Repository serves as the conduit for such requests.

  7. New Approaches to Demographic Data Collection

    OpenAIRE

    Treiman, Donald J.; Lu, Yao; Qi, Yaqiang

    2012-01-01

    As population scientists have expanded the range of topics they study, increasingly considering the interrelationship between population phenomena and social, economic, and health conditions, they have expanded the kinds of data collected and have brought to bear new data collection techniques and procedures, often borrowed from other fields. These new approaches to demographic data collection are the concern of this essay. We consider three main topics: new developments in sampling procedure...

  8. Automatic outdoor monitoring system for photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Stefancich, Marco [Consiglio Nazionale delle Ricerce, Istituto dei Materiali per l’Elettronica ed il Magnetismo (CNR-IMEM), Parco Area delle Scienze 37/A, 43124 Parma, Italy; Simpson, Lin [National Renewable Energy Laboratory, 15013 Denver West Parkway, Golden, Colorado 80401, USA; Chiesa, Matteo [Masdar Institute of Science and Technology, P.O. Box 54224, Masdar City, Abu Dhabi, United Arab Emirates

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
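
    Maximum power point tracking of the kind this instrument performs is often implemented with a perturb-and-observe loop; the sketch below is a generic, hedged illustration with an invented panel power curve and step size, not the instrument's actual algorithm.

      # Generic perturb-and-observe maximum power point tracking loop.
      def panel_power(voltage):
          # Toy P-V curve for a fictitious panel: power peaks near 17 V.
          return max(0.0, 85.0 - 0.8 * (voltage - 17.0) ** 2)

      def perturb_and_observe(v=12.0, step=0.2, iterations=200):
          p_prev = panel_power(v)
          direction = 1.0
          for _ in range(iterations):
              v += direction * step
              p = panel_power(v)
              if p < p_prev:        # power dropped, so reverse the perturbation
                  direction = -direction
              p_prev = p
          return v, p_prev

      v_mpp, p_mpp = perturb_and_observe()
      print(f"operating point ~{v_mpp:.1f} V, {p_mpp:.1f} W")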

  9. Automatic outdoor monitoring system for photovoltaic panels.

    Science.gov (United States)

    Stefancich, Marco; Simpson, Lin; Chiesa, Matteo

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.

  10. Railway Automatic Ticketing and Gate Monitoring System based on big data analysis

    Institute of Scientific and Technical Information of China (English)

    王成; 史天运

    2015-01-01

    This article proposes the general framework of a Railway Automatic Ticketing and Gate Monitoring System (RATGS). The system consists of four layers: the infrastructure layer, the management layer, the analysis layer and the application layer. The system introduces technologies such as multidimensional data analysis, distributed file system storage and MapReduce computation, Complex Event Processing (CEP) and data mining to implement value-added services based on passenger behavior analysis, such as fault early warning, failure rate analysis, equipment utilization analysis, business optimization analysis, OD hotspot analysis, abnormal passenger identification and equipment usability analysis. These point to a new approach for the future development of RATGS.

  11. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    Science.gov (United States)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.

  12. Automatic computer-aided system of simulating solder joint formation

    Science.gov (United States)

    Zhao, Xiujuan; Wang, Chunqing; Zheng, Guanqun; Wang, Gouzhong; Yang, Shiqin

    1999-08-01

    One critical aspect in electronic packaging is the fatigue/creep-induced failure of solder interconnections, which is found to be highly dependent on the shape of the solder joints. Thus predicting and analyzing the solder joint shape is warranted. In this paper, an automatic computer-aided system is developed to simulate the formation of solder joints and analyze the influence of the different process parameters on the solder joint shape. The developed system is capable of visually designing the process parameters and calculating the solder joint shape automatically without any intervention from the user. The automation achieved enables fast shape estimation under varying process parameters without time-consuming experiments, and the simulation system provides design and manufacturing engineers with an efficient software tool for designing the soldering process in the design environment. Moreover, a program developed from the system can serve as a preprocessor for a subsequent finite element joint analysis program.

  13. Audio watermarking technologies for automatic cue sheet generation systems

    Science.gov (United States)

    Caccia, Giuseppe; Lancini, Rosa C.; Pascarella, Annalisa; Tubaro, Stefano; Vicario, Elena

    2001-08-01

    Usually a watermark is used as a way of hiding information in digital media. The watermarked information may be used to allow copyright protection or user and media identification. In this paper we propose a watermarking scheme for digital audio signals that allows automatic identification of musical pieces transmitted in TV broadcasting programs. In our application the watermark must obviously be imperceptible to the users, should be robust to standard TV and radio editing and have a very low complexity. This last item is essential to allow a software real-time implementation of the insertion and detection of watermarks using only a minimum amount of the computation power of a modern PC. In the proposed method the input audio sequence is subdivided into frames. For each frame a spread spectrum watermark sequence is added to the original data. A two-step filtering procedure is used to generate the watermark from a Pseudo-Noise (PN) sequence. The filters approximate, respectively, the threshold and the frequency masking of the Human Auditory System (HAS). In the paper we first discuss the watermark embedding system and then the detection approach. The results of a large set of subjective tests are also presented to demonstrate the quality and robustness of the proposed approach.
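
    A generic frame-by-frame spread-spectrum embedder of the kind outlined above is sketched below; the frame length, gain and the crude low-pass filter standing in for the HAS-based shaping are placeholders, not the parameters of the proposed scheme.

      # Generic spread-spectrum watermark embedding, frame by frame.
      import numpy as np

      def embed_watermark(audio, key=42, frame_len=1024, gain=0.005):
          rng = np.random.default_rng(key)
          marked = audio.astype(float).copy()
          for start in range(0, len(audio) - frame_len + 1, frame_len):
              pn = rng.choice([-1.0, 1.0], size=frame_len)            # keyed PN sequence
              shaped = np.convolve(pn, np.ones(8) / 8, mode="same")   # stand-in shaping filter
              marked[start:start + frame_len] += gain * shaped
          return marked

      # Detection would correlate each frame of a candidate signal against the
      # same keyed PN sequence and compare the correlation with a threshold.
      audio = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
      watermarked = embed_watermark(audio)
      print(float(np.max(np.abs(watermarked - audio))))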

  14. Interpreting sign components from accelerometer and sEMG data for automatic sign language recognition.

    Science.gov (United States)

    Li, Yun; Chen, Xiang; Zhang, Xu; Wang, Kongqiao; Yang, Jihai

    2011-01-01

    The identification of constituent components of each sign gesture is a practical way of establishing large-vocabulary sign language recognition (SLR) system. Aiming at developing such a system using portable accelerometer (ACC) and surface electromyographic (sEMG) sensors, this work proposes a method for automatic SLR at the component level. The preliminary experimental results demonstrate the effectiveness of the proposed method and the feasibility of interpreting sign components from ACC and sEMG data. Our study improves the performance of SLR based on ACC and sEMG sensors and will promote the realization of a large-vocabulary portable SLR system. PMID:22255059

  15. Robust parameter design for automatically controlled systems and nanostructure synthesis

    Science.gov (United States)

    Dasgupta, Tirthankar

    2007-12-01

    This research focuses on developing comprehensive frameworks for developing robust parameter design methodology for dynamic systems with automatic control and for synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a Multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. The second part of the research deals with development of an experimental design methodology, tailor
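
    The morphology-probability modelling mentioned above can be illustrated, in a purely generic way, with an off-the-shelf multinomial logistic regression; the process variables, morphology classes and data below are synthetic, and this is not the new fitting algorithm proposed in the work.

      # Generic multinomial model linking process variables to morphology probabilities.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      # Hypothetical process variables: temperature (C) and gas flow rate (sccm).
      X = np.column_stack([rng.uniform(700, 950, 200), rng.uniform(20, 100, 200)])
      # Hypothetical morphology labels: 0 = nanowire, 1 = nanobelt, 2 = film.
      y = (X[:, 0] > 850).astype(int) + (X[:, 1] > 60).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X, y)   # multinomial fit over 3 classes
      # Predicted morphology probabilities at a candidate operating point.
      print(model.predict_proba([[900.0, 45.0]]))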

  16. The anemodata 1-IIE. Automatic system for wind data acquisition; El anemodata 1-IIE. Sistema automatico para la adquisicion de datos de viento

    Energy Technology Data Exchange (ETDEWEB)

    Borja, Marco Antonio; Parkman Cuellar, Pablo A. [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1986-12-31

    Wind is an inexhaustible energy source. Studying its behavior in order to develop research projects and apply new technologies connected with its maximum exploitation is one of the activities carried out at the Instituto de Investigaciones Electricas (IIE). As part of these activities, the Anemodata-1-IIE equipment was designed and built for the acquisition of wind velocity and direction data. The Anemodata-1-IIE is a result of the work that the Departamento de Fuentes no Convencionales (Non-Conventional Energy Sources Department) of the Energy Sources Division carries out on the development of electric equipment for anemometry.

  17. Automatic system for driving probes of electron cyclotron

    International Nuclear Information System (INIS)

    The automatic system for driving the six probes used on the electron model of the ring cyclotron is described. This system allows the probes to be moved one by one or simultaneously. Active forcing of the switching-on of the current in the phase windings is used in the driving scheme of the step motors. The probes can be shifted from one radius to another both from the front panel of the driving device (autonomous regime) and from the computer

  18. Semi-automatic Story Creation System in Ubiquitous Sensor Environment

    Science.gov (United States)

    Yoshioka, Shohei; Hirano, Yasushi; Kajita, Shoji; Mase, Kenji; Maekawa, Takuya

    This paper proposes an agent system that semi-automatically creates stories about daily events detected by ubiquitous sensors and posts them to a weblog. The story flow is generated from query-answering interaction between sensor room inhabitants and a symbiotic agent. The agent questions the causal relationships among daily events to create the flow of the story. Preliminary experimental results show that the stories created by our system help users understand daily events.

  19. Data collection system for greenhouse crops based on a micro automated guided vehicle

    Institute of Scientific and Technical Information of China (English)

    王立舒; 丁晓成; 时启凡

    2014-01-01

    orientation unit, was used to automatically navigate and pinpoint the location of samples. An S3C6410 chip was used as the core processor of the control unit in the micro AGV; the S3C6410 is a common 16/32-bit RISC processor developed by Samsung based on the ARM1176JZF-S core, which met the data processing requirements. An ASLONG GA20Y180 micro direct-current motor was used as the drive of the action unit, and motor control was achieved with an L293D-based control module. Optical guided navigation was used for the guiding unit, which achieved reliable navigation through two micro AGV navigation modules. Through two methods, RFID and optical recognition, the orientation unit achieved targeting and accurate positioning of the micro AGV during movement. The VDAS, made up of data acquisition units for images and environment as well as a data processing unit, was used to collect data on the samples' images, environmental humidity and temperature, carbon dioxide concentration and illumination intensity, and then to process and store the collected data. The communication and control system, made up of a vehicle communication unit and control software on a remote control computer, was used to realize long-distance transmission and control. When collecting the samples' data, the control software sent orders and the micro AGV carrying the VDAS began to collect images and environmental parameters according to the planned route. In order to validate the accuracy and stability of the DCS, taking potted soybean as the sample, experiments on image and environmental data acquisition were done. It turned out that the images obtained from the DCS were consistently of good quality, which met the requirements of later image processing. Besides, the errors between the automatically collected environmental data and manual data were around 2%, which met the precision standards of data acquisition. The DCS operated stably during the experiments and no deviation from the planned route occurred. The error of orientation was

  20. Component fragilities. Data collection, analysis and interpretation

    International Nuclear Information System (INIS)

    As part of the component fragility research program sponsored by the US NRC, BNL is involved in establishing seismic fragility levels for various nuclear power plant equipment with emphasis on electrical equipment. To date, BNL has reviewed approximately seventy test reports to collect fragility or high level test data for switchgears, motor control centers and similar electrical cabinets, valve actuators and numerous electrical and control devices, e.g., switches, transmitters, potentiometers, indicators, relays, etc., of various manufacturers and models. BNL has also obtained test data from EPRI/ANCO. Analysis of the collected data reveals that fragility levels can best be described by a group of curves corresponding to various failure modes. The lower bound curve indicates the initiation of malfunctioning or structural damage, whereas the upper bound curve corresponds to overall failure of the equipment based on known failure modes occurring separately or interactively. For some components, the upper and lower bound fragility levels are observed to vary appreciably depending upon the manufacturers and models. For some devices, testing even at the shake table vibration limit does not exhibit any failure. Failure of a relay is observed to be a frequent cause of failure of an electrical panel or a system. An extensive amount of additional fragility or high level test data exists

  1. A Computerized Data-Base System for Land-Use and Land-Cover Data Collected at Ground-Water Sampling Sites in the Pilot National Water Quality Assessment Program

    Science.gov (United States)

    Scott, Jonathon C.

    1989-01-01

    Data-base software has been developed for the management of land-use and land-cover data collected by the U.S. Geological Survey as part of a pilot program to test and refine concepts for a National Water-Quality Assessment Program. This report describes the purpose, use, and design of the land-use and land-cover data-base software. The software provides capabilities for interactive storage and retrieval of land-use and land-cover data collected at ground-water sampling sites. Users of the software can add, update, and delete land-use and land-cover data. The software also provides capabilities to group, print, and summarize the data. The land-use and land-cover data-base software supports multiple data-base systems so that data can be accessed by persons in different offices. Data-base systems are organized in a tiered structure. Each data-base system contains all the data stored in the data-base systems located in the lower tiers of the structure. Data can be readily transmitted from lower tiers to higher tiers of the structure. Therefore, the data-base system at the highest tier of the structure contains land-use and land-cover data for the entire pilot program.

  2. SeaBuoySoft – an On-line Automated Windows based Ocean Wave height Data Acquisition and Analysis System for Coastal Field’s Data Collection

    Directory of Open Access Journals (Sweden)

    P.H.Tarudkar

    2014-12-01

    Full Text Available Measurement of various hydraulic parameters such as wave heights, for both research and practical purposes in coastal fields, is one of the critical and challenging but equally important tasks in ocean engineering for the design and development of hydraulic structures such as sea walls, breakwaters, oil jetties, fisheries harbours and all other such structures, and for ship manoeuvring, embankments and berthing on jetties. This paper elucidates the development of the "SeaBuoySoft online software system for coastal field wave height data collection" for coastal application work. The system can be installed, along with the associated hardware, a digital waverider receiver unit and a waverider buoy, at the shore. The ocean wave height data, transmitted by a waverider buoy installed in the shallow/offshore waters of the sea, are received by the digital waverider receiver unit, which is interfaced to the SeaBuoySoft software. The design and development of the software system has been worked out in-house at the Central Water and Power Research Station, Pune, India. The software has been developed as a Windows-based standalone version and is unique of its kind for the reception of real-time ocean wave height data; it takes care of local storage of the wave height data for further analysis as and when required. The system acquires real-time ocean wave height data round the clock, requiring no operator intervention during the data acquisition process on site.

  3. From Automatic to Adaptive Data Acquisition:- towards scientific sensornets

    OpenAIRE

    Chang, Marcus

    2009-01-01

    Sensornets have been used for ecological monitoring for the past decade, yet the main driving force behind these deployments is still computer scientists. The denser sampling and added modalities offered by sensornets could drive these fields in new directions, but not until the domain scientists become familiar with sensornets and use them as any other instrument in their toolbox. We explore three different directions in which sensornets can become easier to deploy, collect data of higher quality, and o...

  4. Collecting battery data with Open Battery

    OpenAIRE

    Jones, Gareth L.; Harrison, Peter G.

    2012-01-01

    In this paper we present Open Battery, a tool for collecting data on mobile phone battery usage, describe the data we have collected so far and make some observations. We then introduce the fluid queue model which we hope may prove a useful tool in future work to describe mobile phone battery traces.

  5. The Transformed Civil Rights Data Collection (CRDC)

    Science.gov (United States)

    Office for Civil Rights, US Department of Education, 2012

    2012-01-01

    Since 1968, the Civil Rights Data Collection (CRDC) has collected data on key education and civil rights issues in our nation's public schools for use by the Department of Education's Office for Civil Rights (OCR), other Department offices, other federal agencies, and by policymakers and researchers outside of the Department. The CRDC has…

  6. 34 CFR 303.176 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.176 Section 303.176 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... Data collection. Each application must include procedures that meet the requirements in §...

  7. 5 CFR 890.1307 - Data collection.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Data collection. 890.1307 Section 890.1307 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS... Program Demonstration Project § 890.1307 Data collection. Each carrier will compile, maintain, and...

  8. Methods for Using Ground-Water Model Predictions to Guide Hydrogeologic Data Collection, with Applications to the Death Valley Regional Ground-Water Flow System

    Energy Technology Data Exchange (ETDEWEB)

    Claire R. Tiedeman; M.C. Hill; F.A. D'Agnese; C.C. Faunt

    2001-07-31

    Calibrated models of ground-water systems can provide substantial information for guiding data collection. This work considers using such models to guide hydrogeologic data collection for improving model predictions, by identifying model parameters that are most important to the predictions. Identification of these important parameters can help guide collection of field data about parameter values and associated flow-system features that can lead to improved predictions. Methods for identifying parameters important to predictions include prediction scaled sensitivities (PSS), which account for uncertainty on individual parameters as well as prediction sensitivity to parameters, and a new ''value of improved information'' (VOII) method, which includes the effects of parameter correlation in addition to individual parameter uncertainty and prediction sensitivity. The PSS and VOII methods are demonstrated using a model of the Death Valley regional ground-water flow system. The predictions of interest are advective-transport paths originating at sites of past underground nuclear testing. Results show that for two paths evaluated, the most important parameters include a subset of five or six of the 23 defined model parameters. Some of the parameters identified as most important are associated with flow-system attributes that do not lie in the immediate vicinity of the paths. Results also indicate that the PSS and VOII methods can identify different important parameters. Because the methods emphasize somewhat different criteria for parameter importance, it is suggested that parameters identified by both methods be carefully considered in subsequent data collection efforts aimed at improving model predictions.

  9. Waste collection systems for recyclables

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Merrild, Hanna Kristina; Møller, Jacob;

    2010-01-01

    and technical limitations are respected, and what will the environmental and economic consequences be? This was investigated in a case study of a municipal waste management system. Five scenarios with alternative collection systems for recyclables (paper, glass, metal and plastic packaging) were assessed...... by means of a life cycle assessment and an assessment of the municipality's costs. Kerbside collection would provide the highest recycling rate, 31% compared to 25% in the baseline scenario, but bring schemes with drop-off containers would also be a reasonable solution. Collection of recyclables...... at recycling centres was not recommendable because the recycling rate would decrease to 20%. In general, the results showed that enhancing recycling and avoiding incineration was recommendable because the environmental performance was improved in several impact categories. The municipal costs for collection...

  10. 4D measurement system for automatic location of anatomical structures

    Science.gov (United States)

    Witkowski, Marcin; Sitnik, Robert; Kujawińska, Małgorzata; Rapp, Walter; Kowalski, Marcin; Haex, Bart; Mooshake, Sven

    2006-04-01

    Orthopedics and neurosciences are fields of medicine where the analysis of objective movement parameters is extremely important for clinical diagnosis. Moreover, as there are significant differences between static and dynamic parameters, there is a strong need to analyze the anatomical structures under functional conditions. In clinical gait analysis the benefits of kinematical methods are undoubted. In this paper we present a 4D (3D + time) measurement system capable of automatic location of selected anatomical structures by locating and tracing the structures' position and orientation in time. The presented system is designed to help a general practitioner in diagnosing selected lower limb dysfunctions (e.g. knee injuries) and also to determine whether a patient should be directed for further examination (e.g. x-ray or MRI). The measurement system comprises hardware and software components. For the hardware part we adapt the laser triangulation method. In this way we can evaluate functional and dynamic movements in a contact-free, non-invasive way, without the use of potentially harmful radiation. Furthermore, as opposed to marker-based video-tracking systems, no preparation time is required. The software part consists of a data acquisition module, an image processing and point cloud (a set of points described by coordinates (x, y, z)) calculation module, a preliminary processing module, a feature-searching module and an external biomechanical module. The paper briefly presents the modules mentioned above with the focus on the feature-searching module. We also present some measurement and analysis results, including parameter maps, landmark trajectories in time sequence and an animation of a simplified model of the lower limbs.

  11. TASK OF FUNCTIONING ALGORITHM SYNTHESIS OF AUTOMATIC MANAGEMENT SYSTEM OF TRAFFIC MOVEMENT IN METROPOLITAN AREAS

    Directory of Open Access Journals (Sweden)

    E. Getsovich

    2009-01-01

    Full Text Available The concept and structure of an automatic management system for traffic movement in metropolitan areas is proposed by the authors, in which road network parameters and operational information on traffic streams are used as primary data, together with an empirical-stochastic approach to simulating and forecasting the development of the traffic situation.

  12. Automatic Vehicle Speed Reduction System Using Rf Technology

    Directory of Open Access Journals (Sweden)

    Deepa B Chavan

    2014-04-01

    Full Text Available The safety of the vehicle and of its passengers is an important consideration. Many vehicles meet with accidents because no proper safety measures are taken, especially at curves, hairpin bends, humps and obstacles in front of the vehicle. The proposed system helps prevent such accidents by giving a prior indication and by reducing the speed of the vehicle through a reduction of its fuel rate; because the action is applied to the fuel rate, the vehicle is automatically brought under control and accidents are avoided. At curves and hairpin bends the driver has no line of sight, so special transmitters tuned to a frequency of 433 MHz are mounted there and continuously radiate an RF signal over a particular area. As the vehicle comes within this radiation, the receiver in the vehicle is activated. The transmitter used here is a coded transmitter driven by an encoder; the encoder provides 4-bit binary data that are serially fed to the transmitter. The transmitter is of the ASK (amplitude shift keying) type, which emits the RF radiation.

  13. Proceedings of the workshop on reliability data collection

    International Nuclear Information System (INIS)

    The main purpose of the Workshop was to provide a forum for exchanging information and experience on Reliability Data Collection and analysis to support Living Probabilistic Safety Assessments (LPSA). The Workshop is divided into four sessions which titles are: Session 1: Reliability Data - Database Systems (3 papers), Session 2: Reliability Data Collection for PSA (5 papers), Session 3: NPP Data Collection (3 papers), Session 4: Reliability Data Assessment (Part 1: General - 2 papers; Part 2: CCF - 2 papers; Part 3: Reactor Protection Systems / External Event Data - 2 papers; Part 4: Human Errors - 2 papers)

  14. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per;

    2016-01-01

    We present a fully automatic online video system, which is able to detect the behaviour of honeybees at the beehive entrance. Our monitoring system focuses on observing the honeybees as naturally as possible (i.e. without disturbing the honeybees). It is based on the Raspberry Pi that is a low...... demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state in the beehive entrance. Besides, the results on computation time show that the Raspberry Pi is a viable solution for such a real-time video processing system....

  15. Intelligent E-Learning Systems: Automatic Construction of Ontologies

    Science.gov (United States)

    Peso, Jesús del; de Arriaga, Fernando

    2008-05-01

    During the last years a new generation of Intelligent E-Learning Systems (ILS) has emerged with enhanced functionality due, mainly, to influences from Distributed Artificial Intelligence, to the use of cognitive modelling, to the extensive use of the Internet, and to new educational ideas such as student-centered education and Knowledge Management. The automatic construction of ontologies provides a means of automatically updating the knowledge bases of the respective ILS, and of increasing their interoperability and communication with each other by sharing the same ontology. The paper presents a new approach, able to produce ontologies from a small number of documents such as those obtained from the Internet, without the assistance of large corpora, by using simple syntactic rules and some semantic information. The method is independent of the natural language used. The use of a multi-agent system increases the flexibility and capability of the method. Although the method can still be improved, the results obtained so far are promising.

  16. The use of the Global Positioning System for real-time data collecting during ecological aerial surveys in the Kruger National Park, South Africa

    Directory of Open Access Journals (Sweden)

    P.C. Viljoen

    1994-09-01

    Full Text Available The use of the Global Positioning System (GPS for real-time data collecting during ecological aerial surveys (EAS in the Kruger National Park (KNP was investigated as an alternative to post-survey manual data capture. Results obtained during an aerial census of large herbivores and surface water distribution in the northern part of the KNP using an onboard GPS connected to a palmtop computer are discussed. This relatively inexpensive system proved to be highly efficient for real-time data capture while additional information such as ground velocity and time can be recorded for every data point. Measures of distances between a ground marker and fix points measured during a flight (x = 60.0 m are considered to be well within the requirements of the EAS.

  17. A centralised remote data collection system using automated traps for managing and controlling the population of the Mediterranean (Ceratitis capitata) and olive (Dacus oleae) fruit flies

    Science.gov (United States)

    Philimis, Panayiotis; Psimolophitis, Elias; Hadjiyiannis, Stavros; Giusti, Alessandro; Perelló, Josep; Serrat, Albert; Avila, Pedro

    2013-08-01

    The present paper describes the development of a novel monitoring system (the e-FlyWatch system) for managing and controlling the population of two of the world's most destructive fruit pests, namely the olive fruit fly (Bactrocera oleae, Rossi - formerly Dacus oleae) and the Mediterranean fruit fly (Ceratitis capitata, also called medfly). The novel monitoring system consists of a) novel automated traps with optical and motion detection modules for capturing the flies, b) local stations including a GSM/GPRS module, sensors, flash memory, battery, antenna, etc., and c) a central station that collects, stores and publishes the results (i.e. insect population in each field, sensor data, possible error/alarm data) via web-based management software. The centralised data collection system also provides analysis and prediction models, end-user warning modules and historical analysis of infested areas. The e-FlyWatch system enables SME producers in the fruit, vegetable and olive sectors to improve their production and to reduce the amount of insecticides/pesticides used, and consequently the labour cost of spraying activities as well as the labour cost of trap inspection.

  18. Learning Semantic Concepts from Noisy Media Collection for Automatic Image Annotation

    Institute of Scientific and Technical Information of China (English)

    TIAN Feng; SHEN Xukun

    2015-01-01

    Along with the explosive growth of images, automatic image annotation has attracted great interest from various research communities. However, despite the great progress achieved in the past two decades, automatic annotation is still an important open problem in computer vision, and can hardly achieve satisfactory performance in real-world environments. In this paper, we address the problem of annotation when noise is interfering with the dataset. A semantic neighborhood learning model on noisy media collections is proposed. Missing labels are replenished, and a semantically balanced neighborhood is constructed. The model allows the integration of multiple-label metric learning and local nonnegative sparse coding. We construct a semantically consistent neighborhood for each sample, so that corresponding neighbors have higher global similarity, partial correlation and conceptual similarity along with semantic balance. Meanwhile, an iterative denoising method is also proposed. The proposed method yields a marked improvement as compared to the current state of the art.

  19. EOS Data Dumper: an Automatic Downloading and Re-Distributing System for Free EOS Data

    Institute of Scientific and Technical Information of China (English)

    南卓铜; 王亮绪; 李新

    2007-01-01

    To make more effective use of existing data resources and to avoid duplicate investment in research facilities, data sharing is receiving increasing attention. NASA's Earth Observing System (EOS) provides a large amount of free data, including MODIS products. EOS Data Dumper (EDD) programmatically emulates the normal download procedure of the EOS data portal and applies advanced web-page text-capture techniques to automatically download, on a fixed schedule, all free EOS data for a study area. The data are then re-published to the Internet through the free DIAL system, enabling complex spatio-temporal data queries. This paper presents, from a technical point of view, the project background and significance of EDD, its implementation scheme and the key technologies involved.
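
    The scheduled-download workflow described above can be illustrated with a minimal sketch. The portal URL, query parameters and response format below are hypothetical placeholders, not the actual EOS data portal interface or the EDD implementation.

      import datetime
      import pathlib
      import requests  # third-party HTTP client

      # Hypothetical portal endpoint and bounding box of the study area.
      PORTAL_URL = "https://example.org/eos-portal/search"
      STUDY_AREA = {"west": 73.0, "east": 96.0, "south": 26.0, "north": 40.0}
      OUT_DIR = pathlib.Path("eos_downloads")

      def fetch_daily_granules(day: datetime.date) -> None:
          """Query the portal for one day of granules and save each file locally."""
          OUT_DIR.mkdir(exist_ok=True)
          params = {"date": day.isoformat(), **STUDY_AREA}
          listing = requests.get(PORTAL_URL, params=params, timeout=60)
          listing.raise_for_status()
          # Assume the portal answers with a JSON list of direct download links.
          for url in listing.json().get("granule_urls", []):
              target = OUT_DIR / url.rsplit("/", 1)[-1]
              if target.exists():
                  continue  # already fetched on an earlier run
              with requests.get(url, stream=True, timeout=300) as resp:
                  resp.raise_for_status()
                  with open(target, "wb") as fh:
                      for chunk in resp.iter_content(chunk_size=1 << 20):
                          fh.write(chunk)

      if __name__ == "__main__":
          # A cron job or Windows scheduled task would call this once per day.
          fetch_daily_granules(datetime.date.today() - datetime.timedelta(days=1))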

  20. ACRF Data Collection and Processing Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, M; Egan, D

    2004-12-01

    We present a description of the data flow from measurement to long-term archive. We also discuss data communications infrastructure. The data handling processes presented include collection, transfer, ingest, quality control, creation of Value-Added Products (VAP), and data archiving.

  1. Automatic Thermal Control System with Temperature Difference or Derivation Feedback

    Directory of Open Access Journals (Sweden)

    Darina Matiskova

    2016-02-01

    Full Text Available Automatic thermal control systems are non-linear systems with thermal inertias and time delay. The controller is also non-linear because its information and power signals are limited. The application of methods that are available for non-linear systems, together with computer simulation and mathematical modelling, creates a possibility to acquire important information about the system under study. This paper provides a new look at the heated system model and also designs the structure of the thermal system with temperature derivation feedback. The designed system was simulated using dedicated software written in Turbo Pascal. Time responses of this system are compared to the responses of a conventional thermal system. The thermal system with temperature derivation feedback provides better transients, better quality of regulation and better dynamical properties.

  2. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build the experimental curves.

  3. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    Science.gov (United States)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting the analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze the differences in temporality and quality of those resources, and the transformation of the analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  4. Data Collection Satellite Application in Precision Agriculture

    Science.gov (United States)

    Durào, O.

    2002-01-01

    Agricultural Instrumentation Research Center, Brazilian Agricultural Research Corporation; Space Programs. Brazil launched in 1993 its first satellite partially built and entirely designed, integrated, tested and operated in the country. It was SCD-1, a small satellite (115 kg; an octagonal prism 80 cm high with an external diameter of 100 cm) with a payload transponder that receives data from ground platforms spread all over the country (including its sea shore). These data are then retransmitted to a receiving station at every satellite pass. Data collected and received are processed at the Data Collection Mission Center for distribution via the internet at most 30 min after the satellite pass. The ground platforms are called PCDs and differ in the parameters measured according to their purpose and location. Thus, they are able to measure temperature, rain level, wind direction, solar radiation, carbon monoxide and many other quantities, as well as their own location. SCD-1 had a nominal design life of one year, but is still functioning. It is a LEO satellite with an inclination of 25°. In 1998, the country launched SCD-2, with the same purpose, but in phase with SCD-1. Other differences were a higher share of Brazilian-made components and an active attitude control subsystem for the spin rate provided by magnetic torque coils (in accordance with a development strategy previously planned). In 1999 the country launched, in cooperation with China, a remote sensing satellite (mass of 1.4 t) called CBERS-1. This satellite is sun-synchronous (98° inclination) and also carries a transponder for data collection/transmission as a secondary payload. Thus, the country now has three satellites with data collection/transmission capabilities, two in low-inclination phased orbits and one in polar orbit, providing good geographical and temporal coverage not only of its territory but also of other regions of the world. At first there were not too many PCD

  5. Design of Management Information System for Metro Automatic Fare Collection

    Institute of Scientific and Technical Information of China (English)

    徐炜炜; 徐骏善; 叶飞

    2012-01-01

    This paper introduces the structure and key features of the AFC (automatic fare collection) station management information system (AS-MIS), which is designed in a three-tier structure of presentation, business logic and data transmission layers, and provides functions such as ticketing management, ticket/cash box inventory management, operation management, system maintenance and communication services. In the software development, CORBA (Common Object Request Broker Architecture) middleware technology is adopted, giving the system good performance across software and hardware platforms and facilitating the developers' division of labour and cooperation; database technology is also adopted and a general entity-relationship (E-R) diagram of the system is given.

  6. Collective flow in small systems

    International Nuclear Information System (INIS)

    The large density of matter in the interaction region of the proton–nucleus or deuteron–nucleus collisions enables the collective expansion of the fireball. Predictions of a hydrodynamic model for the asymmetric transverse flow are presented and compared to experimental data

  7. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published until now. The method developed at Aalborg University uses the existing topographic database and or...

  8. Automatic Building Extraction From LIDAR Data Covering Complex Urban Scenes

    Science.gov (United States)

    Awrangjeb, M.; Lu, G.; Fraser, C.

    2014-08-01

    This paper presents a new method for segmentation of LIDAR point cloud data for automatic building extraction. Using the ground height from a DEM (Digital Elevation Model), the non-ground points (mainly buildings and trees) are separated from the ground points. Points on walls are removed from the set of non-ground points by applying the following two approaches: If a plane fitted at a point and its neighbourhood is perpendicular to a fictitious horizontal plane, then this point is designated as a wall point. When LIDAR points are projected on a dense grid, points within a narrow area close to an imaginary vertical line on the wall should fall into the same grid cell. If three or more points fall into the same cell, then the intermediate points are removed as wall points. The remaining non-ground points are then divided into clusters based on height and local neighbourhood. One or more clusters are initialised based on the maximum height of the points and then each cluster is extended by applying height and neighbourhood constraints. Planar roof segments are extracted from each cluster of points following a region-growing technique. Planes are initialised using coplanar points as seed points and then grown using plane compatibility tests. If the estimated height of a point is similar to its LIDAR generated height, or if its normal distance to a plane is within a predefined limit, then the point is added to the plane. Once all the planar segments are extracted, the common points between the neighbouring planes are assigned to the appropriate planes based on the plane intersection line, locality and the angle between the normal at a common point and the corresponding plane. A rule-based procedure is applied to remove tree planes which are small in size and randomly oriented. The neighbouring planes are then merged to obtain individual building boundaries, which are regularised based on long line segments. Experimental results on ISPRS benchmark data sets show that the
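
    The grid-based wall-point removal step described above lends itself to a compact sketch; the cell size and per-cell point threshold below are illustrative assumptions, not the parameters used by the authors.

      import numpy as np

      def remove_wall_points(points: np.ndarray, cell: float = 0.25) -> np.ndarray:
          """Drop intermediate points that stack vertically within one grid cell.

          points : (N, 3) array of x, y, z coordinates of non-ground LIDAR returns.
          Cells that collect three or more points are assumed to lie on a near-vertical
          surface; only the highest and lowest return in such a cell are kept.
          """
          ij = np.floor(points[:, :2] / cell).astype(np.int64)   # 2-D grid cell index
          keys = ij[:, 0] * 1_000_003 + ij[:, 1]                 # single integer id per cell
          keep = np.ones(len(points), dtype=bool)
          for key in np.unique(keys):
              idx = np.where(keys == key)[0]
              if len(idx) >= 3:
                  z = points[idx, 2]
                  keep[idx] = False                    # discard intermediate (wall) points
                  keep[idx[np.argmax(z)]] = True       # keep the top return
                  keep[idx[np.argmin(z)]] = True       # keep the bottom return
          return points[keep]

      # Example: filter a small synthetic cloud with one vertical stack of points.
      cloud = np.array([[0.1, 0.1, 2.0], [0.1, 0.12, 5.0], [0.11, 0.1, 8.0], [3.0, 3.0, 6.0]])
      print(remove_wall_points(cloud).shape)           # the middle wall point is removed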

  9. Application of MintDrive Automatic Precision Positioning System

    Institute of Scientific and Technical Information of China (English)

    Wu Fengming; Yang Yonggang; Zhao Xiaolong; Zhang Zhiyuan

    2004-01-01

    It is very important to locate batteries accurately and quickly during automatic battery production. Unstable or inaccurate location will negatively influence the batteries' consistency, quality and finished-product rate. A traditional way is to use a sensor to detect and locate batteries directly, but because of the detection tolerance, setting them exactly on a fixed point is almost impossible. This problem can be completely solved by applying the MintDrive automatic precision servo locating system. First, the operating software WorkBench is used to configure the servo locating driver for optimal control. Then, based on the requirements of the actual location, the locating action is programmed and tested with programming software, and finally all the locating information is uploaded to a MicroLogix 1200 PLC; the PLC controls the operation of each station, determining when to locate, where the location is and how to eliminate bad parts. Because this intelligent servo locating system has the advantages of powerful functionality, simple operation, high control and locating accuracy and easy maintenance, it is very suitable for adoption on automatic battery-making lines. It is regarded as a very advanced method of control for reducing waste material due to inaccurate location and difficult adjustment.

  10. Automatic graphene transfer system for improved material quality and efficiency

    Science.gov (United States)

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-02-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to that manually transferred, is demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, with a 30% decrease in the unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications.

  11. Collective dynamics of multicellular systems

    Indian Academy of Sciences (India)

    R Maithreye; C Suguna; Somdatta Sinha

    2011-11-01

    We have studied the collective behaviour of a one-dimensional ring of cells for conditions when the individual uncoupled cells show stable, bistable and oscillatory dynamics. We show that the global dynamics of this model multicellular system depends on the system size, coupling strength and the intrinsic dynamics of the cells. The intrinsic variability in dynamics of the constituent cells are suppressed to stable dynamics, or modified to intermittency under different conditions. This simple model study reveals that cell–cell communication, system size and intrinsic cellular dynamics can lead to evolution of collective dynamics in structured multicellular biological systems that is significantly different from its constituent single-cell behaviour.

  12. Laplace domain automatic data assimilation of contaminant transport using a Wireless Sensor Network

    Science.gov (United States)

    Barnhart, K.; Illangasekare, T. H.

    2011-12-01

    Emerging in situ sensors and distributed network technologies have the potential to monitor dynamic hydrological and environmental processes more effectively than traditional monitoring and data acquisition techniques by sampling at greater spatial and temporal resolutions. In particular, Wireless Sensor Networks, the combination of low-power telemetry and energy-harvesting with miniaturized sensors, could play a large role in monitoring the environment on nature's time scale. Since sensor networks supply data with little or no delay, applications exist where automatic or real-time assimilation of this data would be useful, for example during smart remediation procedures where tracking of the plume response will reinforce real-time decisions. As a foray into this new data context, we consider the estimation of hydraulic conductivity when incorporating subsurface plume concentration data. Current practice optimizes the model in the time domain, which is often slow and overly sensitive to data anomalies. Instead, we perform model inversion in Laplace space and are able to do so because data gathered using new technologies can be sampled densely in time. An intermediate-scale synthetic aquifer is used to illustrate the developed technique. Data collection and model (re-)optimization are automatic. Electric conductivity values of passing sodium bromide plumes are sent through a wireless sensor network, stored in a database, scrubbed and passed to a modeling server which transforms the data and assimilates it into a Laplace domain model. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000
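
    Because the sensor network delivers densely sampled breakthrough curves, the Laplace transform of the concentration record can be approximated directly by numerical quadrature before model optimization. A minimal sketch, assuming evenly sampled data and illustrative values of the transform variable s (not the values used in the study):

      import numpy as np

      def laplace_transform(t: np.ndarray, c: np.ndarray, s_values: np.ndarray) -> np.ndarray:
          """Approximate C(s) = integral of c(t) * exp(-s * t) dt with the trapezoidal rule."""
          dt = np.diff(t)
          out = []
          for s in s_values:
              f = c * np.exp(-s * t)
              out.append(np.sum(0.5 * (f[:-1] + f[1:]) * dt))
          return np.array(out)

      # Example: transform a synthetic plume breakthrough curve at a few s values.
      t = np.linspace(0.0, 3600.0, 721)                 # one hour of data, 5 s sampling
      c = np.exp(-((t - 1200.0) / 300.0) ** 2)          # Gaussian-like concentration pulse
      s = np.array([1e-4, 5e-4, 1e-3, 5e-3])            # assumed transform variables
      print(laplace_transform(t, c, s))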

  13. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    Directory of Open Access Journals (Sweden)

    Riccardo Fortina

    2010-01-01

    Full Text Available Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. In both groups feed was provided ad libitum. Conventional pigs received it twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for feed distribution, pig weighing, and monitoring by an analog camera. Thus the self-feeders received feed ad libitum individually from the automatic system, divided into small quantities at meal times. During the experiment the analog camera was used over 24 hours each day to collect pictures of the pigs in order to investigate their behaviours. For each picture the day and hour, the number of visible pigs and their behaviours were recorded, and a statistical analysis of the data, expressed as hourly frequencies of behavioural elements, was performed. Moreover, to highlight "active" and "passive" behaviours between the groups, two categories, "Move" and "Rest", were created by grouping some behavioural elements. With regard to performance, conventional pigs reached a higher total weight gain (56.1±2.42 kg vs 46.7±2.42 kg; P=0.0117), but the feed conversion index (FCI) of both groups was similar. The self-feeders consumed less feed than the conventional animals. The feeding system seems to influence behaviour. The percentage of time spent in eating activity differs (P<0.0001) between the self-fed (median 24.6%) and conventional pigs (median 10.9%). The resulting more regular eating pattern of the self-feeders influenced the distribution of daily activities. The behavioural category Rest (median: self-feeders 55.0% vs 71.4% conventional pigs) was dominant, with conventional pigs becoming more restless, particularly at meal times. This type of feeding competition and aggressive behaviour did not happen in the self-feeders due to the feed distribution system. The self-feeder results showed that pigs eat at the automatic station both day and night. The animals perform on

  14. Automatic diagnosis of pathological myopia from heterogeneous biomedical data.

    Directory of Open Access Journals (Sweden)

    Zhuo Zhang

    Full Text Available Pathological myopia is one of the leading causes of blindness worldwide. The condition is particularly prevalent in Asia. Unlike myopia, pathological myopia is accompanied by degenerative changes in the retina, which if left untreated can lead to irrecoverable vision loss. The accurate diagnosis of pathological myopia will enable timely intervention and facilitate better disease management to slow down the progression of the disease. Current methods of assessment typically consider only one type of data, such as that from retinal imaging. However, different kinds of data, including genetic, demographic and clinical information, may contain different and independent information, which can provide different perspectives on the visually observable, genetic or environmental mechanisms of the disease. The combination of these potentially complementary pieces of information can enhance the understanding of the disease, providing a holistic appreciation of the multiple risk factors as well as improving the detection outcomes. In this study, we propose a computer-aided diagnosis framework for Pathological Myopia diagnosis through Biomedical and Image Informatics (PM-BMII). Through the use of multiple kernel learning (MKL) methods, PM-BMII intelligently fuses heterogeneous biomedical information to improve the accuracy of disease diagnosis. Data from 2,258 subjects of a population-based study, in which demographic and clinical information, retinal fundus imaging data and genotyping data were collected, are used to evaluate the proposed framework. The experimental results show that PM-BMII achieves an AUC of 0.888, outperforming the detection results from the use of demographic and clinical information 0.607 (increase 46.3%, p<0.005), genotyping data 0.774 (increase 14.7%, p<0.005) or imaging data 0.852 (increase 4.2%, p=0.19) alone. The accuracy of the results obtained demonstrates the feasibility of using heterogeneous data for improved disease
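
    One common way to realize the kernel-fusion step is to build one base kernel per data source, combine them into a weighted sum, and train a standard kernel classifier on the combined Gram matrix. The sketch below uses equal weights and random placeholder features; it is only illustrative and is not the PM-BMII implementation (a true MKL method would learn the weights).

      import numpy as np
      from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
      from sklearn.svm import SVC

      # Placeholder feature matrices for the same n subjects from three sources:
      # retinal imaging, genotyping and clinical/demographic data, plus labels y.
      rng = np.random.default_rng(0)
      n = 200
      X_img = rng.normal(size=(n, 50))
      X_gen = rng.normal(size=(n, 30))
      X_cli = rng.normal(size=(n, 10))
      y = rng.integers(0, 2, size=n)

      # One base kernel per heterogeneous data source.
      kernels = [rbf_kernel(X_img), rbf_kernel(X_gen), linear_kernel(X_cli)]
      weights = np.full(len(kernels), 1.0 / len(kernels))   # assumed equal weights
      K = sum(w * k for w, k in zip(weights, kernels))      # fused Gram matrix

      clf = SVC(kernel="precomputed").fit(K, y)
      print(clf.score(K, y))                                # training accuracy of the fused model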

  15. Observer Manual and Current Data Collection Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Observer Program web page that lists the observer field manual and all current data collection forms that observers are required to take out to sea.

  16. Automatic analysis of eye tracker data from a driving simulator

    OpenAIRE

    Bergstrand, Martin

    2008-01-01

    The movement of a person's eyes is an interesting factor to study in different research areas where attention is important, for example driving. In 2004 the Swedish National Road and Transport Research Institute (VTI) introduced Simulator III – their third generation of driving simulators. Inside Simulator III a camera-based eye tracking system is installed that records the eye movements of the driver. To be useful, the raw data from the eye tracking system needs to be analyzed and concentrate...

  17. Proportional directional valve based automatic steering system for tractors

    Institute of Scientific and Technical Information of China (English)

    Jin-yi LIU; Jing-quan TAN; En-rong MAO; Zheng-he SONG; Zhong-xiang ZHU‡

    2016-01-01

    Most automatic steering systems for large tractors are designed with hydraulic systems that run on either constant flow or constant pressure. Such designs are limited in adaptability and applicability. Moreover, their control valves can unload in the neutral position and eventually lead to serious hydraulic leakage over long operation periods. In response to the problems noted above, a multifunctional automatic hydraulic steering circuit is presented. The system design is composed of a 5-way-3-position proportional directional valve, two pilot-controlled check valves, a pressure-compensated directional valve, a pressure-compensated flow regulator valve, a load shuttle valve, and a check valve, among other components. It is adaptable to most open-center systems with constant flow supply and closed-center systems with load feedback. The design maintains the lowest pressure under load feedback and stays at the neutral position during unloading, thus meeting the requirements for steering. The steering controller is based on proportional-integral-derivative (PID) control running on a 51-microcontroller-unit master control chip. An experimental platform is developed to establish the basic characteristics of the system subject to stepwise inputs and sinusoidal tracking. Test results show that the system design demonstrates excellent control accuracy, fast response, and negligible leakage during long operation periods.
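
    The PID loop running on the 51-series master control chip can be summarized in a few lines; the gains, sample time and saturation limit below are assumptions for illustration, not the values tuned by the authors.

      class PID:
          """Discrete PID controller with output saturation and a simple anti-windup clamp."""

          def __init__(self, kp: float, ki: float, kd: float, dt: float, out_limit: float):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.dt, self.out_limit = dt, out_limit
              self.integral = 0.0
              self.prev_error = 0.0

          def update(self, setpoint: float, measured: float) -> float:
              error = setpoint - measured
              self.integral += error * self.dt
              derivative = (error - self.prev_error) / self.dt
              self.prev_error = error
              out = self.kp * error + self.ki * self.integral + self.kd * derivative
              if abs(out) > self.out_limit:          # saturate the valve drive signal
                  self.integral -= error * self.dt   # anti-windup: undo the last accumulation
                  out = max(-self.out_limit, min(self.out_limit, out))
              return out

      # Example: one 10 ms control step towards a 15 degree steering-angle setpoint.
      pid = PID(kp=4.0, ki=0.8, kd=0.05, dt=0.01, out_limit=10.0)
      print(pid.update(setpoint=15.0, measured=12.3))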

  18. Design of automatic thruster assisted mooring systems for ships

    Directory of Open Access Journals (Sweden)

    Jan P. Strand

    1998-04-01

    Full Text Available This paper addresses the mathematical modelling and controller design of an automatic thruster-assisted position mooring system. Such control systems are applied to anchored floating production, offloading and storage vessels and semi-subs. The controller is designed using model-based control with an LQG feedback controller in conjunction with a Kalman filter. In addition to the environmental loads, the controller design accounts for the mooring forces acting on the vessel. This is reflected in the model structure and in the inclusion of new functionality.

  19. Automatic System for Serving and Deploying Products into Advertising Space

    OpenAIRE

    Lepen, Nejc

    2014-01-01

    The purpose of the thesis is to present the problems of deploying and serving products into advertising space, encountered daily by online marketers, planners and leaseholders of advertising spaces. The aim of the thesis is to solve the problem in question with the help of a novel web application. Therefore, we have designed an automatic system, which consists of three key components: an online store, a surveillance system and websites accommodating advertising space. In the course of this thesis, we h...

  20. Collection of arc welding process data

    OpenAIRE

    K. Luksa; Z. Rymarski

    2006-01-01

    Purpose: The aim of the research was to examine the possibility of detecting welding imperfections by recording the instant values of welding parameters. The microprocessor-controlled system for real-time collection and display of welding parameters was designed, implemented and tested. Design/methodology/approach: The system records up to 4 digital or analog signals collected from the welding process and displays their traces on the LCD display. To disturb the welding process artificial disturbances...

  1. Cloud-Based Smart Health Monitoring System for Automatic Cardiovascular and Fall Risk Assessment in Hypertensive Patients.

    Science.gov (United States)

    Melillo, P; Orrico, A; Scala, P; Crispino, F; Pecchia, L

    2015-10-01

    The aim of this paper is to describe the design and the preliminary validation of a platform developed to collect and automatically analyze biomedical signals for risk assessment of vascular events and falls in hypertensive patients. This m-health platform, based on cloud computing, was designed to be flexible, extensible, and transparent, and to provide proactive remote monitoring via data-mining functionalities. A retrospective study was conducted to train and test the platform. The developed system was able to predict a future vascular event within the next 12 months with an accuracy rate of 84 % and to identify fallers with an accuracy rate of 72 %. In an ongoing prospective trial, almost all the recruited patients accepted favorably the system with a limited rate of inadherences causing data losses (risk assessment of vascular events and falls. PMID:26276015

  2. Automatic Measurement of Radioactive Deposition: a New On-Line System in Slovenian Radiation Monitoring Network

    International Nuclear Information System (INIS)

    Full text: The automatic radiation-monitoring network in Slovenia consists of four different on-line systems: external gamma radiation network, aerosol measuring stations, a continuous radon monitor, and a radioactive deposition measuring system (RDMS). The latest system became operational in October 1999. Since June 2000, the results have been continuously presented on the World Wide Web. The system is designed for on-line detection and evaluation of possible radioactive contamination with artificial radionuclides, such as the fission products 131I, 137Cs and others. Once surface-specific activities of individual radionuclides are determined, it is possible to promptly make dose projections for the population due to ingestion of food and drinking water. The measuring system and data analysis method are the results of SNSA's own development. The RDMS is equipped with a 3'x3' NaI(Tl) scintillation detector, which is mounted in a thermostatic housing. The system collects data and performs a gamma-spectroscopic analysis every 6 hours; the measurement time interval can be easily changed. Special software enables on-line evaluation, display and storage of the results of surface ground contamination. Natural short-lived radon decay products (the gamma emitters 214Pb and 214Bi) washed out from the atmosphere by precipitation are recorded occasionally; the decay of these gamma-emitting radionuclides contributes considerably to the natural background radiation levels. Surface-specific activities of the deposited radon daughters are in accordance with the increase in dose rate measured with gamma probes. The RDMS has proved to be a reliable and very sensitive system for measuring contamination with gamma emitters deposited on the ground. In case of a nuclear or radiological accident it gives valuable information for proper decision making. (author)

  3. 30 CFR 75.1103-6 - Automatic fire sensors; actuation of fire suppression systems.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensors; actuation of fire... Protection § 75.1103-6 Automatic fire sensors; actuation of fire suppression systems. Point-type heat sensors or automatic fire sensor and warning device systems may be used to actuate deluge-type water...

  4. 30 CFR 75.1103-3 - Automatic fire sensor and warning device systems; minimum requirements; general.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensor and warning device...-UNDERGROUND COAL MINES Fire Protection § 75.1103-3 Automatic fire sensor and warning device systems; minimum requirements; general. Automatic fire sensor and warning device systems installed in belt haulageways...

  5. "Sorpvej" for Sorption Curves - A Windows Program for collecting Weighing Data and determining Equilibrium State

    DEFF Research Database (Denmark)

    Strømdahl, Kenneth; Hansen, Kurt Kielsgaard

    1998-01-01

    The Windows program SORPVEJ collects weighing data from the balance and plots points on the sorption curve. The features of the program are: All data are transmitted automatically from the balance to the computer. Each point on the curve (upper right inset) is an original measurement and every time...

  6. Automatic Emboli Detection System for the Artificial Heart

    Science.gov (United States)

    Steifer, T.; Lewandowski, M.; Karwat, P.; Gawlikowski, M.

    In spite of the progress in material engineering and ventricular assist device construction, thromboembolism remains the most crucial problem in mechanical heart support systems. Therefore, the ability to monitor the patient's blood for clot formation should be considered an important factor in the development of heart support systems. The well-known methods for automatic embolus detection are based on monitoring of the ultrasound Doppler signal. A working system utilizing ultrasound Doppler is being developed for the purpose of flow estimation and emboli detection in the clinical artificial heart ReligaHeart EXT. The system will be based on the existing dual-channel multi-gate Doppler device with RF digital processing. A specially developed clamp-on cannula probe, equipped with 2 - 4 MHz piezoceramic transducers, enables easy system setup. We present the issues related to the development of automatic emboli detection via Doppler measurements. We consider several algorithms for flow estimation and emboli detection. We discuss their efficiency and confront them with the requirements of our experimental setup. Theoretical considerations are then met with preliminary experimental findings from a) flow studies with blood-mimicking fluid and b) in-vitro flow studies with animal blood. Finally, we discuss some more methodological issues - we consider several possible approaches to the problem of verification of the accuracy of the detection system.
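
    A common baseline for automatic embolus detection is a short-time energy detector: an embolic signature appears as a brief transient whose energy exceeds the running background by some margin. The window length and threshold below are assumptions for illustration; this is not the detection algorithm selected for the ReligaHeart system.

      import numpy as np

      def detect_embolic_events(doppler: np.ndarray, fs: float,
                                win_ms: float = 10.0, threshold_db: float = 9.0) -> list:
          """Return start samples of windows whose energy exceeds the median window
          energy by at least threshold_db (a crude high-intensity transient detector)."""
          win = max(1, int(fs * win_ms / 1000.0))
          n_win = len(doppler) // win
          frames = doppler[: n_win * win].reshape(n_win, win)
          energy = np.sum(frames ** 2, axis=1)
          background = np.median(energy) + 1e-12
          hits = np.where(10.0 * np.log10(energy / background) > threshold_db)[0]
          return [int(i * win) for i in hits]

      # Example: a synthetic 5 kHz Doppler envelope with one embolus-like burst.
      fs = 5000.0
      t = np.arange(0, 2.0, 1.0 / fs)
      sig = 0.05 * np.random.randn(t.size)
      sig[6000:6100] += np.sin(2 * np.pi * 900.0 * t[6000:6100])
      print(detect_embolic_events(sig, fs))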

  7. Research on HJ-1A/B satellite data automatic geometric precision correction design

    Institute of Scientific and Technical Information of China (English)

    Xiong Wencheng; Shen Wenming; Wang Qiao; Shi Yuanli; Xiao Rulin; Fu Zhuo

    2014-01-01

    Developed independently by China, the HJ-1A/B satellites have operated well on-orbit for five years and acquired a large number of high-quality observation data. The realization of geometric precision correction of the observation data is of great significance for macroscopic and dynamic ecological environment monitoring. The paper analyzed the parameter characteristics of the HJ-1 satellites and the geometric features of HJ-1 satellite level 2 data (systematically geo-corrected data). Based on this, the overall HJ-1 multi-sensor geometric correction flow and a charge-coupled device (CCD) automatic geometric precision correction method were designed. Actual operating data showed that the method achieves good results for automatic geometric precision correction of HJ-1 satellite data: automatic HJ-1 CCD image geometric precision correction accuracy within two pixels, and automatic matching accuracy between images from the same satellite of less than one pixel.

  8. Automatic Radiation Monitoring in Slovenia

    International Nuclear Information System (INIS)

    Full text: The automatic radiation monitoring system in Slovenia started in the early nineties and now comprises measurements of: 1. External gamma radiation: For the time being there are forty-three probes with GM tubes integrated into a common automatic network, operated at the SNSA. The probes measure dose rate in 30-minute intervals. 2. Aerosol radioactivity: Three automatic aerosol stations measure the concentration of artificial alpha and beta activity in the air, gamma-emitting radionuclides, radioactive iodine-131 in the air (in all chemical forms), and natural radon and thoron progeny. 3. Radon progeny concentration: Radon progeny concentration is measured hourly and the results are displayed as equilibrium equivalent concentrations (EEC). 4. Radioactive deposition measurements: As a support to gamma dose rate measurements, the SNSA developed and installed an automatic measuring station for surface contamination equipped with a gamma spectrometry system (with a 3'x3' NaI(Tl) detector). All data are transferred through different communication pathways to the SNSA. They are collected in 30-minute intervals. Within these intervals the central computer analyses and processes the collected data, and creates different reports. Every month a QA/QC analysis of the data is performed, showing the statistics of acquisition errors and the availability of measuring results. All results are promptly available on our web pages. The data are checked and sent daily to the EURDEP system at Ispra (Italy) and also to the Austrian, Croatian and Hungarian authorities. (author)

  9. Automatically Building Diagnostic Bayesian Networks from On-line Data Sources and the SMILE Web-based Interface

    OpenAIRE

    Tungkasthan, Anucha; Jongsawat, Nipat; Poompuang, Pittaya; Intarasema, Sarayut; Premchaiswadi, Wichian

    2010-01-01

    This paper presented a practical framework for automating the building of diagnostic BN models from data sources obtained from the WWW and demonstrates the use of a SMILE web-based interface to represent them. The framework consists of the following components: RSS agent, transformation/conversion tool, core reasoning engine, and the SMILE web-based interface. The RSS agent automatically collects and reads the provided RSS feeds according to the agent's predefined URLs. A transformation/conve...

  10. Block Grants: Federal Data Collection Provisions.

    Science.gov (United States)

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This fact sheet compares statutory data collection and reporting provisions of the federal education block grant (chapter 2 of the Education Consolidation and Improvement Act of 1981) with the nine other block grant programs funded in fiscal year 1986; data on statutory administrative cost limits are also provided. Each grant's legislation was…

  11. Implementation of an electronic fingerprint-linked data collection system: a feasibility and acceptability study among Zambian female sex workers

    OpenAIRE

    Wall, Kristin M; Kilembe, William; Inambao, Mubiana; Chen, Yi No; Mchoongo, Mwaka; Kimaru, Linda; Hammond, Yuna Tiffany; Sharkey, Tyronza; Malama, Kalonde; Fulton, T. Roice; Tran, Alex; Halumamba, Hanzunga; Anderson, Sarah; Kishore, Nishant; Sarwar, Shawn

    2015-01-01

    Background Patient identification within and between health services is an operational challenge in many resource-limited settings. When following HIV risk groups for service provision and in the context of vaccine trials, patient misidentification can harm patient care and bias trial outcomes. Electronic fingerprinting has been proposed to identify patients over time and link patient data between health services. The objective of this study was to determine 1) the feasibility of implementing...

  12. Automatic Feature Detection, Description and Matching from Mobile Laser Scanning Data and Aerial Imagery

    Science.gov (United States)

    Hussnain, Zille; Oude Elberink, Sander; Vosselman, George

    2016-06-01

    In mobile laser scanning systems, the platform's position is measured by GNSS and IMU, which is often not reliable in urban areas. Consequently, the derived Mobile Laser Scanning Point Cloud (MLSPC) lacks the expected positioning reliability and accuracy. Many of the current solutions are either semi-automatic or unable to achieve pixel-level accuracy. We propose an automatic feature extraction method which involves utilizing corresponding aerial images as a reference data set. The proposed method comprises three steps: image feature detection, description and matching between corresponding patches of nadir aerial and MLSPC ortho images. In the data pre-processing step the MLSPC is patch-wise cropped and converted to ortho images. Furthermore, each aerial image patch covering the area of the corresponding MLSPC patch is also cropped from the aerial image. For feature detection, we implemented an adaptive variant of the Harris operator to automatically detect corner feature points on the vertices of road markings. In the feature description phase, we used the LATCH binary descriptor, which is robust to data from different sensors. For descriptor matching, we developed an outlier filtering technique, which exploits the arrangements of relative Euclidean distances and angles between corresponding sets of feature points. We found that the positioning accuracy of the computed correspondence achieves pixel-level accuracy, where the image resolution is 12 cm. Furthermore, the developed approach is reliable when enough road markings are available in the data sets. We conclude that, in urban areas, the developed approach can reliably extract the features necessary to improve the MLSPC accuracy to pixel level.
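
    The corner-detection step can be reproduced in outline with OpenCV's standard Harris operator; the block size, aperture, response threshold and file names below are illustrative placeholders, and the authors' adaptive variant is not reproduced here.

      import cv2
      import numpy as np

      def harris_corners(gray: np.ndarray, block: int = 3, ksize: int = 3,
                         k: float = 0.04, rel_thresh: float = 0.05) -> np.ndarray:
          """Return (row, col) coordinates whose Harris response exceeds a relative threshold."""
          response = cv2.cornerHarris(np.float32(gray), block, ksize, k)
          rows, cols = np.where(response > rel_thresh * response.max())
          return np.column_stack([rows, cols])

      # Example usage on an MLSPC ortho-image patch and the corresponding aerial patch.
      mls_patch = cv2.imread("mls_ortho_patch.png", cv2.IMREAD_GRAYSCALE)
      air_patch = cv2.imread("aerial_patch.png", cv2.IMREAD_GRAYSCALE)
      if mls_patch is not None and air_patch is not None:
          print(len(harris_corners(mls_patch)), len(harris_corners(air_patch)))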

  13. Improvement and automatization of a proportional alpha-beta counting system - FAG

    International Nuclear Information System (INIS)

    An alpha and beta counting system (FAG) for planchette samples is operated at the Health Physics Department's laboratory of the NRCN. The original operation mode of the system was based on manual tasks handled by the FHT 1100 electronics. An option for basic computer keyboard operation was available too. A computer with an appropriate I/O card was connected to the system and a new operating program was developed which enables fully automatic control of the various components. The program includes activity calculations and statistical checks as well as data management. A bar-code laser system for sample number reading was integrated into the alpha-beta automatic counting system. Sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. (authors)

  14. Towards a complete Feynman diagrams automatic computation system

    CERN Document Server

    Perret-Gallix, D

    1995-01-01

    Complete Feynman diagram automatic computation systems are now coming of age after many years of development. They are made available to the high energy physics community through user-friendly interfaces. Theorists and experimentalists can benefit from these powerful packages for speeding up time consuming calculations and for preparing event generators. The general architecture of these packages is presented and the current development of the one-loop diagrams extension is discussed. A rapid description of the prominent packages and tools is then proposed. Finally, the necessity for defining a standardization scheme is heavily stressed for the benefit of developers and users.

  15. Concentrate composition for Automatic Milking Systems - Effect on milking frequency

    DEFF Research Database (Denmark)

    Madsen, J; Weisbjerg, Martin Riis; Hvelplund, Torben

    2010-01-01

    The purpose of this study was to investigate the potential of affecting milking frequency in an Automatic Milking System (AMS) by changing ingredient composition of the concentrate fed in the AMS. In six experiments, six experimental concentrates were tested against a Standard concentrate all...... the Standard concentrate. A marked effect was found on the number of visits of the cows in the AMS and the subsequent milk production in relation to composition of the concentrate. The composition of the concentrates also influenced the composition of the milk and the MR intake. Based on the overall responses...

  16. Robust Fallback Scheme for the Danish Automatic Voltage Control System

    DEFF Research Database (Denmark)

    Qin, Nan; Dmitrova, Evgenia; Lund, Torsten;

    2015-01-01

    This paper proposes a fallback scheme for the Danish automatic voltage control system. It is activated in case the local station loses telecommunication with the control center and/or the local station voltage violates the acceptable operational limits. It cuts in/out switchable and tap......-able shunts to maintain the voltage locally. The fallback scheme is fully self-regulated according to the predefined triggering logic. In order to maintain robustness and avoid many shunts being triggered within a short period, an inverse time characteristic is used to trigger switching one by one. This scheme...

  17. Fecal-Indicator Bacteria and Escherichia coli Pathogen Data Collected Near a Novel Sub-Irrigation Water-Treatment System in Lenawee County, Michigan, June-November 2007

    Science.gov (United States)

    Duris, Joseph W.; Beeler, Stephanie

    2008-01-01

    The U.S. Geological Survey, in cooperation with the Lenawee County Conservation District in Lenawee County, Mich., conducted a sampling effort over a single growing season (June to November 2007) to evaluate the microbiological water quality around a novel livestock reservoir wetland sub-irrigation system. Samples were collected and analyzed for fecal coliform bacteria, Escherichia coli (E. coli) bacteria, and six genes from pathogenic strains of E. coli. A total of 73 water-quality samples were collected on nine occasions from June to November 2007. These samples were collected within the surface water, shallow ground water, and the manure-treatment system near Bakerlads Farm near Clayton in Lenawee County, Mich. Fecal coliform bacteria concentrations ranged from 10 to 1.26 million colony forming units per 100 milliliters (CFU/100 mL). E. coli bacteria concentrations ranged from 8 to 540,000 CFU/100 mL. Data from the E. coli pathogen analysis showed that 73 percent of samples contained the eaeA gene, 1 percent of samples contained the stx2 gene, 37 percent of samples contained the stx1 gene, 21 percent of samples contained the rfbO157 gene, and 64 percent of samples contained the LTIIa gene.

  18. Semi-automatic Annotation System for OWL-based Semantic Search

    Directory of Open Access Journals (Sweden)

    C.-H. Liu

    2009-11-01

    Full Text Available Current keyword search engines such as Google and Yahoo return an enormous number of unsuitable results. A possible solution is to annotate textual web data with semantics, enabling semantic search rather than keyword search. However, purely manual annotation is very time-consuming, and searching for a high-level concept such as a metaphor is not possible if the annotation is done at a low abstraction level. We therefore present a semi-automatic annotation system comprising an automatic annotator and a manual annotator. Against the web ontology language (OWL) terms defined in Protégé, the former annotates the textual web data using the Knuth-Morris-Pratt (KMP) algorithm, while the latter allows a user to use the terms to annotate metaphors at a high abstraction level. The resulting semantically enhanced textual web document can be semantically processed by other web services such as the information retrieval system and the recommendation system shown in our example.
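
    For reference, the KMP string matcher the automatic annotator relies on can be written compactly; the sketch below is a generic textbook version, not the paper's implementation, and the example strings are invented.

```python
def kmp_find_all(text, pattern):
    """Return start indices of all occurrences of pattern in text (Knuth-Morris-Pratt)."""
    # Build the failure table: longest proper prefix of pattern that is also a suffix.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, reusing the table to skip redundant comparisons.
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

# e.g. locating every occurrence of an OWL term in a page's text before tagging it
print(kmp_find_all("semantic search needs semantic markup", "semantic"))  # [0, 22]
```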

  19. Data Collection System for Laser Power Based on LabVIEW

    Institute of Scientific and Technical Information of China (English)

    薛竣文; 裴雪丹; 苏秉华; 孙鲁; 赵慧元; 苏禹

    2012-01-01

    A data collection system for laser power, based on the LabVIEW graphical design language, is designed to monitor the long-term power stability of a diode-pumped all-solid-state laser. A laser power meter built around a silicon photocell converts the optical quantity into an electrical one; the filter, attenuation glass and ground glass must be selected carefully so that the photocell works in its unsaturated linear region. An STC89C52RC microcontroller processes the acquired signal, and the computer communicates with the microcontroller over a serial port. The data collection interface is programmed in LabVIEW. The resulting system realizes long-term monitoring of laser power stability.
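
    The acquisition chain above (photocell meter, STC89C52RC microcontroller, serial link, LabVIEW front end) is built around a simple serial polling loop. Purely as an illustration, a minimal Python equivalent of that loop might look like the following; the port name, baud rate and one-reading-per-line ASCII frame are assumptions, not details from the paper, and the actual system uses LabVIEW rather than Python.

```python
# Illustration only: poll a serial-connected power meter and append timestamped
# readings to a CSV log. Port name, baud rate and frame format are assumptions.
import time
import serial  # pyserial

with serial.Serial("COM3", 9600, timeout=1) as port, open("laser_power_log.csv", "a") as log:
    for _ in range(3600):                                   # e.g. roughly one hour at ~1 Hz
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                                        # read timed out, nothing received
        log.write(f"{time.time():.1f},{float(line)}\n")     # timestamp, power reading
        log.flush()
```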

  20. AROMA: Automatic Generation of Radio Maps for Localization Systems

    CERN Document Server

    Eleryan, Ahmed; Youssef, Moustafa

    2010-01-01

    WLAN localization has become an active research field recently. Due to the wide WLAN deployment, WLAN localization provides ubiquitous coverage and adds to the value of the wireless network by providing the location of its users without using any additional hardware. However, WLAN localization systems usually require constructing a radio map, which is a major barrier of WLAN localization systems' deployment. The radio map stores information about the signal strength from different signal strength streams at selected locations in the site of interest. Typical construction of a radio map involves measurements and calibrations making it a tedious and time-consuming operation. In this paper, we present the AROMA system that automatically constructs accurate active and passive radio maps for both device-based and device-free WLAN localization systems. AROMA has three main goals: high accuracy, low computational requirements, and minimum user overhead. To achieve high accuracy, AROMA uses 3D ray tracing enhanced wi...

  1. Goal-Oriented Data Collection Framework in Configuration Projects

    DEFF Research Database (Denmark)

    Shafiee, Sara; Hvam, Lars; Kristjansdottir, Katrin

    2015-01-01

    This article proposes a systematic framework for data collection when executing Product Configuration System (PCS) projects. Since data collection in PCS projects is one of the most time-consuming tasks, a systematic framework to handle and manage the large amount of complex data in the early stages...... of the PCS project is needed. The framework was developed based on the current literature in the field and revised during testing at a case company. The framework has proven to provide a structured approach to data collection, which saved the company both time and money in the initial phases of the PCS...... project. The framework consists of five steps: establishing a goal and the methods for stakeholder analysis, categorizing and grouping the data collection, prioritizing products and functionalities, collection and validation of the data by domain experts, and finally analysis, documentation...

  2. Application of an automatic process-data acquisition and analysis system for a tandem cold mill

    Institute of Scientific and Technical Information of China (English)

    侯永刚; 秦大伟; 费静; 张岩; 刘宝权; 宋君

    2012-01-01

    Modern tandem cold rolling lines for strip steel are characterized by high production speed and high control precision, so in actual production a data acquisition system that can efficiently handle the various process data of the mill is required. For the tandem cold mill of Angang Cold-Rolled Steel Sheet (Putian) Co., Ltd., a data acquisition system was built with PDA equipment from iba, which collects, monitors, records and analyzes all kinds of running process data of the tandem cold mill in real time and at high speed. This system provides strong data support for production engineers in the rapid diagnosis of tandem cold mill faults.

  3. Physical and biological data collected along the Texas, Mississippi, and Florida Gulf coasts in the Gulf of Mexico as part of the Harmful Algal BloomS Observing System from 19 Aug 1953 to 11 July 2014 (NODC Accession 0120767)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — HABSOS (Harmful Algal BloomS Observing System) is a data collection and distribution system for harmful algal bloom (HAB) information in the Gulf of Mexico. The...

  4. Automatic Creation of quality multi-word Lexica from noisy text data

    OpenAIRE

    Frontini, Francesca; Quochi, Valeria; Rubino, Francesco

    2012-01-01

    This paper describes the design of a tool for the automatic creation of multi-word lexica that is deployed as a web service and runs on automatically web-crawled data within the framework of the PANACEA platform. The main purpose of our task is to provide a (computationally "light") tool that creates a full high quality lexical resource of multi-word items. Within the platform, this tool is typically inserted in a work flow whose first step is automatic web-crawling. Therefore, the input data...

  5. A Study of Applications of Multiagent System Specifications and the Key Techniques in Automatic Abstracts System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this thesis, multiagent system specifications, multiagent system architectures, agent communication languages and agent communication protocols, and automatic abstracting based on multiagent technologies are studied. Some problems concerning the design and realization of automatic abstracting systems based on multiagent technologies are studied, too. Chapter 1 shows the significance and objectives of the thesis, summarizes its main contents, and presents the innovations of the thesis. Some basic concepts of agents and multiagent systems are studied in Chapter 2. The definitions of agents and multiagent systems are given, and the theory, technologies and applications of multiagent systems are summarized. Furthermore, some important research trends of multiagent systems are set forward. Multiagent system specifications are studied in Chapter 3. MAS/KIB, a multiagent system specification, is built using mental states such as K (Know), B (Belief) and I (Intention); its grammar and semantics are discussed, axioms and inference rules are given, and some properties are researched. We also compare MAS/KIB with other existing specifications. MAS/KIB has the following characteristics: (1) each agent has its own world outlook; (2) there is no global data in the system; (3) processes of state changes are used as indexes to systems; (4) it has the characteristics of not only time-series logic but also dynamic logic; and (5) interactive actions are included. The architectures of multiagent systems are studied in Chapter 4. First, we review some typical architectures of multiagent systems: agent network architecture, agent federated architecture, agent blackboard architecture, and the Foundation for Intelligent Physical Agents (FIPA) architecture. For the first time, we set forward and study the layering and partitioning models of the architectures of multiagent systems, organizing architecture models, and the interoperability architecture model of multiagent

  6. An Automatic and Real-time Restoration of Gamma Dose Data by Radio Telemetry

    International Nuclear Information System (INIS)

    An on-line gamma monitoring system based on high-pressurized ionization chambers has been used for determining airborne doses surrounding the HANARO research reactor at KAERI (Korea Atomic Energy Research Institute). It is composed of a network of six monitoring stations and an on-line computer system. It is operated by radio telemetry at a radio frequency of 468.8 MHz, which transmits the real-time dose data measured by a remote ion chamber to the central computer at intervals from ten seconds down to seconds. Although radio telemetry has several advantages, such as effective and economical transmission, there is one main problem: data loss can happen because each monitoring post stores only 300 radiation data points, which cover only the previous 50 minutes of sequential data in the case of a recording interval of 10 seconds. It is possible to restore the lost data by an off-line process such as a floppy disk or portable memory disk, but this is an ineffective method for a real-time monitoring system. Restoration, storage, and display of the current data as well as the lost data are also difficult in the present system. In this paper, an automatic and real-time restoration method by radio telemetry is introduced.

  7. Automatic system for measuring the zirconium liner and Zircaloy-2 thickness of zirconium liner tubes

    International Nuclear Information System (INIS)

    This paper reports on an automatic system to measure the zirconium liner thickness and Zircaloy-2 thickness of Zircaloy tubes with a zirconium liner for nuclear reactors. This system uses an electromagnetic probe connected to a data processing unit for measuring the liner thickness, an ultrasonic inspection system for measuring the wall-thickness, and a computer for calculating the Zircaloy-2 thickness from the liner thickness and wall-thickness. Fully automatic measurements on zirconium liner thickness and Zircaloy-2 thickness are performed with high accuracy to an order of 2 μm. This newly developed system is very useful in assuring the liner layer and Zircaloy-2 thickness in the production of high-quality cladding tubes

  8. Automatic Conveyor System with In-Process Sorting Mechanism using PLC and HMI System

    Directory of Open Access Journals (Sweden)

    Y V Aruna

    2015-11-01

    Full Text Available Programmable logic controllers are widely used in many manufacturing processes such as machinery, packaging, material handling and automatic assembly. They are a special type of microprocessor-based controller used for any application that needs electrical control, including lighting control and HVAC control systems. An automatic conveyor system is a computerized method of controlling and managing the sorting mechanism while maintaining the efficiency of the plant and the quality of the products. The HMI for the automatic conveyor system is considered the primary way of controlling each operation; text displays are available as well as graphical touch screens, used in touch panels and for local monitoring of machines. This paper deals with the efficient use of a PLC in an automatic conveyor system and with building accuracy into it.

  9. 30 CFR 75.1103-4 - Automatic fire sensor and warning device systems; installation; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensor and warning device...-UNDERGROUND COAL MINES Fire Protection § 75.1103-4 Automatic fire sensor and warning device systems; installation; minimum requirements. (a) Effective December 31, 2009, automatic fire sensor and warning...

  10. Collection and Management of Shop-floor Controller Data

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    This paper describes shop-floor data collection and management. An architecture is presented for a shop-floor data collection and management system based on the analysis of the features of these data. Two critical aspects of the system are discussed in detail: the various communication protocols between computers and machines, and the real-time demands of the shop-floor controller.

  11. Reliability of Routinely Collected Hospital Data for Child Maltreatment Surveillance

    OpenAIRE

    Campbell Margaret; Waller Garry S; Scott Debbie A; McKenzie Kirsten

    2011-01-01

    Abstract Background Internationally, research on child maltreatment-related injuries has been hampered by a lack of available routinely collected health data to identify cases, examine causes, identify risk factors and explore health outcomes. Routinely collected hospital separation data coded using the International Classification of Diseases and Related Health Problems (ICD) system provide an internationally standardised data source for classifying and aggregating diseases, injuries, causes...

  12. Automatic processing and modeling of GPR data for pavement thickness and properties

    Science.gov (United States)

    Olhoeft, Gary R.; Smith, Stanley S., III

    2000-04-01

    A GSSI SIR-8 with 1 GHz air-launched horn antennas has been modified to acquire data from a moving vehicle. Algorithms have been developed to acquire the data, and to automatically calibrate, position, process, and full waveform model it without operator intervention. Vehicle suspension system bounce is automatically compensated (for varying antenna height). Multiple scans are modeled by full waveform inversion that is remarkably robust and relatively insensitive to noise. Statistical parameters and histograms are generated for the thickness and dielectric permittivity of concrete or asphalt pavements. The statistical uncertainty with which the thickness is determined is given with each thickness measurement, along with the dielectric permittivity of the pavement material and of the subgrade material at each location. Permittivities are then converted into equivalent density and water content. Typical statistical uncertainties in thickness are better than 0.4 cm in 20 cm thick pavement. On a Pentium laptop computer, the data may be processed and modeled to have cross-sectional images and computed pavement thickness displayed in real time at highway speeds.
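
    The thickness and permittivity figures quoted above rest on the standard relation between two-way radar travel time, relative permittivity and layer thickness. As a hedged aside (this is not the authors' full-waveform inversion code, and the permittivity values are only typical published ranges), the arithmetic looks like this:

```python
# Back-of-envelope GPR relation: layer thickness from two-way travel time and
# relative dielectric permittivity, d = c * t / (2 * sqrt(eps_r)).
C = 0.299792458  # speed of light in m/ns

def layer_thickness_m(two_way_time_ns, eps_r):
    """eps_r is roughly 4-8 for asphalt and 6-11 for concrete (typical ranges)."""
    return C * two_way_time_ns / (2.0 * eps_r ** 0.5)

# 3.5 ns two-way time in asphalt with eps_r = 6.5 gives roughly 0.21 m of pavement.
print(round(layer_thickness_m(3.5, 6.5), 3))
```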

  13. Automatic Data Filter Customization Using a Genetic Algorithm

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as the Fischer discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can be simply filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire data and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values to save needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
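
    A toy version of the threshold-genome idea can make the mechanism concrete. In the sketch below, synthetic data stand in for the 50-dimensional sounding features, the genome is one (left, right) pair per dimension, and none of the numbers come from the paper; it is a minimal illustration, not the JPL implementation.

```python
# Toy genetic search over per-dimension left/right rejection thresholds.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                 # toy stand-in for sounding features
bad = (X[:, 2] > 1.0) | (X[:, 4] < -1.2)      # toy label: runs that would fail

def fitness(genome):
    """genome[:, 0] = left thresholds, genome[:, 1] = right thresholds."""
    lo, hi = genome[:, 0], genome[:, 1]
    rejected = ((X < lo) | (X > hi)).any(axis=1)
    # reward rejecting bad runs, penalise rejecting good ones
    return int((rejected & bad).sum()) - 3 * int((rejected & ~bad).sum())

# start with wide-open thresholds plus a little noise
pop = [np.column_stack((np.full(5, -9.0), np.full(5, 9.0))) + rng.normal(scale=0.5, size=(5, 2))
       for _ in range(40)]

for _ in range(60):                           # crude (mu + lambda) evolution loop
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [p + rng.normal(scale=0.2, size=p.shape) for p in parents for _ in range(3)]

best = max(pop, key=fitness)
print(fitness(best))
```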

  14. An investigation of prior knowledge in Automatic Music Transcription systems.

    Science.gov (United States)

    Cazau, Dorian; Revillon, Guillaume; Krywyk, Julien; Adam, Olivier

    2015-10-01

    Automatic transcription of music is a long-studied research field with many operational systems available commercially. In this paper, a generic transcription system able to host various prior knowledge parameters has been developed, followed by an in-depth investigation of their impact on music transcription. Explicit links between musical knowledge and algorithmic formalism have been made. Musical knowledge covers classes of timbre, musicology, and playing style of an instrument repertoire. An evaluation sound corpus gathering musical pieces played by human performers from three different instrument repertoires, namely, classical piano, steel-string acoustic guitar, and the marovany zither from Madagascar, has been developed. The different components of musical knowledge have been successively incorporated in a complete transcription system, consisting mainly of a Probabilistic Latent Component Analysis algorithm post-processed with a Hidden Markov Model, and their impact on transcription results have been comparatively evaluated. PMID:26520339

  15. Modeling of a Multiple Digital Automatic Gain Control System

    Institute of Scientific and Technical Information of China (English)

    WANG Jingdian; LU Xiuhong; ZHANG Li

    2008-01-01

    Automatic gain control (AGC) has been used in many applications. The key features of AGC, including a steady state output and static/dynamic timing response, depend mainly on key parameters such as the reference and the filter coefficients. A simple model developed to describe AGC systems based on several simple assumptions shows that AGC always converges to the reference and that the timing constant depends on the filter coefficients. Measures are given to prevent oscillations and limit cycle effects. The simple AGC system is adapted to a multiple AGC system for a TV tuner in a much more efficient model. Simulations using the C language are 16 times faster than those with MATLAB, and 10 times faster than those with a mixed register transfer level (RTL)-simulation program with integrated circuit emphasis (SPICE) model.
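
    A toy simulation of a first-order AGC loop illustrates the two properties stated above: the output level settles at the reference, and the loop (filter) coefficient sets how quickly it gets there. The signal and parameter values below are illustrative only and are not taken from the paper.

```python
# Toy discrete-time AGC loop: gain is nudged so the measured output level tracks ref.
import numpy as np

rng = np.random.default_rng(1)
x = 0.3 * rng.standard_normal(5000)       # input signal with unknown level
ref, mu = 1.0, 0.05                       # target output level, loop coefficient
gain, levels = 1.0, []

for sample in x:
    y = gain * sample
    gain += mu * (ref - abs(y))           # first-order loop: move gain against level error
    levels.append(abs(y))

print(np.mean(levels[-500:]))             # approaches ref once the loop has settled
```

    Raising mu shortens the settling time at the cost of a noisier steady-state gain, which is the time-constant trade-off the abstract refers to.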

  16. Automatic Segmentation of Raw LIDAR Data for Extraction of Building Roofs

    OpenAIRE

    Mohammad Awrangjeb; Fraser, Clive S.

    2014-01-01

    Automatic extraction of building roofs from remote sensing data is important for many applications, including 3D city modeling. This paper proposes a new method for automatic segmentation of raw LIDAR (light detection and ranging) data. Using the ground height from a DEM (digital elevation model), the raw LIDAR points are separated into two groups. The first group contains the ground points that form a “building mask”. The second group contains non-ground points that are clustered using the b...

  17. Design of a Data Collection System for a Dynamic Electronic Railway Scale

    Institute of Scientific and Technical Information of China (English)

    徐栋; 徐海贤; 崔婷婷

    2012-01-01

    A design scheme for the data collection system of a dynamic electronic railway scale is proposed, focusing on the hardware design of the weighing channel, which largely determines system performance. The system uses an AD625 programmable-gain amplifier and a second-order low-pass filter circuit for preliminary processing of the sensor signal, and an optically isolated analog-input interface card (PCI-8325B) for A/D conversion. The converted data are sent over the PCI bus to a front-end host for software processing, and the processed data are then forwarded through a network bridge or modem to a terminal computer for unified data management. Test results show that the system meets practical engineering requirements.

  18. MicroED data collection and processing

    Energy Technology Data Exchange (ETDEWEB)

    Hattne, Johan; Reyes, Francis E.; Nannenga, Brent L.; Shi, Dan; Cruz, M. Jason de la [Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147 (United States); Leslie, Andrew G. W. [Medical Research Council Laboratory of Molecular Biology, Cambridge (United Kingdom); Gonen, Tamir, E-mail: gonent@janelia.hhmi.org [Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA 20147 (United States)

    2015-07-01

    The collection and processing of MicroED data are presented. MicroED, a method at the intersection of X-ray crystallography and electron cryo-microscopy, has rapidly progressed by exploiting advances in both fields and has already been successfully employed to determine the atomic structures of several proteins from sub-micron-sized, three-dimensional crystals. A major limiting factor in X-ray crystallography is the requirement for large and well ordered crystals. By permitting electron diffraction patterns to be collected from much smaller crystals, or even single well ordered domains of large crystals composed of several small mosaic blocks, MicroED has the potential to overcome the limiting size requirement and enable structural studies on difficult-to-crystallize samples. This communication details the steps for sample preparation, data collection and reduction necessary to obtain refined, high-resolution, three-dimensional models by MicroED, and presents some of its unique challenges.

  19. 78 FR 68908 - Proposed Information Collection (Veterans Transportation Service Data Collection); Activity...

    Science.gov (United States)

    2013-11-15

    ... AFFAIRS Proposed Information Collection (Veterans Transportation Service Data Collection); Activity... needed to evaluate the Veterans Transportation Service Data Collection program to ensure Veterans... Control No. 2900-NEW (Veterans Transportation Service Data Collection)'' in any correspondence. During...

  20. Collection of VLE data for acid gas - alkanolamine systems using Fourier transform infrared spectroscopy. Final report, September 29, 1990--September 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Bullin, J.A.; Rogers, W.J.

    1996-11-01

    This report describes research from September 29, 1990 through September 30, 1996, involving the development of a novel Fourier transform infrared (FTIR) spectroscopic apparatus and method for measuring vapor-liquid equilibrium (VLE) in systems of carbon dioxide and hydrogen sulfide with aqueous alkanolamine solutions. The original apparatus was developed and modified as it was used to collect VLE data on acid gas systems. Vapor and liquid calibrations were performed for spectral measurements of hydrogen sulfide and carbon dioxide in the vapor and in solution with aqueous diethanolamine (DEA) and methyldiethanolamine (MDEA). VLE measurements were made of systems of hydrogen sulfide and carbon dioxide in 20 wt% DEA at 50°C and 40°C, in 50 wt% and 23 wt% MDEA at 40°C, in 23 wt% MDEA at 50°C, and in 35 wt% MDEA + 5 wt% DEA and 35 wt% MDEA + 10 wt% DEA at 40°C and 50°C. Measurements were made of residual amounts of carbon dioxide in each VLE system. The new FTIR spectrometer is now a consistently working and performing apparatus.

  1. Environmental data collection using autonomous Wave Gliders

    OpenAIRE

    Hermsdorfer, Kathryn M.

    2014-01-01

    Approved for public release; distribution is unlimited The Sensor Hosting Autonomous Remote Craft (SHARC), also known as Wave Glider, is an autonomous ocean vehicle powered by wave motion. This slow-moving platform makes long-term deployments and environmental data collection feasible, especially in data sparse regions or hazardous environments. The standard SHARC hosts a meteorological station (Airmar PB200) that samples air pressure, temperature, wind speed and wind direction at 1.12 m. ...

  2. Real-time directional wave data collection

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.; Pednekar, P.S.

    The wave measurements carried out at 13 locations along the east and west coasts of India using directional waverider buoys are reported in this paper. The total number of buoy days is 4501, and data were collected on 4218 of these days...

  3. Training course 'Fisheries data collection and analysis'

    NARCIS (Netherlands)

    Heijden, van der P.G.M.

    2007-01-01

    Course description of the course “Fisheries data collection and analysis”, held from October 1st till October 19th 2007 and organised by the Programme for Capacity development & Institutional Change of Wageningen International in cooperation with Wageningen University – Aquaculture and Fisheries

  4. Data collection on risk factors in pregnancy

    NARCIS (Netherlands)

    Zetstra-van der Woude, Alethea Priscilla

    2016-01-01

    This thesis aims to investigate the different methods of data collection of risk factors in pregnancy. Several observational epidemiologic study designs were used to assess associations between risk factors and negative birth outcomes. We especially looked at the use of folic acid around pregnancy a

  5. An Introduction to BYOE Mobile Data Collection

    Science.gov (United States)

    Rocchio, Rose A.

    2014-01-01

    Smartphone ownership among college-aged Americans is high and growing, and many students own more than one mobile device. Such devices are increasingly incorporated into the academic lives of students, and the era of "bring your own everything" presents new opportunities and challenges for higher education. Mobile data collection is the…

  6. Automatic power distribution backup personal computer system for Hokuriku Electric Power Co., Inc.; Hokuriku Denryoku (kabu) haiden jidoka backup yo pasokon system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-01-10

    Hokuriku Electric Power Co., Inc., and Fuji Electric Co., Ltd., have jointly developed a personal computer system for automatic power distribution system backup which will monitor switches and perform independent operations in case of automatic power distribution system stoppage for example for inspection or maintenance. Under this system, backup operation is easily accomplished by switching the routine business personal computer from the in-house intranet to the automatic power distribution system LAN (local area network). The newly developed system is characterized by (1) its compatibility with a personal computer with Windows NT running thereon, (2) the downloading of data related to facilities that may be done via either intranet or automatic power distribution system, (3) its skeleton display of each power distribution line in the power distribution system chart, and (4) its capability of remote monitoring and control of pole equipment such as switches and SVR (step voltage regulator). (translated by NEDO)

  7. Automatic fracture density update using smart well data and artificial neural networks

    Science.gov (United States)

    Al-Anazi, A.; Babadagli, T.

    2010-03-01

    This paper presents a new methodology to continuously update and improve fracture network models. We begin with a hypothetical model whose fracture network parameters and geological information are known. After generating the "exact" fracture network with known characteristics, the data were exported to a reservoir simulator and simulations were run over a period of time. Intelligent wells equipped with downhole multiple pressure and flow sensors were placed throughout the reservoir and put into production. These producers were completed in different fracture zones to create a representative pressure and production response. We then considered a number of wells of which static (cores and well logs) and dynamic (production) data were used to model well fracture density. As new wells were opened, historical static and dynamic data from previous wells and static data from the new wells were used to update the fracture density using Artificial Neural Networks (ANN). The accuracy of the prediction model depends significantly on the representation of the available data of the existing fracture network. The importance of conventional data (surface production data) and smart well data prediction capability was also investigated. Highly sensitive input data were selected through a forward selection scheme to train the ANN. Well geometric locations were included as a new link in the ANN regression process. Once the relationship between fracture network parameters and well performance data was established, the ANN model was used to predict fracture density at newly drilled locations. Finally, an error analysis through a correlation coefficient and percentage absolute relative error performance was performed to examine the accuracy of the proposed inverse modeling methodology. It was shown that fracture dominated production performance data collected from both conventional and smart wells allow for automatically updating the fracture network model. The proposed technique helps
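
    As a hedged illustration of the regression step described above, the snippet below fits a small neural network mapping well data to fracture density; scikit-learn stands in for whatever ANN toolbox the authors used, and the feature set, array shapes and units are invented for the example.

```python
# Illustrative only: an MLP regressor mapping well location, static and dynamic
# data to fracture density, trained on synthetic placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# hypothetical columns: x, y (well location), log porosity, pressure drop, flow rate
X_train = rng.normal(size=(60, 5))
y_train = rng.uniform(0.1, 2.0, size=60)          # fracture density (fractures per metre)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)

X_new_well = rng.normal(size=(1, 5))              # static data at a newly drilled location
print(model.predict(X_new_well))                  # predicted fracture density
```

    In the workflow described above, such a model would be retrained each time a new well comes on line, so the fracture network estimate is updated automatically as data accumulate.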

  8. Model Driven Development of Data Sensitive Systems

    DEFF Research Database (Denmark)

    Olsen, Petur

    2014-01-01

    Model-driven development strives to use formal artifacts during the development process. Formal artifacts enable automatic analyses of some aspects of the system under development. This serves to increase the understanding of the (intended) behavior of the system as well as increasing error...... to the values of variables. This thesis strives to improve model-driven development of such data-sensitive systems. This is done by addressing three research questions. In the first we combine state-based modeling and abstract interpretation, in order to ease modeling of data-sensitive systems, while allowing...... efficient model-checking and model-based testing. In the second we develop automatic abstraction learning used together with model learning, in order to allow fully automatic learning of data-sensitive systems and thus learning of larger systems. In the third we develop an approach for modeling and model...

  9. 14 CFR 25.672 - Stability augmentation and automatic and power-operated systems.

    Science.gov (United States)

    2010-01-01

    ... system or in any other automatic or power-operated system which could result in an unsafe condition if...) The design of the stability augmentation system or of any other automatic or power-operated system... exceptional pilot skill or strength, by either the deactivation of the system, or a failed portion thereof,...

  10. 14 CFR 27.672 - Stability augmentation, automatic, and power-operated systems.

    Science.gov (United States)

    2010-01-01

    ... stability augmentation system or in any other automatic or power-operated system which could result in an... systems. (b) The design of the stability augmentation system or of any other automatic or power-operated system must allow initial counteraction of failures without requiring exceptional pilot skill or...

  11. AROMA-AIRWICK: a CHLOE/CDC-3600 system for the automatic identification of spark images and their association into tracks

    International Nuclear Information System (INIS)

    The AROMA-AIRWICK System for CHLOE, an automatic film scanning equipment built at Argonne by Donald Hodges, and the CDC-3600 computer is a system for the automatic identification of spark images and their association into tracks. AROMA-AIRWICK has been an outgrowth of the generally recognized need for the automatic processing of high energy physics data and the fact that the Argonne National Laboratory has been a center of serious spark chamber development in recent years

  12. Automatic Voltage Control (AVC) System under Uncertainty from Wind Power

    DEFF Research Database (Denmark)

    Qin, Nan; Abildgaard, Hans; Flynn, Damian;

    2016-01-01

    An automatic voltage control (AVC) system maintains the voltage profile of a power system in an acceptable range and minimizes the operational cost by coordinating the regulation of controllable components. Typically, all of the parameters in the optimization problem are assumed to be certain...... and constant in the decision making process. However, for high shares of wind power, uncertainty in the decision process due to wind power variability may result in an infeasible AVC solution. This paper proposes a voltage control approach which considers the voltage uncertainty from wind power productions....... The proposed method improves the performance and the robustness of a scenario based approach by estimating the potential voltage variations due to fluctuating wind power production, and introduces a voltage margin to protect the decision against uncertainty for each scenario. The effectiveness of the proposed...

  13. Human factors in automatic image retrieval system design and evaluation

    Science.gov (United States)

    Jaimes, Alejandro

    2006-01-01

    Image retrieval is a human-centered task: images are created by people and are ultimately accessed and used by people for human-related activities. In designing image retrieval systems and algorithms, or measuring their performance, it is therefore imperative to consider the conditions that surround both the indexing of image content and the retrieval. This includes examining the different levels of interpretation for retrieval, possible search strategies, and image uses. Furthermore, we must consider different levels of similarity and the role of human factors such as culture, memory, and personal context. This paper takes a human-centered perspective in outlining levels of description, types of users, search strategies, image uses, and human factors that affect the construction and evaluation of automatic content-based retrieval systems, such as human memory, context, and subjectivity.

  14. Adaptive neuro-fuzzy inference system based automatic generation control

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, S.H.; Etemadi, A.H. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran)

    2008-07-15

    Fixed gain controllers for automatic generation control are designed at nominal operating conditions and fail to provide best control performance over a wide range of operating conditions. So, to keep system performance near its optimum, it is desirable to track the operating conditions and use updated parameters to compute control gains. A control scheme based on artificial neuro-fuzzy inference system (ANFIS), which is trained by the results of off-line studies obtained using particle swarm optimization, is proposed in this paper to optimize and update control gains in real-time according to load variations. Also, frequency relaxation is implemented using ANFIS. The efficiency of the proposed method is demonstrated via simulations. Compliance of the proposed method with NERC control performance standard is verified. (author)

  15. Automatic Vehicle License Recognition Based on Video Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; CHEN Yang; HE Yinghua; WU Jun

    2006-01-01

    Traditional methods of license character extraction cannot meet the requirements of recognition accuracy and speed rendered by the video vehicular detection system.Therefore, a license plate localization method based on multi-scale edge detection and a character segmentation algorithm based on Markov random field model is presented.Results of experiments demonstrate that the method yields more accurate license character extraction in contrast to traditional localization method based on edge detection by difference operator and character segmentation based on threshold.The accuracy increases from 90% to 94% under preferable illumination, while under poor condition, it increases more than 5%.When the two improved algorithms are used, the accuracy and speed of automatic license recognition meet the system's requirement even under the noisy circumstance or uneven illumination.

  16. Automatic Meter Reading and Theft Control System by Using GSM

    Directory of Open Access Journals (Sweden)

    P. Rakesh Malhotra

    2013-04-01

    Full Text Available This paper deals with an automatic meter reading and theft control system for an energy meter. A current transformer is used to measure the total power consumption of a house or industrial premises, and the recorded reading is transmitted to the electricity board once every 60 days. A GSM module is used to transmit the energy meter reading. To prevent theft, an infrared sensor is placed in the screw portion of the energy meter seal; if the screw is removed from the meter, a message is sent to the electricity board. The measurement of the energy meter and monitoring of the IR sensor are done with a PIC microcontroller. The informative system will help the electricity board to monitor the entire supply and ensure correct billing without any mishap. This model reduces manual manipulation work and provides theft control.

  17. Entrance C - New Automatic Number Plate Recognition System

    CERN Multimedia

    2013-01-01

    Entrance C (Satigny) is now equipped with a latest-generation Automatic Number Plate Recognition (ANPR) system and a fast-action road gate. During the month of August, Entrance C will be continuously open from 7.00 a.m. to 7.00 p.m. (working days only). The security guards will open the gate as usual from 7.00 a.m. to 9.00 a.m. and from 5.00 p.m. to 7.00 p.m. For the rest of the working day (9.00 a.m. to 5.00 p.m.) the gate will operate automatically. Please observe the following points: stop at the STOP sign on the ground; position yourself next to the card reader for optimal recognition; motorcyclists must use their CERN card; cyclists may not activate the gate and should use the bicycle turnstile; keep a safe distance from the vehicle in front of you. If access is denied, please check that your vehicle regist...

  18. Enhancement of the automatic ultrasonic signal processing system using digital technology

    International Nuclear Information System (INIS)

    The objective of this study is to develop an automatic ultrasonic signal processing system which can be used in the inspection equipment that assesses the integrity of the reactor vessel, by enhancing the performance of the ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of the circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment, the development of the signal processing algorithm and the H/W of the data processing system, and the development of the specification for application programs and system S/W for the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state-of-the-art technology at home and abroad; 2) the development of the H/W and S/W of the data processing system based on these results. In particular, the H/W of the data processing system, which has the advantages of both digital and analog control through real-time digital signal processing, was developed using a DSP which can process the digital signal in real time, and not only the firmware of the data processing system for the peripherals but also the test algorithm of the specimen for calibration was developed. The application programs and the system S/W of the analysis/evaluation computer were developed. The developed equipment was verified by a performance test. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for nuclear industries can be expected through further studies on the H/W establishment for real applications and on developing the S/W specification of the analysis computer. (author)

  19. Enhancement of the automatic ultrasonic signal processing system using digital technology

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system which can be used in the inspection equipment that assesses the integrity of the reactor vessel, by enhancing the performance of the ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of the circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment, the development of the signal processing algorithm and the H/W of the data processing system, and the development of the specification for application programs and system S/W for the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state-of-the-art technology at home and abroad; 2) the development of the H/W and S/W of the data processing system based on these results. In particular, the H/W of the data processing system, which has the advantages of both digital and analog control through real-time digital signal processing, was developed using a DSP which can process the digital signal in real time, and not only the firmware of the data processing system for the peripherals but also the test algorithm of the specimen for calibration was developed. The application programs and the system S/W of the analysis/evaluation computer were developed. The developed equipment was verified by a performance test. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for nuclear industries can be expected through further studies on the H/W establishment for real applications and on developing the S/W specification of the analysis computer. (author)

  20. Application of a Two-Level Data Auditing System for Automatic Ambient Air Monitoring in Shanghai

    Institute of Scientific and Technical Information of China (English)

    包权

    2015-01-01

    A two-level (municipal and district/county) data auditing system for automatic ambient air monitoring has been developed jointly by the Shanghai Environmental Monitoring Center (SEMC) and a software company. The first-level audit of the automatic monitoring data in each jurisdiction is carried out by the district and county monitoring stations, while SEMC is responsible for the second-level audit. As an important part of the QA/QC system for ambient air monitoring, the establishment of the new auditing system has greatly enhanced the validity and quality of the automatic ambient air monitoring data in Shanghai.