WorldWideScience

Sample records for automatic data collection systems

  1. Automatic Weissenberg data collection system for time-resolved protein crystallography

    CERN Document Server

    Sakabe, N; Higashi, T; Igarashi, N; Suzuki, M; Watanabe, N; Sasaki, K

    2001-01-01

    A totally new type of fully automatic Weissenberg data-collection system called 'Galaxy' was developed and installed at the Photon Factory. The system consists of a rotated-inclined focusing monochromator, a screenless Weissenberg-type camera, an image reader, an eraser, a cassette-transportation mechanism, a control console and a safety system, with a high-speed computer network linking the control console, data-processing computers and data servers. Its special characteristics are a Weissenberg camera with a fully cylindrical cassette that can be rotated to exchange frames, a maximum of 36 images recorded per IP cassette, and a very high-speed IP reader with five reading heads. Since the frame-exchange time is only a few seconds, the system is applicable to time-resolved protein crystallography on a time scale of seconds to minutes.

  2. Discrete regulation of transfer function of a circuit in experimental data automatic collection and processing systems

    Science.gov (United States)

    Lyubashevskiy, G. S.; Petrov, A. A.; Sanayev, I. A.; Frishberg, V. E.

    1973-01-01

    A device for discrete control of the circuit transfer function in automatic analog data processing systems is reported; it matches the dynamic range of the vibration-level change to the signal range of the processing device's output. Experimental verification demonstrates that the maximum control time does not exceed 0.5 s for a frequency nonuniformity of about 10%.

  3. Automatic meta-data collection of STP observation data

    Science.gov (United States)

    Ishikura, S.; Kimura, E.; Murata, K.; Kubo, T.; Shinohara, I.

    2006-12-01

    For geoscience and STP (Solar-Terrestrial Physics) studies, a variety of observations have been made by satellites and ground-based observatories. These data are stored and managed by many organizations, but there is no common procedure or rule for providing and sharing the data files, so researchers have found it difficult to search and analyze the different types of data distributed over the Internet. To support such cross-over analyses of observation data, we have developed STARS (Solar-Terrestrial data Analysis and Reference System). STARS consists of a client application (STARS-app), a meta-database (STARS-DB), a portal Web service (STARS-WS) and a download-agent Web service (STARS DLAgent-WS). The STARS-DB holds directory information, access permissions, the protocol information needed to retrieve data files, mission/team/data hierarchy information and user information. Using the STARS-DB, STARS users can download observation data files without knowing where the files are located. We implemented the Portal-WS to retrieve meta-data from the meta-database. One reason for using a Web service is to cope with firewall restrictions, which have become stricter in recent years and make it difficult for the STARS client application to query the STARS-DB directly with SQL. Using the Web service, we were able to place the STARS-DB behind the Portal-WS and avoid exposing it on the Internet. STARS accesses the Portal-WS by sending a SOAP (Simple Object Access Protocol) request over HTTP and receives the meta-data as a SOAP response. The STARS DLAgent-WS provides clients with data files downloaded from the data sites. The files are served over a variety of protocols (e.g. FTP, HTTP, FTPS and SFTP), selected individually at each site. The clients send a SOAP request containing download-request messages and receive the observation data files as a SOAP response.
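    A minimal sketch of the kind of SOAP call the abstract describes, assuming a hypothetical `searchMetadata` operation (the real Portal-WS operation names and schema are not given here); the envelope is built with the standard library only:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_metadata_query(mission: str, start: str, end: str) -> bytes:
    """Build a SOAP 1.1 request envelope for a (hypothetical) STARS
    Portal-WS metadata search. Element names are illustrative only."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    query = ET.SubElement(body, "searchMetadata")   # hypothetical operation
    ET.SubElement(query, "mission").text = mission
    ET.SubElement(query, "startTime").text = start
    ET.SubElement(query, "endTime").text = end
    return ET.tostring(env, xml_declaration=True, encoding="utf-8")

request = build_metadata_query("GEOTAIL", "2005-01-01T00:00:00",
                               "2005-01-02T00:00:00")
# The client would POST `request` over HTTP with a
# "Content-Type: text/xml; charset=utf-8" header and parse the SOAP
# response body for the metadata records (no network call is made here).
```

Placing the database behind such a service means only HTTP/HTTPS needs to pass the firewall, which is the design motivation the abstract gives.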

  4. Design and implementation of automatic color information collection system

    Science.gov (United States)

    Ci, Wenjie; Xie, Kai; Li, Tong

    2015-12-01

    In liquid-crystal display (LCD) colorimetric characterization, the device-dependent RGB color space must be converted to a device-independent space such as CIEXYZ or CIELab; that is, the relationship between RGB and CIE values is established from measured device colors and the corresponding CIE data. To this end, automatic color-information acquisition software was designed. OpenGL is used for full-screen display, and a C++ program calls the EyeOne device library functions to calibrate the instrument, set the sample types, and perform sampling and storage. The software automatically drives monitors or projectors to display the set of sample colors and collects the corresponding CIE values. The RGB values of the sample colors and the acquired CIE values are stored in a text file, convenient for later extraction and analysis. Taking a cubic polynomial model as an example, each channel was sampled at 17 levels with this system, and 100 sets of test data were also collected. The model was obtained by the least-squares method. The average color difference is about 2.4874, well below the commonly required CIE2000 level of 6.00. The system saves sample-color acquisition time and improves the efficiency of LCD colorimetric characterization.
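    The per-channel least-squares fit the abstract describes can be sketched as follows; the drive levels, coefficients and the normalization to [0, 1] are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical calibration data: 17 levels per channel, as in the paper.
# Drive levels are normalized to [0, 1]; the response follows an assumed
# cubic so the sketch is self-contained (real data would come from the
# EyeOne measurements).
d = np.linspace(0.0, 1.0, 17)
true_coeffs = [0.42, -0.18, 0.65, 0.05]        # a, b, c, e (illustrative)
Y_measured = np.polyval(true_coeffs, d)

# Least-squares fit of the cubic model Y = a*d^3 + b*d^2 + c*d + e,
# the per-channel model family the abstract reports.
coeffs = np.polyfit(d, Y_measured, deg=3)
Y_fit = np.polyval(coeffs, d)
max_err = float(np.max(np.abs(Y_fit - Y_measured)))   # ~0 on clean data
```

On real measurements the residual would reflect display noise and model error, which is what the reported average color difference of ~2.49 summarizes.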

  5. The Automatic Data Collection System of Airborne Gamma-Ray Spectrometry

    Institute of Scientific and Technical Information of China (English)

    魏林; 曾国强; 葛良全

    2012-01-01

    A fully automatic airborne gamma-ray spectrometry data collection system was developed on an Advantech ACP-4001 integrated industrial control computer. Global clock synchronization is driven by the GPS pulse-per-second (PPS) signal; a high-speed parallel port handles bulk data transfer, a serial port acquires GPS positioning information, and PT100 platinum resistance temperature sensors collect temperature values. The software, developed on the LabWindows/CVI and Visual C++ 6.0 platforms, implements the low-level data-acquisition dynamic-link library and a friendly user interface. Experimental results show that the system records data accurately, quickly and completely, meeting the design goals.

  6. MAC, A System for Automatically IPR Identification, Collection and Distribution

    Science.gov (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected is the music industry, which has suffered huge losses in recent years as a result. This also affects how music-rights collecting and distributing societies operate to ensure correct music IPR identification, collection and distribution. In this article a system for automating IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio-fingerprinting technology. The paper presents the details of the system and a use-case scenario in which it is being used.

  7. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  8. ATCOM: Automatically Tuned Collective Communication System for SMP Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Meng-Shiou [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    Conventional implementations of collective communications are based on point-to-point communications, and their optimizations have focused on the efficiency of those communication algorithms. However, point-to-point communications are not the optimal choice for modern computing clusters of SMPs due to their two-level communication structure. In recent years, a few research efforts have investigated efficient collective communications for SMP clusters. This dissertation focuses on platform-independent algorithms and implementations in this area. There are two main approaches to implementing efficient collective communications for clusters of SMPs: using shared-memory operations for intra-node communications, and overlapping inter-node and intra-node communications. The former fully utilizes the hardware-based shared memory of an SMP, and the latter takes advantage of the inherent hierarchy of the communications within a cluster of SMPs. Previous studies focused on clusters of SMPs from certain vendors, and the proposed methods are not portable to other systems. Because the performance optimization issue is very complicated and the development process is very time-consuming, self-tuning, platform-independent implementations are highly desirable. As proven in this dissertation, such an implementation can significantly outperform other point-to-point based portable implementations and some platform-specific implementations. The dissertation describes in detail the architecture of the platform-independent implementation. There are four system components: shared-memory-based collective communications, overlapping mechanisms for inter-node and intra-node communications, a prediction-based tuning module and a micro-benchmark-based tuning module. Each component is carefully designed with the goal of automatic tuning in mind.
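    The two-level idea can be illustrated with a toy model: reduce within each SMP node first (the cheap shared-memory step), then combine the per-node partials across nodes. This is a conceptual sketch only, not the dissertation's MPI implementation:

```python
from functools import reduce
import operator

def two_level_allreduce(node_values, op=operator.add):
    """Toy model of an SMP-cluster reduction: each node first reduces the
    values of its local processes (the 'shared memory' step), then the
    per-node partial results are combined across nodes (the 'inter-node'
    step). `node_values` is a list of lists: one inner list per SMP node."""
    intra = [reduce(op, vals) for vals in node_values]   # intra-node, cheap
    total = reduce(op, intra)                            # inter-node, costly
    return total

# Three 4-way SMP nodes contributing one value per process:
cluster = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
result = two_level_allreduce(cluster)   # equals the flat sum
```

The point of the hierarchy is that only one message per node crosses the (slow) network, while the rest of the combining happens in (fast) shared memory.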

  9. ATCOM: Automatically Tuned Collective Communication System for SMP Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Meng-Shiou Wu

    2005-12-17

    Conventional implementations of collective communications are based on point-to-point communications, and their optimizations have focused on the efficiency of those communication algorithms. However, point-to-point communications are not the optimal choice for modern computing clusters of SMPs due to their two-level communication structure. In recent years, a few research efforts have investigated efficient collective communications for SMP clusters. This dissertation focuses on platform-independent algorithms and implementations in this area. There are two main approaches to implementing efficient collective communications for clusters of SMPs: using shared-memory operations for intra-node communications, and overlapping inter-node and intra-node communications. The former fully utilizes the hardware-based shared memory of an SMP, and the latter takes advantage of the inherent hierarchy of the communications within a cluster of SMPs. Previous studies focused on clusters of SMPs from certain vendors, and the proposed methods are not portable to other systems. Because the performance optimization issue is very complicated and the development process is very time-consuming, self-tuning, platform-independent implementations are highly desirable. As proven in this dissertation, such an implementation can significantly outperform other point-to-point based portable implementations and some platform-specific implementations. The dissertation describes in detail the architecture of the platform-independent implementation. There are four system components: shared-memory-based collective communications, overlapping mechanisms for inter-node and intra-node communications, a prediction-based tuning module and a micro-benchmark-based tuning module. Each component is carefully designed with the goal of automatic tuning in mind.

  10. A geological and geophysical data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    Sudhakar, T.; Afzulpurkar, S

    A geological and geophysical data collection system using a Personal Computer is described below. The system stores data obtained from various survey systems typically installed in a charter vessel and can be used for similar applications on any...

  11. Automatic fault detection on BIPV systems without solar irradiation data

    CERN Document Server

    Leloux, Jonathan; Luna, Alberto; Desportes, Adrien

    2014-01-01

    BIPV systems are small PV generation units spread out over the territory with very diverse characteristics, which makes cost-effective monitoring, fault detection, performance analysis, operation and maintenance difficult. As a result, many problems affecting BIPV systems go undetected. To carry out effective automatic fault detection, we need a performance indicator that is reliable and that can be applied to many PV systems at very low cost. Existing approaches to analyzing the performance of PV systems are often based on the Performance Ratio (PR), whose accuracy depends on good solar irradiation data, which in turn can be very difficult to obtain or cost-prohibitive for the BIPV owner. We present an alternative fault detection procedure based on a performance indicator that can be constructed solely from the energy production data measured at the BIPV systems. This procedure does not require operating-conditions data, such as solar ...
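    A simplified sketch of such an energy-only indicator, assuming a peer-median comparison across the fleet (the paper's actual indicator is not reproduced here):

```python
import numpy as np

def flag_underperformers(daily_energy, threshold=0.8):
    """Flag BIPV systems whose production is persistently low relative to
    their peers, using energy data alone (no irradiation input).
    `daily_energy` is an (n_systems, n_days) array of yields already
    normalized by installed capacity (kWh/kWp). Peer-median scoring and
    the 0.8 threshold are illustrative assumptions."""
    fleet_median = np.median(daily_energy, axis=0)   # per-day peer level
    ratio = daily_energy / fleet_median              # peer-relative index
    score = np.median(ratio, axis=1)                 # robust per-system score
    return score < threshold

rng = np.random.default_rng(0)
fleet = rng.uniform(3.5, 4.5, size=(10, 30))   # 10 healthy systems, 30 days
fleet[3] *= 0.5                                # system 3 degraded by 50%
flags = flag_underperformers(fleet)            # only system 3 is flagged
```

Because every system sees roughly the same weather on a given day, the peer median acts as a stand-in for the missing irradiation measurement.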

  12. Automatic data acquisition system for a photovoltaic solar plant

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.; Barrio, C.L.; Guerra, A.G.

    1986-01-01

    An autonomous monitoring system for photovoltaic solar plants is described. The system is able to collect data about the plant's physical and electrical characteristics and also about the environmental conditions. It may present the results on a display, if requested, but its main function is measuring periodically a set of parameters, including several points in the panel I-V characteristics, in an unattended mode. The data are stored on a magnetic tape for later processing on a computer. The system hardware and software are described, as well as their main functions.

  13. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Bowler, Matthew W., E-mail: mbowler@embl.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 avenue des Martyrs, F-38042 Grenoble (France); Université Grenoble Alpes-EMBL-CNRS, 71 avenue des Martyrs, F-38042 Grenoble (France); Nurizzo, Didier, E-mail: mbowler@embl.fr; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine [European Synchrotron Radiation Facility, 71 avenue des Martyrs, F-38043 Grenoble (France)

    2015-10-03

    MASSIF-1 (ID30A-1) is a new beamline dedicated to the completely automatic characterization and data collection from crystals of biological macromolecules. MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined.

  14. Monitoring the Performance of the Pedestrian Transfer Function of Train Stations Using Automatic Fare Collection Data

    NARCIS (Netherlands)

    Van den Heuvel, J.P.A.; Hoogenraad, J.H.

    2014-01-01

    Over recent years, all train stations in the Netherlands have been equipped with automatic fare collection gates and/or validators, and all public transport passengers use a smart card to pay their fare. In this paper we present a monitor for the performance of the pedestrian transfer function of train stations.

  15. Argus phase II optical data collection system

    Science.gov (United States)

    Wasson, Wayne E.

    1996-11-01

    The Argus aircraft is a highly modified NC-135E fitted with an infrared and ultraviolet-visible sensor suite for radiometric and spectral data collection. Each suite is operated independently with its own separate gimbal for precision pointing, telescope, and relay optics. The system includes a silica window for the ultraviolet-visible, and a zinc selenide window for the infrared. The entire system was developed and fabricated in-house at the Phillips Laboratory. All sensors are calibrated as a system onboard the aircraft through a unique facility called the aircraft optical calibration facility. The data is all recorded digitally, and can be transferred to secure data reduction facilities via optical fiber. The system is modular, in that the ultraviolet-visible and infrared benches can be separated, or the entire system can be quickly removed to allow for the introduction of other sensor suites or systems. The gimbals and telescopes can be used independently of the rest of the system. The aircraft is also fitted with an anemometry system, which can be operated independently of the sensor systems. This aircraft is capable of many types of missions, and will soon be fitted with a LIDAR system for remote sensing. The philosophy in building the system is to make it capable of quick changes during mission.

  16. Automatic Boat Identification System for VIIRS Low Light Imaging Data

    Directory of Open Access Journals (Sweden)

    Christopher D. Elvidge

    2015-03-01

    The ability of satellite sensors to detect lit fishing boats has been known since the 1970s. However, use of the observations has been limited by the lack of an automatic algorithm for reporting the location and brightness of offshore lighting features arising from boats. An examination of lit fishing-boat features in Visible Infrared Imaging Radiometer Suite (VIIRS) day/night band (DNB) data indicates that the features are essentially spikes. We have developed a set of algorithms for automatic detection of spikes and characterization of the sharpness of spike features. A spike-detection algorithm generates a list of candidate boat detections. A second algorithm measures the height of the spikes, discarding ionospheric energetic-particle detections and rating boat detections as either strong or weak. A sharpness index is used to label boat detections that appear blurry owing to the scattering of light by clouds. The candidate spikes are then filtered to remove features on land and gas flares. A validation study using analyst-selected boat detections found that the automatic algorithm detected 99.3% of the reference pixel set. VIIRS boat-detection data can provide fishery agencies with up-to-date information on fishing-boat activity and changes in that activity in response to new regulations and enforcement regimes. The data can indicate illegal fishing activity in restricted areas and incursions across Exclusive Economic Zone (EEZ) boundaries. VIIRS boat detections occur widely offshore from East and Southeast Asia, South America and several other regions.
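    A toy one-dimensional version of the spike-detection step might look as follows; the background window, threshold and sharpness measure are illustrative assumptions, not the published algorithm's parameters:

```python
import numpy as np

def detect_spikes(row, height_thresh=10.0):
    """Toy 1-D sketch of spike detection in a DNB scan line: a pixel is a
    candidate boat detection when it stands well above the local
    background, and a sharpness measure compares it to its immediate
    neighbours (blurry, cloud-scattered features score low)."""
    padded = np.pad(row, 2, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, 5)
    background = np.median(windows, axis=1)         # local background level
    height = row - background                       # spike height
    spikes = np.flatnonzero(height > height_thresh)
    sharpness = [row[i] / max(row[i - 1], row[i + 1], 1e-9)
                 for i in spikes if 0 < i < len(row) - 1]
    return spikes, sharpness

scan = np.full(100, 2.0)       # flat dark-ocean background
scan[[20, 60]] = 50.0          # two boat-like spikes
spikes, sharpness = detect_spikes(scan)
```

The real algorithm works on 2-D imagery and adds the land and gas-flare filters described above, but the spike/background contrast is the core idea.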

  17. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy programme, this number is expected to grow to over 100 systems in the next few years. New owners of PV systems are obliged to report the electricity produced to the authorities at least once a year. In this work we have studied different ways to simplify the collection of data, defining four methods: 1. Conversion of energy-meter readings taken at arbitrary intervals into monthly values. 2. Handling data obtained from the monitoring systems provided by different inverter manufacturers. 3. Acquiring data from PV systems whose energy meters report to the green-certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third make use of equipment that some PV systems are expected to have for other reasons. Method 4 makes a fully automatic collection method possible. The GPRS systems described are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD).
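    Method 1, converting cumulative meter readings taken at arbitrary dates into monthly values, can be sketched by linear interpolation at month boundaries (an assumption; the report's exact procedure is not specified here):

```python
from datetime import date

def monthly_energy(readings):
    """Estimate monthly production from cumulative meter readings taken
    at arbitrary dates, by linearly interpolating the cumulative total at
    month boundaries and differencing. `readings` maps date -> kWh."""
    days = sorted(readings)

    def interp(d):
        t = d.toordinal()
        for a, b in zip(days, days[1:]):
            if a.toordinal() <= t <= b.toordinal():
                f = (t - a.toordinal()) / (b.toordinal() - a.toordinal())
                return readings[a] + f * (readings[b] - readings[a])
        raise ValueError("date outside reading range")

    out = {}
    first, last = days[0], days[-1]
    y, m = first.year, first.month
    while (y, m) < (last.year, last.month):          # walk month starts
        ny, nm = (y + 1, 1) if m == 12 else (y, m + 1)
        start, end = date(y, m, 1), date(ny, nm, 1)
        if start >= first and end <= last:           # fully covered months
            out[(y, m)] = interp(end) - interp(start)
        y, m = ny, nm
    return out

readings = {date(2006, 1, 1): 0.0,
            date(2006, 2, 15): 450.0,
            date(2006, 4, 1): 900.0}
per_month = monthly_energy(readings)   # keys (2006, 1) .. (2006, 3)
```

Linear interpolation is a rough model for PV (production is seasonal), but it is the simplest way to fill the monthly slots the database expects.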

  18. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition-assessment tool for automatically monitoring the meal purchases of all employees, although it covers only company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data-mining approach, examining whether the AutoMealRecord data could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo that has operated the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis of the system's data and extracted five major patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability of dietary preference to predict current body mass index (BMI) was assessed with multiple linear regression analyses. BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a measurement booth in the cafeteria, and negatively correlated with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). The regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy under leave-one-out cross-validation. This shows sufficient predictability of BMI from AutoMealRecord data. We conclude that the AutoMealRecord system merits further consideration as a health-care intervention tool.
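    The leave-one-out validation scheme can be sketched on synthetic data; the features and coefficients below are invented for illustration, not the AutoMealRecord variables:

```python
import numpy as np

def loo_accuracy(X, y, cutoff):
    """Leave-one-out check of a linear BMI model, mirroring the paper's
    validation: refit on n-1 samples, predict the held-out one, and score
    the 'would-be obese' classification (BMI >= cutoff)."""
    X1 = np.column_stack([np.ones(len(X)), X])     # add intercept column
    hits = 0
    for i in range(len(X1)):
        mask = np.arange(len(X1)) != i             # hold out sample i
        beta, *_ = np.linalg.lstsq(X1[mask], y[mask], rcond=None)
        pred = X1[i] @ beta
        hits += (pred >= cutoff) == (y[i] >= cutoff)
    return hits / len(X1)

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 3))                       # 3 dietary-style features
y = 22 + X @ np.array([1.5, -0.8, 0.6]) + rng.normal(scale=0.5, size=80)
acc = loo_accuracy(X, y, cutoff=23.0)              # high on clean data
```

On the real data the same scheme yielded 68.8% accuracy; the synthetic signal here is deliberately strong, so the sketch scores higher.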

  19. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    Science.gov (United States)

    Bowler, Matthew W.; Nurizzo, Didier; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine; Caserotto, Hugo; Delagenière, Solange; Dobias, Fabian; Flot, David; Giraud, Thierry; Guichard, Nicolas; Guijarro, Mattias; Lentini, Mario; Leonard, Gordon A.; McSweeney, Sean; Oskarsson, Marcus; Schmidt, Werner; Snigirev, Anatoli; von Stetten, David; Surr, John; Svensson, Olof; Theveneau, Pascal; Mueller-Dieckmann, Christoph

    2015-01-01

    MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined. PMID:26524320

  20. System design of the METC automatic data acquisition and control system

    Energy Technology Data Exchange (ETDEWEB)

    Goff, D. R.; Armstrong, D. L.

    1982-02-01

    A system of computer programs and hardware was developed by the Instrumentation Branch of the Morgantown Energy Technology Center (METC) to provide data acquisition and control features for research projects at the site. The Automatic Data Acquisition and Control System (ADACS) has the capability of servicing up to eight individual projects simultaneously, providing data acquisition, data feedback, and process control where needed. Several novel software features - including a data table driven program, extensive feedback in real time, free format English commands, and high reliability - were incorporated to provide these functions.

  1. Integrated system to automatize information collecting for the primary health care at home.

    Science.gov (United States)

    Oliveira, Edson N; Cainelli, Jean; Pinto, Maria Eugênia B; Cazella, Silvio C; Dahmer, Alessandra

    2013-01-01

    Data collected in a consistent manner are the basis for any decision making. This article presents a system that automates data collection by community-based health workers during their visits to the residences of users of the Brazilian Health Care System (Sistema Único de Saúde, SUS). The automated process reduces the possibility of mistakes in the transcription of visit information and makes the information readily available to the Ministry of Health. Furthermore, analysis of the information provided by the system can be useful in implementing health campaigns and in controlling outbreaks of epidemiological diseases.

  2. Analysis of space telescope data collection system

    Science.gov (United States)

    Ingles, F.; Schoggen, W. O.

    1980-01-01

    The effects of frame-synchronization loss were analyzed. A frame-sync loss causes loss of data for the frame in which it occurs (since one cannot know whether the preceding data were properly in sync), and the system continues to lose data during the search to reacquire frame sync. The reacquisition search mode utilizes multiple search procedures.
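    Reacquisition after a sync loss can be sketched as a search for a sync word that repeats at the frame length, so a chance match inside the data field is rejected; the sync pattern and frame size below are hypothetical, and the actual system's multiple-search procedures are not modeled:

```python
def find_frame_sync(bits, sync_word, frame_len):
    """Scan a bit stream for the sync pattern and confirm it repeats one
    frame later before declaring lock. Returns the offset of the first
    confirmed frame start, or -1 if none is found."""
    n = len(sync_word)
    for i in range(len(bits) - frame_len - n + 1):
        if (bits[i:i + n] == sync_word and
                bits[i + frame_len:i + frame_len + n] == sync_word):
            return i
    return -1

SYNC = "11101011"                          # hypothetical 8-bit sync marker
frame = SYNC + "0101" * 6                  # 32-bit frames: sync + data
stream = "0011" + frame + frame + frame    # misaligned received stream
offset = find_frame_sync(stream, SYNC, frame_len=len(frame))
```

All bits consumed before `offset` correspond to the data lost during the search, which is exactly the loss mechanism the abstract analyzes.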

  3. Designing and Building an Automatic Information Retrieval System for Handling the Arabic Data

    OpenAIRE

    2005-01-01

    This paper aims to design and build an automatic information retrieval system to handle Arabic data. It also presents a comparison between retrieval results using the vector space model under two different indexing methods: full-word indexing and root indexing. The proposed automatic information retrieval system was implemented and built using a traditional technique, the Vector Space Model (VSM), with cosine similarity as the measure. The output resu...

  4. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    Moraes M.F.D.

    1997-01-01

    We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving other purposes as well, is used here to record the circadian activity rhythm pattern of rats in an automated, computerized fashion using minimal-cost computer equipment (an IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) at which the activity occurred. As a result, the computer builds several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized as actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as the Fast Fourier Transform (FFT) or cosinor. To validate the method, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). The results provide a biological validation of the method, since it detected circadian activity rhythm patterns in the behavior of the rats.
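    The FFT analysis the abstract mentions can be sketched on synthetic hourly counts; the rhythm parameters below are invented for illustration:

```python
import numpy as np

def dominant_period_hours(hourly_counts):
    """Estimate the dominant rhythm period from hourly activity counts
    with an FFT, as suggested for actogram data (cosinor fitting is the
    other option the abstract mentions). Input should span several days."""
    x = np.asarray(hourly_counts, dtype=float)
    x = x - x.mean()                        # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0)  # cycles per hour
    k = np.argmax(spectrum[1:]) + 1         # skip the zero frequency
    return 1.0 / freqs[k]                   # dominant period in hours

hours = np.arange(24 * 8)                   # 8 simulated days
activity = 10 + 8 * np.sin(2 * np.pi * hours / 24.0)   # clean 24 h rhythm
period = dominant_period_hours(activity)
```

For real event lists, the per-hour counts would first be built by binning the recorded timestamps; a circadian rhythm then shows up as a spectral peak near 24 h.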

  5. Automatic Identification System (AIS) Collection and Reach-back System: System Description

    Science.gov (United States)

    2014-08-20

    Excerpt from the system description: the Power Supply Module (PSM) is a WinSystems Inc. PCM-DC/DC unit on the 16-bit PC/104 bus (Fig. 14); the RPC and BPC use the same model PSM. The GPS receiver module is shown in Fig. 15. The BPC's GPS receiver is the same as the RPC's (described in Section 3.2.3.1), and its power supply module is described in the corresponding section of the report.

  6. Automatic testing system design and data analysis of permafrost temperature in Qinghai-Tibet Railway

    Institute of Scientific and Technical Information of China (English)

    尚迎春; 齐红元

    2008-01-01

    Permafrost temperature affects the safety of the Qinghai-Tibet Railway, motivating an on-line testing system. Drawing on permafrost studies in China and worldwide, an automatic permafrost-temperature testing system consisting of a master computer and several slave computers was designed. High-precision thermistors serve as temperature sensors; the depth and spacing of the testing sections were designed and positioned; the slave computers measure, store and transmit permafrost temperature data on schedule; and the master computer receives, processes and analyzes the collected data. The change of permafrost temperature can thus be described and analyzed, providing information for permafrost railway engineering design. Taking temperature testing in one section of the Qinghai-Tibet Railway as an example, the collected permafrost-temperature data were analyzed, the behavior of the permafrost under the railway was characterized, and a BP (back-propagation) model was built to predict permafrost characteristics. The testing system provides timely information about permafrost change to support safe operation of the Qinghai-Tibet Railway.
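    Thermistor readings are typically converted to temperature with the beta (B-parameter) equation; the sketch below uses generic datasheet values (`r0`, `beta` and the resistances are assumptions, not the system's calibration):

```python
import math

def thermistor_temp_c(r_ohm, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance to temperature (deg C) with
    the beta equation 1/T = 1/T0 + ln(R/R0)/B, a standard approximation
    for thermistor sensors like those the testing system uses."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohm / r0) / beta
    return 1.0 / inv_t - 273.15

# At the reference resistance the equation returns the reference temperature:
t_ref = thermistor_temp_c(10_000.0)
# A higher resistance corresponds to a sub-zero, permafrost-range reading:
t_cold = thermistor_temp_c(35_000.0)
```

Each slave computer would apply a conversion of this kind before storing and transmitting the temperature values.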

  7. 10 CFR 95.49 - Security of automatic data processing (ADP) systems.

    Science.gov (United States)

    2010-01-01

Title 10 (Energy), 2010 edition: Security of automatic data processing (ADP) systems. Section 95.49; Energy; Nuclear Regulatory Commission (continued); Facility Security Clearance and Safeguarding of National Security Information and Restricted Data; Control of Information; § 95.49 Security…

  8. CARPS: An integrated proposal and data collection system

    CERN Document Server

    Brister, K

    2002-01-01

    Modern scripting languages and database tools combined provide a new framework for developing beam-line control and data management software. The CARPS system supports data collection by storing low level beam-line control commands in a database and playing these commands back to collect data sets. This system is combined with proposal and data management tools for support of both local and remote users.

  9. Protokol Interchangeable Data pada VMeS (Vessel Messaging System dan AIS (Automatic Identification System

    Directory of Open Access Journals (Sweden)

    Farid Andhika

    2012-09-01

Full Text Available VMeS (Vessel Messaging System) is radio-based communication for exchanging messages between VMeS ship terminals at sea and a VMeS gateway on shore. Vessel monitoring systems generally use AIS (Automatic Identification System), which is deployed in all ports to monitor vessel conditions and prevent collisions between ships. In this study, a data format suitable for VMeS is designed so that it can be interchanged with AIS and read by an AIS receiver, aimed at vessels under 30 GT (gross tonnage). The VMeS data format is designed in three types, position data, vessel information data and short message data, which are interchanged with AIS message types 1, 4 and 8. Performance testing of the interchange system shows that as the message transmission period increases, total delay increases but packet loss decreases. When sending messages every 5 seconds at speeds of 0-40 km/h, 96.67% of the data were received correctly. Data suffer packet loss when the received power level falls below -112 dBm. The longest distance reachable by the modem while moving was the ITS Informatics building, 530 meters from Laboratory B406, at a received power level of -110 dBm.

  10. Mobile In Vivo Infrared Data Collection and Diagnoses Comparison System

    Science.gov (United States)

    Mintz, Frederick W. (Inventor); Moynihan, Philip I. (Inventor); Gunapala, Sarath D. (Inventor)

    2013-01-01

    Described is a mobile in vivo infrared brain scan and analysis system. The system includes a data collection subsystem and a data analysis subsystem. The data collection subsystem is a helmet with a plurality of infrared (IR) thermometer probes. Each of the IR thermometer probes includes an IR photodetector capable of detecting IR radiation generated by evoked potentials within a user's skull. The helmet is formed to collect brain data that is reflective of firing neurons in a mobile subject and transmit the brain data to the data analysis subsystem. The data analysis subsystem is configured to generate and display a three-dimensional image that depicts a location of the firing neurons. The data analysis subsystem is also configured to compare the brain data against a library of brain data to detect an anomaly in the brain data, and notify a user of any detected anomaly in the brain data.

  11. Designing and Building an Automatic Information Retrieval System for Handling the Arabic Data

    Directory of Open Access Journals (Sweden)

    Ibrahiem M.M. El Emary

    2005-01-01

Full Text Available This paper aimed to design and build an automatic information retrieval system to handle Arabic data. It also presents a comparison of retrieval results using the vector space model under two different indexing methods: full-word indexing and root indexing. The proposed automatic information retrieval system was implemented using a traditional technique, the Vector Space Model (VSM) with the cosine similarity measure. The results indicate that root indexing improves retrieval performance over full-word indexing on Arabic documents; furthermore, it reduces the size of the stored data and minimizes system processing time.
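The cosine-similarity ranking at the heart of the VSM described above can be sketched as follows; the transliterated "root" tokens are purely hypothetical, and this is a minimal illustration rather than the authors' system:

```python
import math
from collections import Counter

def tf_vector(tokens):
    """Sparse term-frequency vector for one document."""
    return Counter(tokens)

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical root-indexed tokens: morphological variants collapse to one
# root, so a query matches every surface form sharing that root.
doc   = tf_vector(["ktb", "drs", "ktb"])
query = tf_vector(["ktb"])
score = cosine(doc, query)
```

Root indexing shrinks the vocabulary, so vectors like these have fewer dimensions than under full-word indexing, which is consistent with the reported storage and speed gains.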

  12. Automatic Fare Collection System and Introduction of Contactless IC Card, “Suica"

    Science.gov (United States)

    Uriuhara, Shinsuke

“Suica" is the nickname of our contactless IC card: Super Urban Intelligent CArd. There are two types of Suica: 4.5 million Suica commuter passes with the stored-fare (SF) function and 2 million Suica IO (SF) cards have already been procured for the ticket-gate system. Suica can be used without taking the card out of its case, and the necessary fare is deducted from the card automatically at the gate. Suica can be used repeatedly thanks to its rewriting and reloading functions, and a Suica pass can be reissued with its remaining stored fare if it is lost. There were 5.84 million Suica holders (about 2.94 million Suica season-pass holders and 2.9 million Suica IO card holders) as of 24 March 2003. The average reloading amount is about 3,000 yen, and there are almost 3 million transactions (not including the use of Suica passes inside the pass zone).

  13. Volunteer-based distributed traffic data collection system

    DEFF Research Database (Denmark)

    Balachandran, Katheepan; Broberg, Jacob Honoré; Revsbech, Kasper;

    2010-01-01

An architecture for a traffic data collection system is proposed which can collect data without access to a backbone network. Contrary to other monitoring systems, it relies on volunteers to install a program on their own computers that captures incoming and outgoing packets, groups them into flows and sends the flow data to a central server. The data can be used for studying and characterising internet traffic and for testing traffic models by regenerating real traffic. The architecture is designed to make efficient, light use of resources on both the client and server sides. Worst…
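The flow-grouping step described above is conventionally keyed on the 5-tuple of addresses, ports and protocol; a minimal sketch, with hypothetical packet-record field names rather than the system's actual format, might look like:

```python
from collections import defaultdict

def flow_key(pkt):
    """5-tuple flow key: source/destination address, ports, protocol."""
    return (pkt["src"], pkt["dst"], pkt["sport"], pkt["dport"], pkt["proto"])

def group_flows(packets):
    """Aggregate per-packet records into per-flow packet and byte counts."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for pkt in packets:
        f = flows[flow_key(pkt)]
        f["packets"] += 1
        f["bytes"] += pkt["length"]
    return dict(flows)

# Hypothetical captured packets (field names are illustrative).
capture = [
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 51000, "dport": 80,
     "proto": "TCP", "length": 60},
    {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 51000, "dport": 80,
     "proto": "TCP", "length": 1500},
    {"src": "10.0.0.2", "dst": "10.0.0.1", "sport": 80, "dport": 51000,
     "proto": "TCP", "length": 40},
]
flows = group_flows(capture)   # two unidirectional flows
```

Sending only these per-flow aggregates, instead of raw packets, is one way such a client can keep its resource usage light.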

  14. Data Collection via Synthetic Aperture Radiometry towards Global System

    Directory of Open Access Journals (Sweden)

Ali A. J. Al-Sabbagh

    2015-10-01

Full Text Available It is now widely accepted that remote sensing is an efficient approach to managing large volumes of data. In this paper, we present a future view of big-data collection by synthetic aperture radiometry, a passive microwave remote sensing technique, towards building a global monitoring system. Since raw collected data may have little value in themselves, they must be analyzed to extract valuable and beneficial information. Data collected by synthetic aperture radiometry provide high-resolution earth observation and pose data-intensive problems; synthetic aperture radar, meanwhile, is able to work in several bands: X, C, S, L and P. An important role of synthetic aperture radiometry is collecting data from areas with inadequate network infrastructure, where ground network facilities have been destroyed. The future concern is to establish a new global data management system, supported by international teams working to develop technology under international regulations. There is no doubt that existing techniques are too limited to solve big-data problems entirely, and much work remains towards improving 2-D and 3-D SAR to obtain better resolution.

  15. Calibration of Frequency Data Collection Systems Using Shortwave Radio Signals

    Science.gov (United States)

    Estler, Ron

    2000-09-01

    The atomic-clock-derived audio tones broadcast on the National Institute of Standards and Technology (NIST) shortwave station WWV are used to calibrate computer frequency data collection systems via Fast Fourier Transforms (FFT). Once calibrated, the data collection system can be used to accurately determine the audio signals used in several instructional physical chemistry laboratory experiments. This method can be applied to virtually any hardware-software configuration that allows adjustment of the apparent time scale (digitizing rate) of the recorded audio file.
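The calibration idea, locating the known reference tone in a transform of the recorded audio and rescaling the apparent digitizing rate, can be sketched as below. The naive DFT, the 600 Hz WWV tone and the mis-calibrated rates are illustrative assumptions, not the article's code:

```python
import cmath
import math

def dft_peak_hz(samples, rate_hz):
    """Frequency of the largest-magnitude DFT bin (naive O(N^2) DFT)."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * rate_hz / n

# Assumed scenario: the sound card's true rate is 8000 Hz, but the software
# believes the nominal 8100 Hz, so recorded tones appear frequency-shifted.
true_rate, nominal_rate = 8000.0, 8100.0
ref_hz = 600.0  # WWV broadcasts a 600 Hz audio tone
n = 1024
samples = [math.sin(2 * math.pi * ref_hz * t / true_rate) for t in range(n)]

apparent = dft_peak_hz(samples, nominal_rate)      # shifted by the rate error
corrected_rate = nominal_rate * ref_hz / apparent  # calibrated digitizing rate
```

The residual error here is set by the DFT bin width; a longer record (larger n) or peak interpolation would tighten the calibration further.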

  16. Automatic decision support system based on SAR data for oil spill detection

    Science.gov (United States)

    Mera, David; Cotos, José M.; Varela-Pet, José; Rodríguez, Pablo G.; Caro, Andrés

    2014-11-01

Global trade is mainly supported by maritime transport, which generates serious pollution problems; effective surveillance and intervention means are therefore necessary to ensure a proper response to environmental emergencies. Synthetic Aperture Radar (SAR) is established as a useful tool for detecting hydrocarbon spills on the ocean surface, and several decision support systems have been based on this technology. This paper presents an automatic oil spill detection system based on SAR data, developed from confirmed spills and adapted to an important international shipping route off the Galician coast (northwest Iberian Peninsula). The system is supported by an adaptive segmentation process based on wind data as well as a shape-oriented characterization algorithm. Two classifiers were developed and compared; image testing showed up to 95.1% candidate-labeling accuracy. Shared-memory parallel programming techniques were used in the algorithms, reducing system processing time by more than 25%.

  17. Dynamic Data Driven Applications Systems (DDDAS) modeling for automatic target recognition

    Science.gov (United States)

    Blasch, Erik; Seetharaman, Guna; Darema, Frederica

    2013-05-01

The Dynamic Data Driven Applications Systems (DDDAS) concept uses applications modeling, mathematical algorithms, and measurement systems to work with dynamic systems. A dynamic system such as Automatic Target Recognition (ATR) is subject to sensor, target, and environment variations over space and time. We use the DDDAS concept to develop an ATR methodology for multiscale, multimodal analysis that seeks to integrate sensing, processing, and exploitation. In the analysis, we use computer vision techniques to explore the capabilities and analogies that DDDAS shares with information fusion. The key attribute of coordination is the use of sensor management as a data-driven technique to improve performance. In addition, DDDAS supports the need for modeling, in which uncertainty and variations are incorporated into the dynamic models for improved performance. As an example, we use a Wide-Area Motion Imagery (WAMI) application to draw parallels and contrasts between ATR and DDDAS systems that warrant an integrated perspective. This elementary work aims to trigger deeper, more insightful research towards exploiting sparsely sampled, piecewise-dense WAMI measurements, an application where the challenges of big data, with regard to mathematical fusion relationships and high-performance computation, remain significant and will persist. Dynamic data-driven adaptive computations are required to handle the exponentially increasing data volume for advanced information fusion solutions such as simultaneous target tracking and ATR.

  18. Automatic layout of ventilation systems by means of electronic data processing

    Energy Technology Data Exchange (ETDEWEB)

    Altena, H.; Priess, H.; Fries, E.; Hoffmann, G.

    1982-12-09

A working group developed a method for the automatic layout of ventilation systems by means of electronic data processing. The purpose was to increase the information content of the ventilation plan and to obtain a useful tool for ventilation planning while reducing the effort required to elaborate such plans. A program system was developed with which ventilation plans can be plotted in accordance with the regulations set by the mining authorities. The program system was first applied at the Osterfeld mine; the resulting plan is clearly organized, accurate, and easy to understand. This positive experience suggests that computer-aided plans should be more widely applied, a view the mining authorities support.

  19. Web-based electronic data collection system to support electrochemotherapy clinical trial.

    Science.gov (United States)

    Pavlović, Ivan; Miklavcic, Damijan

    2007-03-01

Many branches of the healthcare industry are influenced by information and communication technology (ICT), and clinical trials are no exception. Despite this, more than 75% of clinical trial data are collected on paper records. Recent ICT advances, such as the broad acceptance of Internet technology and rapidly improving electronic data collection (EDC) tools, may soon reduce this percentage of "paper" supported clinical trials. In this paper, we present our Web-based EDC system, designed to support a small-scale, research-oriented clinical trial for establishing standard operating procedures (SOP) for electrochemotherapy with a new medical device named Cliniporator. The definition of the SOP can only be based on a comprehensive analysis of the collected data and trial results; it is therefore necessary to record treatment efficiency and, in this respect, to carefully follow and collect treatment parameters. We thus established a central database and a Web application for filling the database with data submitted by users from distant medical centers across Europe. We also enabled transmission of data stored on local Cliniporator medical devices to the central database, as well as submission of tumor images and marking of tumor nodules on an interactive human map developed in Macromedia Flash. We provided users with dynamically generated basic statistics and performed statistical data analysis several times during the data collection process. To assure high data quality, we included several mechanisms: automatic data validation, digital signatures, a form-completeness notification system, e-mail alerts for completed forms, and "check tables." After 13 months of use, we performed a simple usability evaluation of the system by asking users to complete a questionnaire, and here we present the results. With this paper, we try to share our experience and encourage others to exploit Internet…

  20. Automatic Classification of the Vestibulo-Ocular Reflex Nystagmus: Integration of Data Clustering and System Identification.

    Science.gov (United States)

    Ranjbaran, Mina; Smith, Heather L H; Galiana, Henrietta L

    2016-04-01

    The vestibulo-ocular reflex (VOR) plays an important role in our daily activities by enabling us to fixate on objects during head movements. Modeling and identification of the VOR improves our insight into the system behavior and improves diagnosis of various disorders. However, the switching nature of eye movements (nystagmus), including the VOR, makes dynamic analysis challenging. The first step in such analysis is to segment data into its subsystem responses (here slow and fast segment intervals). Misclassification of segments results in biased analysis of the system of interest. Here, we develop a novel three-step algorithm to classify the VOR data into slow and fast intervals automatically. The proposed algorithm is initialized using a K-means clustering method. The initial classification is then refined using system identification approaches and prediction error statistics. The performance of the algorithm is evaluated on simulated and experimental data. It is shown that the new algorithm performance is much improved over the previous methods, in terms of higher specificity.
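The K-means initialization step on its own can be illustrated with a minimal 1-D two-cluster sketch over eye-speed samples; the data values are hypothetical, and the algorithm's subsequent identification-based refinement is omitted:

```python
def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means: returns (low centre, high centre, labels)."""
    lo, hi = min(values), max(values)   # crude initial centres
    labels = [0] * len(values)
    for _ in range(iters):
        # Assign each sample to the nearer centre, then recompute centres.
        labels = [0 if abs(v - lo) <= abs(v - hi) else 1 for v in values]
        slow = [v for v, l in zip(values, labels) if l == 0]
        fast = [v for v, l in zip(values, labels) if l == 1]
        if slow:
            lo = sum(slow) / len(slow)
        if fast:
            hi = sum(fast) / len(fast)
    return lo, hi, labels

# Hypothetical eye-speed samples (deg/s): slow phases near 10, fast
# (saccadic) phases near 150.
speed = [8, 11, 9, 12, 148, 155, 10, 9, 152, 11]
lo, hi, labels = kmeans_1d(speed)   # label 1 marks candidate fast segments
```

In the paper's pipeline this clustering only seeds the classification, which is then refined with system identification and prediction-error statistics.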

  1. Development of teacher schedule automatic collection system based on Visual Basic%基于Visual Basic的教师课表自动汇总系统开发

    Institute of Scientific and Technical Information of China (English)

    刘信香

    2012-01-01

In this paper, according to practical needs, a teacher-schedule automatic collection system is developed based on Visual Basic. The system automatically reads schedule data and, after collecting the data, automatically generates a combined master schedule file. It replaces tedious, repetitive manual work and is efficient, convenient and fast.

  2. 输变电设备状态监测数据自动汇集技术研究与应用%Automatic Collection Technology for State Monitoring Data of Power Transmission and Transformation Equipment and Its Application

    Institute of Scientific and Technical Information of China (English)

    杨强; 谢善益; 马金宝; 范颖

    2014-01-01

In order to establish a unified, standardized data platform for an electric power equipment state monitoring and diagnosis system, a method based on a unified information model for automatically collecting state monitoring data from various professional systems was studied. First, a solution is proposed for inconsistent object coding and object naming across the data source systems, together with a method for verifying the source-system models. The process steps of automatic data collection are described, and methods are introduced and analyzed for automatically joining the information models of the production management system and the dispatching automation system, constructing the on-line monitoring model, and collecting real-time and alarm data. The automatic collection technology has been applied at Guangdong Power Grid Co., Ltd., effectively realizing automatic collection, comprehensive integration and unified application of the mass data of the production management, energy management and on-line monitoring systems.

  3. Automatic multi-modal intelligent seizure acquisition (MISA) system for detection of motor seizures from electromyographic data and motion data

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sándor; Wolf, Peter

    2012-01-01

The objective is to develop a non-invasive automatic method for detection of epileptic seizures with motor manifestations. Ten healthy subjects who simulated seizures and one patient participated in the study. Surface electromyography (sEMG) and motion sensor features were extracted as energy… of the seizure from the patient showed that the simulated seizures were visually similar to the epileptic one. The multi-modal intelligent seizure acquisition (MISA) system showed high sensitivity, short detection latency and a low false detection rate, and the results showed the superiority of the multi-modal detection system compared to the uni-modal one. The presented system has promising potential for seizure detection based on multi-modal data…

  4. Automatically Collecting and Monitoring Japanese Weblogs

    Science.gov (United States)

    Nanno, Tomoyuki; Suzuki, Yasuhiro; Fujiki, Toshiaki; Okumura, Manabu

Weblogs (blogs) are now thought of as a potentially useful information source. Although the definition of a blog is not precisely fixed, it is generally understood to be a personal web page authored by a single individual and made up of a sequence of dated entries of the author's thoughts, arranged chronologically. In Japan, since long before blog software became available, people have written `diaries' on the web. These web diaries are quite similar to blogs in content, and people still write them without any blog software. As we will show, hand-edited blogs are quite numerous in Japan, though most people now think of blogs as pages published using one of the variants of public-domain blog software. It is therefore quite difficult to exhaustively collect Japanese blogs, i.e., both blogs made with blog software and web diaries written as normal web pages. With this as the motivation for our work, we present a system that tries to automatically collect and monitor Japanese blog collections, including not only blogs made with blog software but also those written as normal web pages. Our approach is based on extraction of date expressions and analysis of HTML documents, to avoid depending on specific blog software, RSS, or the ping server.

  5. Cyber security and data collection approaches for smartphone sensor systems

    Science.gov (United States)

    Turner, Hamilton; White, Jules

    2012-06-01

In recent years the ubiquity and resources provided by smartphone devices have encouraged scientists to explore using these devices as remote sensing nodes. In addition, the United States Department of Defense has stated a mission of increasing persistent intelligence, surveillance, and reconnaissance capabilities of U.S. units. This paper presents a method of enabling large-scale, long-term smartphone-powered data collection. Key solutions discussed include the ability to directly allow domain experts to define and refine smartphone applications for data collection, technical advancements that allow rapid dissemination of a smartphone data collection application, and an algorithm for preserving the locational privacy of participating users.

  6. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    Science.gov (United States)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station, and should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. Network construction began in 2012, and the first four stations started operating at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013, the network detected 214 events inside a predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events, and 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same period, eight earthquakes in the magnitude range 0.1-1.0 occurred within the area, of which seven were detected automatically. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.
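Automatic seismic event detection of this kind is commonly built on a short-term-average / long-term-average (STA/LTA) trigger. The sketch below shows that standard technique on a synthetic trace; it is an illustration of the general approach, not necessarily the detector used by the OBF network:

```python
def sta_lta(signal, sta_n, lta_n):
    """Short-term / long-term average ratio per sample (0 where undefined)."""
    out = [0.0] * len(signal)
    for i in range(lta_n, len(signal)):
        sta = sum(abs(x) for x in signal[i - sta_n:i]) / sta_n
        lta = sum(abs(x) for x in signal[i - lta_n:i]) / lta_n
        out[i] = sta / lta if lta > 0 else 0.0
    return out

# Synthetic trace: background noise amplitude 0.1, event onset at sample 80.
trace = [0.1] * 80 + [1.0] * 40
ratio = sta_lta(trace, sta_n=5, lta_n=40)

trigger_on = 4.0  # declare a detection where the ratio exceeds this threshold
onsets = [i for i, r in enumerate(ratio) if r > trigger_on]
```

A real detector would add a de-trigger threshold and coincidence logic across stations before declaring an event, which is how spurious single-station triggers are filtered out.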

  7. Collection and evaluation of salt mixing data with the real time data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, S.; Chiu, C.; Todreas, N.E.

    1977-09-01

    A minicomputer based real time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady state and transient monitoring and recording of up to 512 individual electrical resistance probes. Extensive real time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system, and includes circuit diagrams of all custom built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  8. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two data sets containing daily mean gamma dose rates in Germany reported by the national automatic monitoring network (IMIS)… In the second data set, an accidental release of radioactivity into the environment was simulated in the south-western corner of the monitored area. The approach has a tendency to smooth the actual data values and therefore underestimates extreme values, as seen in the second data set. However, it is capable…

  9. JAPS: an automatic parallelizing system based on JAVA

    Institute of Scientific and Technical Information of China (English)

    杜建成; 陈道蓄; 谢立

    1999-01-01

JAPS is an automatic parallelizing system based on JAVA running on NOW. It implements the automatic process from dependence analysis to parallel execution. The current version of JAPS can exploit functional parallelism; detection of data parallelism will be incorporated in the new version, which is underway. The framework and key techniques of JAPS are presented. Specific topics discussed include task partitioning, summary information collection, data dependence analysis, pre-scheduling and dynamic scheduling.

  10. Resource allocation: sequential data collection for reliability analysis involving systems and component level data

    Energy Technology Data Exchange (ETDEWEB)

Anderson-Cook, Christine M [Los Alamos National Laboratory]

    2010-01-01

In analyzing the reliability of complex systems, several types of data, from full-system tests to component-level tests, are commonly available and used. After a preliminary analysis, additional resources may be available to collect new data. The goal of resource allocation is to identify the best new data to collect to maximally improve the prediction of system reliability. While several definitions of 'maximally improve' are possible, we focus on reducing the width of the uncertainty interval for the prediction of system reliability at user-specified age(s). In this paper, we present an algorithm that estimates the anticipated improvement to the analysis from the addition of new data, based on the current understanding of all statistical model parameters. This quantitative assessment of the anticipated improvement can help justify the benefits of collecting new data. Additionally, by comparing different potential allocations, it is possible to determine what new data should be collected to best improve our understanding of the response. The optimization takes into account the relative cost of different data types and can be based on flexible allocation options or subject to logistical constraints.

  11. Automatic Tagging and Geotagging in Video Collections and Communities

    NARCIS (Netherlands)

    Larson, Martha; Soleymani, Mohammad; Serdyukov, Pavel; Rudinac, Stevan; Wartena, Christian; Murdock, Vanessa; Friedland, Gerald; Ordelman, Roeland; Jones, Gareth J.F.

    2011-01-01

Automatically generated tags and geotags hold great promise to improve access to video collections and online communities. We overview three tasks offered in the MediaEval 2010 benchmarking initiative, describing for each its use scenario, definition and the data set released. For each task, a refer…

  12. A Customizable and Expandable Electroencephalography (EEG) Data Collection System

    Science.gov (United States)

    2016-03-01

Subject terms: EEG, data acquisition, high-rate data update, wireless transmission, embedded system. Approved for public release; distribution is unlimited. Figure list includes: LabVIEW front panel with individual channel setting tab; Fig. 3, LabVIEW front panel with register configuration tab; Fig. 4…

  13. Archival Automatic Identification System (AIS) Data for Navigation Project Performance Evaluation

    Science.gov (United States)

    2015-08-01

…and available to USACE practitioners via the MOU mentioned above, provides several of these parameters at a cost significantly lower than… Performance information can be screened for a variety of embedded factors in the context of navigation features, such as inbound or outbound vessels. Vessel… collection, yet AIS data provides triple the data volume for this single transit, with no explicit cost incurred. Each historical data request from…

  14. 15 CFR 911.4 - Use of the NOAA Data Collection Systems.

    Science.gov (United States)

    2010-01-01

Title 15 (Commerce and Foreign Trade), 2010 edition: Use of the NOAA Data Collection… Policies and Procedures Concerning Use of the NOAA Space-Based Data Collection Systems; § 911.4 Use of the NOAA Data Collection Systems. (a) Use of the NOAA DCS will only be authorized in accordance with…

  15. The Italian information system on zoonoses data collection.

    Science.gov (United States)

    Colangeli, P; Iannetti, S; Ruocco, L; Forlizzi, L; Cioci, D; Calistri, P

    2013-03-01

In the framework of the international obligations subscribed to by the Italian government, the Italian Ministry of Health must provide the European Union (European Commission and European Food Safety Authority, EFSA) with data and information related to the reporting and spread of zoonoses and to the monitoring and control activities put in place. In 2008, the Italian Ministry of Health commissioned the Istituto G. Caporale (ICT) to implement an information system able to provide information and data on the monitoring and control of zoonoses in the national territory, in accordance with national and Community legislation. The system is part of the e-Government process that involves all public administrations of the EU and refers to the use of information and communication technologies for the digital processing of documents, in order to obtain simplification and interoperability of administrative procedures through the Internet, as defined in the strategic lines published by the National Centre for Information Systems in Public Administration (DigitPA) for 2009-2011.

  16. OPC model data collection for 45-nm technology node using automatic CD-SEM offline recipe creation

    Science.gov (United States)

    Fischer, Daniel; Talbi, Mohamed; Wei, Alex; Menadeva, Ovadya; Cornell, Roger

    2007-03-01

Optical and Process Correction at the 45nm node requires an ever higher level of characterization. The greater complexity drives a need for automation of the metrology process, allowing more efficient, accurate and effective use of engineering resources and metrology tool time in the fab, and helping to satisfy what seems an insatiable appetite for data by lithographers and modelers charged with developing 45nm and 32nm processes. The scope of the work referenced here is 45nm design-cycle "full-loop automation", starting with the gds-formatted target design layout and ending with the necessary feedback of one- and two-dimensional printed-wafer metrology. In this paper the authors consider the key elements of software, algorithmic framework and Critical Dimension Scanning Electron Microscope (CD-SEM) functionality necessary to automate recipe creation. We evaluate specific problems with the methodology of the former art, "on-tool, on-wafer" recipe construction, and discuss how design-based recipe generation improves the overall metrology process. Individual target-by-target construction, a one-pattern-recognition-template-fits-all approach, blind navigation to the desired measurement feature, lengthy on-tool recipe-construction sessions, and a limited ability to determine measurement quality in the resultant data set are each discussed with respect to how the state-of-the-art Design Based Metrology (DBM) approach is implemented. The offline-created recipes have shown pattern recognition success rates of up to 100% and measurement success rates of up to 93% for line/space as well as 2D minimum/maximum measurements, without manual assists during measurement.

  17. Creating a portable data-collection system with Microsoft Embedded Visual Tools for the Pocket PC.

    Science.gov (United States)

    Dixon, Mark R

    2003-01-01

    This paper presents an overview and an illustrative example of creating a portable data-collection system using Microsoft Embedded Visual Tools for the Pocket PC. A description of the Visual Basic programming language is given, along with examples of computer code procedures for developing data-collection software. Program specifications, strategies for customizing the collection system, and troubleshooting tips are also provided.
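
    As a rough illustration of the kind of data-collection logic the paper implements in Visual Basic, the following Python sketch (all names here are invented for illustration, not taken from the paper) records timestamped behavioral events and exports them as CSV:

```python
import csv
import io
import time

class EventRecorder:
    """Minimal behavioral data collector: log timestamped events, export CSV."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._start = clock()
        self.events = []  # list of (seconds_since_start, event_code)

    def record(self, event_code):
        self.events.append((round(self._clock() - self._start, 3), event_code))

    def export_csv(self, stream):
        writer = csv.writer(stream)
        writer.writerow(["t_seconds", "event"])
        writer.writerows(self.events)

# Simulated session using a fake clock so the example is deterministic.
ticks = iter([0.0, 1.5, 3.0])
recorder = EventRecorder(clock=lambda: next(ticks))
recorder.record("target_response")   # logged at t = 1.5 s
recorder.record("reinforcer")        # logged at t = 3.0 s

buf = io.StringIO()
recorder.export_csv(buf)
print(buf.getvalue().strip())
```

    On a handheld device, `record` would be wired to on-screen buttons and the CSV written to storage rather than an in-memory buffer.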

  18. Scheme of the Saik computer system of complex automatic interpretation of well logging data

    Energy Technology Data Exchange (ETDEWEB)

    Frydecki, J.

    1975-01-01

    The basis of the first Polish interpretation system is the method of "autocalibration." The input data for the Saik system are a noncalibrated gamma log, neutron log, resistivity log, and caliper log. The input data are autocalibrated by means of correlation cross-plotting of gamma versus neutron, gamma versus resistivity, and resistivity versus neutron logs. The results of processing are tables and curves reflecting clay content, porosity and hydrocarbon saturation.
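
    One ingredient of cross-plot-based calibration can be sketched as fitting a least-squares line to an uncalibrated log plotted against a calibrated reference, then rescaling the raw values. This is a hedged illustration with synthetic data, not the actual Saik algorithm:

```python
def fit_line(x, y):
    """Ordinary least-squares slope/intercept for a cross-plot y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Uncalibrated gamma counts cross-plotted against calibrated API units
# (synthetic data generated from the relation API = 0.02 * counts + 5).
counts = [1000, 1500, 2000, 2500, 3000]
api    = [25.0, 35.0, 45.0, 55.0, 65.0]

slope, intercept = fit_line(counts, api)
calibrated = [slope * c + intercept for c in counts]
print(slope, intercept)  # ≈ 0.02 and 5.0
```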

  19. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive manual …

  20. Comparison of automatic control systems

    Science.gov (United States)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such as analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  1. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  2. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. Curating Virtual Data Collections

    Science.gov (United States)

    Lynnes, Chris; Leon, Amanda; Ramapriyan, Hampapuram; Tsontos, Vardis; Shie, Chung-Lin; Liu, Zhong

    2015-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) contains a rich set of datasets and related services throughout its many elements. As a result, locating all the EOSDIS data and related resources relevant to a particular science theme can be daunting. This is largely because the organizing principle of EOSDIS data is driven more by the way they are produced than by the expected end use. Virtual collections oriented around science themes can overcome this by presenting collections of data and related resources that are organized around the user's interest, not around the way the data were produced. Virtual collections consist of annotated web addresses (URLs) that point to data and related resource addresses, thus avoiding the need to copy all of the relevant data to a single place. These URL addresses can be consumed by a variety of clients, ranging from basic URL downloaders (wget, curl) and web browsers to sophisticated data analysis programs such as the Integrated Data Viewer.

  4. 15 CFR 911.7 - Continuation of the NOAA Data Collection Systems.

    Science.gov (United States)

    2010-01-01

    ... REGULATIONS POLICIES AND PROCEDURES CONCERNING USE OF THE NOAA SPACE-BASED DATA COLLECTION SYSTEMS § 911.7 Continuation of the NOAA Data Collection Systems. (a) NOAA expects to continue to operate DCS on its...

  5. Real time Aanderaa current meter data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.

    ... in the laboratory. In this paper a method is described to read real-time current meter data and display/print/store it on cartridge. For this, the binary-coded electrical signal available at the top end plate of the current meter is connected by underwater cable...

  6. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-02-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent …

  7. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-06-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent …

  8. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Science.gov (United States)

    Boden, T. A.; Krassovski, M.; Yang, B.

    2013-06-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent network database.
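
    A toy illustration of the kind of range-based quality-control check described above; the variables, limits, and missing-value code below are invented for illustration, not CDIAC's actual thresholds:

```python
# Toy quality-control pass in the spirit of range checks on flux-network data:
# flag any value that is missing or outside plausible physical limits.
LIMITS = {
    "TA": (-50.0, 50.0),    # air temperature, deg C (illustrative limits)
    "NEE": (-50.0, 50.0),   # net ecosystem exchange, umol m-2 s-1
    "RH": (0.0, 100.0),     # relative humidity, %
}
MISSING = -9999.0

def qc_flags(variable, values):
    lo, hi = LIMITS[variable]
    flags = []
    for v in values:
        if v == MISSING:
            flags.append("missing")
        elif lo <= v <= hi:
            flags.append("ok")
        else:
            flags.append("out_of_range")
    return flags

print(qc_flags("TA", [12.3, -9999.0, 72.0]))  # ['ok', 'missing', 'out_of_range']
```

    Real networks layer further checks on top of this (spike detection, cross-variable consistency), but the flag-per-value pattern is the same.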

  9. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and up-to-date river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, with the infrastructure to store (up to 40 Tb between results and intermediate files) and process them using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) was also important, as was human resources management. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
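
    The hydrological criterion mentioned above (flow accumulation) can be sketched with a minimal D8 algorithm on a toy elevation grid. This is an illustrative simplification, not the IGN-ES production code:

```python
# Minimal D8 flow-accumulation sketch: each cell drains to its steepest-descent
# neighbour; accumulation counts the cell itself plus all upstream cells; a
# threshold on accumulation then picks out the channel (river) network.

def d8_accumulation(dem):
    rows, cols = len(dem), len(dem[0])
    cells = [(dem[r][c], r, c) for r in range(rows) for c in range(cols)]
    acc = [[1] * cols for _ in range(rows)]  # every cell contributes itself
    # Process from highest to lowest so donors are finished before receivers.
    for z, r, c in sorted(cells, reverse=True):
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = z - dem[rr][cc]
                    if drop > best:
                        best, target = drop, (rr, cc)
        if target:  # flat cells and pits drain nowhere in this toy version
            acc[target[0]][target[1]] += acc[r][c]
    return acc

dem = [
    [5.0, 4.0, 3.0],
    [5.0, 3.0, 2.0],
    [5.0, 4.0, 1.0],
]
acc = d8_accumulation(dem)
channel = [[a >= 3 for a in row] for row in acc]  # threshold = 3 cells
print(acc)  # [[1, 1, 1], [1, 4, 3], [1, 1, 9]]
```

    A production system additionally fills pits, resolves flats, and combines this with the topographic network, as the abstract describes.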

  10. Special Data Collection System (SDCS) event report, Southern Sinkiang Province, 27 October 1975

    Energy Technology Data Exchange (ETDEWEB)

    Hill, K.J.; Dawkins, M.S.; Baumstark, R.R.; Gillespie, M.D.

    1976-01-13

    A report is given of seismic data from the Special Data Collection System (SDCS) and other sources for the Southern Sinkiang Province event on 27 October 1975. Published epicenter information from seismic observations is given.

  11. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.
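
    The core idea of searching over the number of classes can be illustrated with a much simpler stand-in for AutoClass's Bayesian machinery: hard 1-D clustering scored by BIC as a crude approximation to the marginal likelihood. The data, the clustering routine, and the BIC criterion are all illustrative; AutoClass itself uses EM over mixture models and Bayesian approximations, not this:

```python
import math

def loglik(clusters):
    """Classification log-likelihood of a hard clustering with Gaussian classes."""
    n = sum(len(c) for c in clusters)
    ll = 0.0
    for c in clusters:
        m = len(c)
        mu = sum(c) / m
        var = sum((x - mu) ** 2 for x in c) / m
        ll += -0.5 * m * (math.log(2 * math.pi * var) + 1)  # Gaussian fit term
        ll += m * math.log(m / n)                           # class-weight term
    return ll

def cluster_1d(xs, k, iters=25):
    """Tiny k-means: seed centers across the sorted range, iterate to a fix point."""
    s = sorted(xs)
    centers = [s[i * (len(s) - 1) // max(k - 1, 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda i: abs(x - centers[i]))].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [g for g in groups if g]

data = [1.0, 1.2, 0.8, 1.1, 0.9, 5.0, 5.2, 4.8, 5.1, 4.9]
best = None
for k in (1, 2, 3):
    clusters = cluster_1d(data, k)
    p = 3 * len(clusters) - 1  # free parameters: means, variances, weights
    bic = -2 * loglik(clusters) + p * math.log(len(data))
    if best is None or bic < best[0]:
        best = (bic, k)
print("chosen number of classes:", best[1])
```

    With two well-separated groups, the penalty term rejects both the single-class and three-class models, mirroring (very loosely) how AutoClass trades fit against model complexity.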

  12. Design and implementation of security in a data collection system for epidemiology.

    Science.gov (United States)

    Ainsworth, John; Harper, Robert; Juma, Ismael; Buchan, Iain

    2006-01-01

    Health informatics can benefit greatly from the e-Science approach, which is characterised by large-scale distributed resource sharing and collaboration. Ensuring the privacy and confidentiality of data has always been the first requirement of health informatics systems. The PsyGrid data collection system addresses both, providing secure distributed data collection for epidemiology. We have used Grid-computing approaches and technologies to address this problem. We describe the architecture and implementation of the security sub-system in detail.

  13. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: consulting the meter physically, importing this information into Infor EAM by hand, and detecting and correcting the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main targets I had when developing my solution were flexibility and scalability, so as to make...

  14. Curating Virtual Data Collections

    Science.gov (United States)

    Lynnes, C.; Ramapriyan, H.; Leon, A.; Tsontos, V. M.; Liu, Z.; Shie, C. L.

    2015-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) contains a rich set of datasets and related services throughout its many elements. As a result, locating all the EOSDIS data and related resources relevant to a particular science theme can be daunting. This is largely because the organizing principle of EOSDIS data is driven more by the way they are produced than by the expected end use. Virtual collections oriented around science themes can overcome this by presenting collections of data and related resources that are organized around the user's interest, not around the way the data were produced. Science themes can be: specific applications (uses) of the data (e.g., landslide prediction); geophysical events (e.g., Hurricane Sandy); or a specific science research problem. Virtual collections consist of annotated web addresses (URLs) that point to data and related resource addresses, thus avoiding the need to copy all of the relevant data to a single place. These URL addresses can be consumed by a variety of clients, ranging from basic URL downloaders (wget, curl) and web browsers to sophisticated data analysis programs such as the Integrated Data Viewer. Eligible resources include anything accessible via URL: data files (data file URLs); data subsets (OPeNDAP, webification, or Web Coverage Service URLs); data visualizations (Web Map Service); data search results (OpenSearch Atom response); and custom analysis workflows (e.g., Giovanni analysis URLs).
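
    A minimal sketch of consuming such a virtual collection follows. The annotated-URL file format here ("url | annotation") is invented for illustration; EOSDIS does not necessarily use this syntax, and the URLs are placeholders:

```python
# Parse a flat-file virtual collection of annotated URLs and turn it into
# commands for a basic URL downloader (wget), as the abstract suggests.

SAMPLE_COLLECTION = """\
https://example.org/granule1.nc | Hurricane Sandy, precipitation, 2012-10-29
https://example.org/granule2.nc | Hurricane Sandy, precipitation, 2012-10-30
https://example.org/wms?layers=rain | visualization (Web Map Service)
"""

def parse_collection(text):
    entries = []
    for line in text.splitlines():
        if not line.strip() or line.lstrip().startswith("#"):
            continue  # skip blanks and comment lines
        url, _, note = line.partition("|")
        entries.append({"url": url.strip(), "annotation": note.strip()})
    return entries

entries = parse_collection(SAMPLE_COLLECTION)
commands = ["wget -nc {}".format(e["url"]) for e in entries]
print(len(entries), commands[0])
```

    Because the collection is just annotated URLs, the same file can feed wget, curl, a browser, or a richer analysis client without copying any data.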

  15. A novel electronic data collection system for large-scale surveys of neglected tropical diseases.

    Directory of Open Access Journals (Sweden)

    Jonathan D King

    Full Text Available BACKGROUND: Large cross-sectional household surveys are common for measuring indicators of neglected tropical disease control programs. As an alternative to standard paper-based data collection, we utilized novel paperless technology to collect data electronically from over 12,000 households in Ethiopia. METHODOLOGY: We conducted a needs assessment to design an Android-based electronic data collection and management system. We then evaluated the system by reporting results of a pilot trial and from comparisons of two large-scale surveys, one with traditional paper questionnaires and the other with tablet computers, including accuracy, person-time days, and costs incurred. PRINCIPAL FINDINGS: The electronic data collection system met core functions in household surveys and overcame constraints identified in the needs assessment. Pilot data recorders took 264 sec (standard deviation (SD) 152 sec) and 260 sec (SD 122 sec) per person registered to complete household surveys using paper and tablets, respectively (P = 0.77). Data recorders felt a lack of connection with the interviewee during the first days using electronic devices, but preferred to collect data electronically in future surveys. Electronic data collection saved time by giving results immediately, obviating the need for double data entry and cross-correcting. The proportion of identified data entry errors in disease classification did not differ between the two data collection methods. Geographic coordinates collected using the tablets were more accurate than coordinates transcribed on a paper form. The cost of the equipment required for electronic data collection was approximately the same as that incurred for data entry of questionnaires, whereas repeated use of the electronic equipment may increase cost savings. CONCLUSIONS/SIGNIFICANCE: Conducting a needs assessment and pilot testing allowed the design to specifically match the functionality required for surveys. Electronic data collection …

  16. Trevi Park: Automatic Parking System

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    TreviPark is an underground, multi-story stacking system that holds cars efficiently, thus reducing the cost of each parking space. It is a fully automatic parking system intended to maximize space utilization in parking structures. TreviPark costs less than the price of a conventional urban garage and takes up half the volume and 80% of the depth.

  17. Air Force Genomics, Proteomics, Bioinformatics System, DataCap-Data Collection Module. Phase 1: Development

    Science.gov (United States)

    2004-07-01

    ... exist as a series of isolated computational silos, providing a depth of data in a narrow field of research. The Acero Genomics Knowledge Platform (GKP) ... on top of the Acero Platform. The purpose of the DataCap is to provide the individual researcher with the ability to collect experimental data in a ... integrated format compatible with the Acero GKP. This technical report covers the architecture, the design and the operation of the DataCap in its …

  18. Using sensor data patterns from an automatic milking system to develop predictive variables for classifying clinical mastitis and abnormal milk

    NARCIS (Netherlands)

    Kamphuis, A.; Pietersma, D.; Tol, van der R.; Wiedermann, M.; Hogeveen, H.

    2008-01-01

    Dairy farmers using automatic milking are able to manage mastitis successfully with the help of mastitis attention lists. These attention lists are generated with mastitis detection models that make use of sensor data obtained throughout each quarter milking. The models tend to be limited to using t

  19. Designing automatic resupply systems.

    Science.gov (United States)

    Harding, M L

    1999-02-01

    This article outlines the process for designing and implementing autoresupply systems. The planning process includes determination of goals and appropriate participation. Different types of autoresupply mechanisms include kanban, breadman, consignment, systems contracts, and direct shipping from an MRP schedule.

  20. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration

    2017-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence ...
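
    The detect-and-rebalance idea described above can be sketched as a greedy loop: find storage elements above a high-water mark and move their smallest datasets to the element with the most free space. This is an illustrative simplification, not Rucio's actual algorithm or API; all names, sizes, and the threshold are invented:

```python
# Greedy rebalancing sketch (illustrative only; not Rucio's implementation):
# while any storage element is above its high-water mark, move its smallest
# datasets to the element with the most absolute free capacity. A real system
# would also enforce replication policies and handle the all-sites-full case.

def rebalance(sites, datasets, high_water=0.85):
    """sites: {name: capacity_TB}; datasets: {name: (size_TB, site)}."""
    moves = []
    def used(site):
        return sum(sz for sz, loc in datasets.values() if loc == site)
    changed = True
    while changed:
        changed = False
        for site, cap in sites.items():
            if used(site) / cap <= high_water:
                continue
            dest = max(sites, key=lambda s: sites[s] - used(s))
            if dest == site:
                break  # nowhere better to move data; give up on this site
            # Move the smallest dataset first to minimize transferred volume.
            name = min((d for d, (sz, loc) in datasets.items() if loc == site),
                       key=lambda d: datasets[d][0])
            size, _ = datasets[name]
            datasets[name] = (size, dest)
            moves.append((name, site, dest))
            changed = True
    return moves

sites = {"SITE_A": 100.0, "SITE_B": 100.0}
datasets = {"dsA1": (60.0, "SITE_A"), "dsA2": (30.0, "SITE_A"),
            "dsA3": (5.0, "SITE_A"), "dsB1": (10.0, "SITE_B")}
moves = rebalance(sites, datasets)
print(moves)
```

    Here SITE_A starts at 95% utilization; two moves bring it under the 85% mark while leaving SITE_B well below it.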

  1. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user`s manual is a separate volume, Automatic TLI Recognition System, User`s Guide, and a programmer`s manual is Automatic TLI Recognition System, Programmer`s Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system`s capabilities, and the hardware is easily expandable to increase the system`s speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis if the training process, and a description with examples of the system capabilities.

  2. PS3-05: Electronic Data Collection for a Clinical Trial Conducted within a Health System

    Science.gov (United States)

    Kerby, Tessa; Schneider, Nicole; Asche, Stephen; Loes, Linda; Maciosek, Michael; Meyers, Peter; Michalski, Derek; Margolis, Karen

    2010-01-01

    Background/Aims: Designing a seamless data collection tool for a research study across multiple physical locations within a health care system is challenging. Paper-based data collection is prone to data entry errors, subject to delays in availability of data, and environmentally wasteful. Web-based data collection tools are costly, time consuming to produce, and have security issues. We used Microsoft Access to create an efficient, low-cost electronic data collection tool for a clinical trial that required availability at numerous locations in the HealthPartners system. Methods: The research study required data collection entry points at ten different locations for different types of users, all linked into the HealthPartners computer network. A single Access database with linked modules for recruitment, tracking, eligibility determination and data collection was designed. Results: The recruitment module used at the research department integrated data on recruitment mailings and telephone screening of interested respondents, and used an automated algorithm to perform eligibility checks. The research clinic module for clinic visits was populated with eligible participants as determined by the recruitment module. This clinic module included further eligibility checks, data collection and treatment assignment. Participants then became available in the intervention module to pharmacist case managers located at 8 HealthPartners primary care clinics to collect data for the intervention. Based on study entry date, Access created a visit log to aid the case managers with timely adherence to the trial protocol. The database is stored on a secure server that is accessible only to authorized study team members, with further restrictions on data entry and access determined by study team role. Conclusions: The disadvantages of Microsoft Access (lack of flexibility, “bugginess”, older technology) are counterbalanced by many advantages (ready availability, inexpensiveness
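
    The automated eligibility algorithm mentioned above might look like the following toy sketch; the criteria are invented for illustration and are not the trial's actual rules:

```python
# Toy eligibility check in the spirit of the recruitment module's automated
# algorithm: run every criterion against a participant record and report
# which ones failed (criteria below are hypothetical).

CRITERIA = [
    ("age_in_range", lambda p: 18 <= p["age"] <= 85),
    ("bp_elevated",  lambda p: p["systolic_bp"] >= 140),
    ("not_pregnant", lambda p: not p.get("pregnant", False)),
]

def check_eligibility(participant):
    failed = [name for name, rule in CRITERIA if not rule(participant)]
    return {"eligible": not failed, "failed_criteria": failed}

print(check_eligibility({"age": 54, "systolic_bp": 152}))
# {'eligible': True, 'failed_criteria': []}
print(check_eligibility({"age": 16, "systolic_bp": 135}))
# {'eligible': False, 'failed_criteria': ['age_in_range', 'bp_elevated']}
```

    Encoding criteria as named rules makes the check auditable: the recruitment module can store which criterion excluded a respondent, not just a yes/no flag.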

  3. Automatic classification of municipal call data for quantitative urban drainage system analysis

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.; Harder, R.C.; Loog, M.

    2010-01-01

    Flooding in urban areas can be caused by heavy rainfall, improper planning or component failures. Quantification of these various causes to urban flood probability supports prioritisation of flood risk reduction measures. In many cases, a lack of data on flooding incidents impedes quantification of

  4. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...

  5. An automatic registration system of multi-view 3D measurement data using two-axis turntables

    Science.gov (United States)

    He, Dong; Liu, Xiaoli; Cai, Zewei; Chen, Hailong; Peng, Xiang

    2016-09-01

    Automatic registration is a key research issue in the 3D measurement field. In this work, we developed an automatic registration system composed of a stereo system with structured light and a two-axis turntable. To realize fully automatic 3D point registration, a novel method is proposed to calibrate the stereo system and the direction vectors of the two turntable axes simultaneously. A planar calibration rig with marked points was placed on the turntable and captured by the left and right cameras of the stereo system at different rotation angles of the two-axis turntable. From the captured images, the stereo system was calibrated (intrinsically and extrinsically) with a classical camera model, and the 3D coordinates of the marked points were reconstructed at the different angles of the two turntables. Each marked point observed at different angles traces a specific circle, and the normal line of that circle is aligned with the turntable axis direction vector. For each turntable, different points yield different circles and normal lines, and the turntable axis direction vector is calculated by averaging the different normal lines. The results show that the proposed registration system can precisely register point clouds acquired at different scanning angles. In addition, no ICP iterative procedures are required, so the system can be used to register point clouds without obvious features such as spheres, cylinders, cones and other rotators.
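
    The geometric core of the axis calibration (each marked point traces a circle whose plane normal is parallel to the turntable axis) can be sketched with synthetic data; this is an illustrative simplification of the paper's method, which averages normals over many markers:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def circle_normal(p1, p2, p3):
    """Unit normal of the plane through three points on a marker's circle."""
    u = tuple(b - a for a, b in zip(p1, p2))
    w = tuple(b - a for a, b in zip(p1, p3))
    cross = (u[1] * w[2] - u[2] * w[1],
             u[2] * w[0] - u[0] * w[2],
             u[0] * w[1] - u[1] * w[0])
    return normalize(cross)

# Synthetic marker positions on a turntable whose true axis is the z-axis:
# a marker at radius 2 and height 1, observed at three rotation angles.
angles = [0.0, math.pi / 2, 7 * math.pi / 6]
points = [(2 * math.cos(t), 2 * math.sin(t), 1.0) for t in angles]

axis = circle_normal(*points)
print(axis)  # ≈ (0.0, 0.0, 1.0), up to sign
```

    With noisy measurements one would instead least-squares-fit a plane to many points per marker and average the resulting normals, as the paper describes.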

  6. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemas accounting for geometric and semantic data has been importantly motivated by the advances in mobile laser scanning technology over the last decade; automation in data processing has also recently influenced the expansion of new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (CityGML) and mobile mapping technology to provide the features composing the city model. The paper focuses on traffic signs, discussing their characterization using CityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the task of automatic detection and classification.

  7. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    Full Text Available Purpose of the study. Scientific and educational organizations have traditionally used e-mail with Microsoft Excel spreadsheets and Microsoft Word documents for operational data collection. The disadvantages of this approach include the lack of control over the correctness of the input data and the complexity of processing the information received due to its non-relational data model. Online services exist that make it possible to organize data collection in relational form, but their disadvantages are: the absence of thesaurus support; a limited set of input-control elements; limited control over the operation of the input form; and the fact that most of the systems are shareware. Thus, an Internet data collection and analysis technology is required that allows the user to quickly define a model of the data to be collected and automatically implements data collection in accordance with this model. Materials and methods. The article describes a technology for operational data collection and analysis, developed and tested using the "Faramant" system. The "Faramant" system operates on a model document comprising three components: a description of the data structure; its visualization; and the logic of the form's behavior. All stages of the technology are performed by the user in a browser. The main stage of the proposed technology is the definition of the data model as a set of relational tables. To create a table within the system, the user specifies its name and a list of fields; for each field, the user specifies its name, the control used for data input, and the logic of that control's work. Controls are used to ensure correct input depending on the data type. Based on the model, the "Faramant" system automatically creates a filling form through which users can enter information. To change the form's visualization, a form template can be used. The data can be viewed page by page in a table, and different filters can be applied to table rows. To
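
The model-document idea — a table definition plus per-field input controls that validate data before it is stored — can be illustrated with a small sketch (the table, field names and controls below are hypothetical, not taken from "Faramant"):

```python
# Hypothetical input controls: each one checks and normalizes a raw value,
# raising ValueError on bad input (mirroring the per-field control idea).
def int_control(value):
    try:
        return int(value)
    except (TypeError, ValueError):
        raise ValueError("expected an integer")

def text_control(value):
    v = str(value).strip()
    if not v:
        raise ValueError("expected non-empty text")
    return v

# The data model: relational tables as (field name, control) lists.
MODEL = {
    "staff": [("name", text_control), ("age", int_control)],
}

def validate_row(table, raw):
    """Apply each field's input control; return a clean row dict."""
    clean = {}
    for field, control in MODEL[table]:
        clean[field] = control(raw[field])
    return clean
```

A form generator would then build one input widget per field and call the field's control before accepting the submission.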

  8. System for Control, Data Collection and Processing in 8 mm Portable Microwave Radiometer-Scatterometer

    Institute of Scientific and Technical Information of China (English)

    李毅; 方振和; et al.

    2002-01-01

    In this paper we describe a system used to control, collect and process data in an 8 mm portable microwave radiometer-scatterometer. We focus on the hardware and software design of the system, which is based on a PIC16F874 chip. The system has been successfully used in an 8 mm portable microwave radiometer-scatterometer; compared with other similar systems, its modularization, miniaturization and intelligence are improved so as to meet portable-instrument requirements.

  9. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of the human face is presented. The system performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first assumes verification of personal data entered by the visitor from a console using a keyboard or card reader; the system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass: only the person's biometrics are used to identify the visitor, and the recognition system automatically finds the necessary identification information previously stored in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources: in addition to stereoscopic images input to the computer from cameras, the models can use voice data and some physical characteristics such as the person's height, measured by the imaging system.

  10. Automatic Guidance System for Welding Torches

    Science.gov (United States)

    Smith, H.; Wall, W.; Burns, M. R., Jr.

    1984-01-01

    Digital system automatically guides welding torch to produce square-butt, V-groove, and lap-joint weldments within tracking accuracy of ±0.2 millimeter. Television camera observes and traverses weld joint, carrying welding torch behind. Image of joint is digitized, and resulting data are used to derive control signals that enable torch to track joint.

  11. Learning Diagnostic Diagrams in Transport-Based Data-Collection Systems

    DEFF Research Database (Denmark)

    Tran, Vu The; Eklund, Peter; Cook, Chris

    2014-01-01

    Insights about service improvement in a transit network can be gained by studying transit service reliability. In this paper, a general procedure for constructing a transit service reliability diagnostic (Tsrd) diagram based on a Bayesian network is proposed to automatically build a behavioural model from Automatic Vehicle Location (AVL) and Automatic Passenger Counter (APC) data. Our purpose is to discover the variability of transit service attributes and their effects on traveller behaviour. A Tsrd diagram describes and helps to analyse factors affecting public transport by combining domain

  12. Multiparameter System for Monitoring the State of Urbanospherum Based on Multivendor Devices of Data Collection

    Directory of Open Access Journals (Sweden)

    Pilipenko Aleksandr

    2016-01-01

    Full Text Available The authors describe the problem of partitioning the control object into controlled elements and give a description of the mathematical apparatus. They propose different types of portable devices for collecting information about the state of the urbanospherum on the basis of different hardware and software platforms, combined into an integrated information system. The presented algorithms and block diagrams of the data collection devices are based on the NI myRIO controller, the Raspberry Pi micro-computer and the Arduino microcontroller system. The authors explain their approach to the optimization of systems for the management of urban resources and processes using condition monitoring of the urbanospherum.

  13. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), huge amounts of ultrasonic data from 6 channels must be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high-radiation area. No automated ultrasonic testing equipment of this kind has yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed and ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  14. ON GEOMETRIC PROCESSING OF MULTI-TEMPORAL IMAGE DATA COLLECTED BY LIGHT UAV SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. Rosnell

    2012-09-01

    Full Text Available Data collection under highly variable weather and illumination conditions throughout the year will be necessary in many applications of UAV imaging systems; this is a new requirement for rigorous photogrammetric and remote sensing processing. We studied the performance of two georeferencing and point cloud generation approaches using image data sets collected in four seasons (winter, spring, summer and autumn) and under different imaging conditions (sunny, cloudy, different solar elevations). We used light quadrocopter UAVs equipped with consumer cameras. In general, matching of image blocks collected with high overlaps provided high-quality point clouds. All of the aforementioned factors influenced point cloud quality. In winter, point cloud generation failed on uniform snow surfaces in many situations, and during the leaf-off season it was not successful over deciduous trees. The images collected under cloudy conditions provided better point clouds than the images collected in sunny weather in shadowed regions and on tree surfaces; on homogeneous surfaces (e.g. asphalt), the images collected under sunny conditions outperformed the cloudy data. The tested factors did not influence the general block adjustment results. Radiometric sensor performance (especially signal-to-noise ratio) is a critical factor in all-weather data collection and point cloud generation; at the moment, high-quality, lightweight imaging sensors are still largely missing, and sensitivity to wind is another potential limitation. Great potential lies in low-flying, low-cost UAVs, especially in applications requiring rapid aerial imaging for frequent monitoring.

  15. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    The commutated automatic gain control (AGC) system designed and built for the prototype Loran-C receiver is discussed. The current version of the prototype receiver, the Mini L-80, was tested initially in 1980. The receiver uses a Super Jolt microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The AGC adjusts the level of each station signal such that the early portion of each envelope rise is at about the same amplitude in the receiver's envelope detector.
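
The principle — a per-station gain adjusted until the envelope amplitude sits at a common reference level — can be sketched as a simple feedback update (an illustrative model only, not the Mini L-80's actual implementation):

```python
def agc_step(gain, amplitude, target, rate=0.1):
    """One AGC update: nudge the per-station gain so the gained
    envelope amplitude (amplitude * gain) moves toward the target."""
    error = target - amplitude * gain
    return gain + rate * error / target

def settle(gain, amplitude, target, steps=200):
    """Iterate the update until the loop settles."""
    for _ in range(steps):
        gain = agc_step(gain, amplitude, target)
    return gain
```

In a commutated AGC, one such gain value would be kept per station and updated only during that station's time slot.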

  16. Design of high-speed data collecting system for pipeline magnetic flux leakage inspection

    Science.gov (United States)

    Qu, Weidong; Xu, Hongbing

    2013-03-01

    During high-speed magnetic flux leakage inspection, a large number of signals containing pipeline status, characteristics and defects must be collected and transmitted. High sampling frequency, strong real-time performance, high precision and intelligence are long-term trends in data collection. This paper proposes a new design for a high-speed data collection system. In the system, a dual-CPU structure is applied in the main controller, and high-speed communication between the DSP and the MCU is realized through dual-port RAM. With the dual-CPU structure, measurement, control and communication are handled by the two CPUs respectively, which improves the system's adaptability; the dual-port RAM raises communication efficiency by improving the parallel processing ability among different types of multiprocessors.

  17. Automatic Control System for Neutron Laboratory Safety

    Institute of Scientific and Technical Information of China (English)

    ZHAO; Xiao; ZHANG; Guo-guang; FENG; Shu-qiang; SU; Dan; YANG; Guo-zhao; ZHANG; Shuai

    2015-01-01

    In order to cooperate with the neutron generator experiment and realize automatic control of the experiment, a set of automatic control systems for neutron laboratory safety was designed. The system block diagram is shown as Fig.1. The automatic control device processes switch signals, so a PLC is selected as the core component

  18. Research on Chinese Antarctic Data Directory System I——Collecting, processing, examining and submitting data directory

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the general framework of the ADDS (Antarctic Data Directory System) established by the SCAR-COMNAP ad hoc Planning Group on Antarctic data management, the CN-ADDS (Chinese Antarctic Data Directory System) project is ongoing; its research and activity follow the methods and techniques available in ADDS development while allowing for the specific status of Antarctic data management in China. At present, authoring and submitting Antarctic data directories in a timely manner is one of the key issues that must be dealt with in China. This paper studies the technical procedure for collecting, processing, examining and submitting data directories. In addition, it discusses the efficient collection of data directories, which requires both administrative and technical support

  19. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.

    2009-09-01

    This report, written by Sandia National Laboratories, addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. It is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
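
As a minimal illustration of why structured CMMS fields matter for reliability analysis (the record fields below are hypothetical, not the report's recommended schema), a few well-typed work-order records are enough to compute standard metrics such as calendar-time availability:

```python
# Hypothetical CMMS work-order records; field names are illustrative.
records = [
    {"turbine": "T01", "component": "gearbox",      "downtime_h": 36.0},
    {"turbine": "T01", "component": "pitch system", "downtime_h": 4.0},
    {"turbine": "T02", "component": "gearbox",      "downtime_h": 60.0},
]

PERIOD_H = 8760.0  # one year of calendar hours per turbine

def availability(turbine):
    """Calendar-time availability: (period - downtime) / period."""
    down = sum(r["downtime_h"] for r in records if r["turbine"] == turbine)
    return (PERIOD_H - down) / PERIOD_H
```

Free-text maintenance logs cannot support this kind of automated roll-up, which is the core argument for structured CMMS data capture.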

  20. Automatic recovery from resource exhaustion exceptions by collecting leaked resources

    Institute of Scientific and Technical Information of China (English)

    Zi-ying DAI; Xiao-guang MAO; Li-qian CHEN; Yan LEI

    2014-01-01

    Despite the availability of garbage collectors, programmers must manually manage non-memory finite system resources such as file descriptors. Resource leaks can gradually consume all available resources and cause programs to raise resource exhaustion exceptions. However, programmers commonly provide no effective recovery approach for resource exhaustion exceptions, which often causes programs to halt without completing their tasks. In this paper, we propose to automatically recover programs from resource exhaustion exceptions caused by resource leaks. We transform programs to catch resource exhaustion exceptions, collect leaked resources, and then retry the failing code. A resource collector is designed to identify leaked resources and safely release them. We implement our approach for Java programs. Experimental results show that our approach can successfully handle resource exhaustion exceptions caused by reported resource leaks and allow programs to complete their tasks with an average execution time increase of 2.52% and negligible bytecode size increase.
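
The transform described above — catch the exhaustion exception, collect leaked resources, then retry the failing code — can be sketched in Python, where `gc.collect()` stands in for the paper's dedicated resource collector (the paper targets Java with its own collector, so this is only an analogy):

```python
import gc

def with_exhaustion_recovery(fn, retries=1):
    """Run fn; on a resource-exhaustion error (OSError here, e.g.
    'too many open files'), trigger collection of leaked resources
    via finalizers and retry the failing code."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except OSError:
            if attempt == retries:
                raise  # recovery failed; re-raise the exhaustion error
            gc.collect()  # run finalizers that release leaked handles
```

The real scheme additionally distinguishes leaked resources (unreachable but unreleased) from live ones, so that only safe releases are performed before the retry.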

  1. A technical assistance on data collection on subdivision of wet-system apparatuses.

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-09-01

    At the Ningyo-Toge Environmental Engineering Center, development of a subdivision engineering system for the decommissioning of nuclear fuel facilities has been promoted. However, data on the subdivision of instruments and apparatuses, to be carried out as part of the decommissioning, were insufficient. This work therefore investigated data collection, using the subdivision of the wet-system apparatuses of the smelting conversion facility, begun in June 2000, as effectively as possible as a field for data collection, in support of the rational construction of the system for decommissioning nuclear fuel facilities promoted at the Ningyo-Toge Environmental Engineering Center. The subdivision of the wet-system apparatuses of the facility is programmed to be carried out over the two fiscal years 2000 and 2001. The working procedure begins with non-polluted items (electrics, instruments and utility pipings) in each room before proceeding to appliances that used uranium. This report covers a survey of the present state of the subdivision, the kinds and frequencies of data taken during subdivision, a data collection manual, and the rationalization of the data recording method. (G.K.)

  2. D3.2 Initial Specification of Data Collection and Analysis System

    DEFF Research Database (Denmark)

    Siksnys, Laurynas; Kaulakiene, Dalia; Pedersen, Torben Bach

    2010-01-01

    The MIRACLE project aims to invent and prototype key elements of an energy system that is better able to accommodate large volumes of electricity from renewable energy sources (RES). The approach is based on flexible offers that allow an individual consumer/producer to specify when and what amount of energy he or she wants to consume or produce. Many consumers and producers will be equipped with such technology and will be sending tens of flexible offers per day. In order to appropriately manage very large volumes of flexible offers, a reliable, distributed, and highly scalable computer system infrastructure is needed. Work Package 3 concerns data collection, aggregation and storage solutions for the MIRACLE system. This deliverable, which is part of Work Package 3, specifies a data collection and management system and its components, and presents a methodology for the aggregation of flexible offers.

  3. Automatic TLI recognition system. Part 1: System description

    Energy Technology Data Exchange (ETDEWEB)

    Partin, J.K.; Lassahn, G.D.; Davidson, J.R.

    1994-05-01

    This report describes an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system uses image data fusion and gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. This volume gives a general description of the ATR system.

  4. SYRIAC: The systematic review information automated collection system a data warehouse for facilitating automated biomedical text classification.

    Science.gov (United States)

    Yang, Jianji J; Cohen, Aaron M; McDonagh, Marian S

    2008-11-06

    Automatic document classification can be valuable in increasing the efficiency of updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and the source data format is inconsistent. To approach this problem, we built an automated system to streamline the required steps, from the initial notification of an update in the source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.

  5. Intelligent Storage System Based on Automatic Identification

    Directory of Open Access Journals (Sweden)

    Kolarovszki Peter

    2014-09-01

    Full Text Available This article describes RFID technology in conjunction with warehouse management systems (WMS). It also deals with automatic identification and data capture (AIDC) technologies and the processes used in a warehouse management system, from the entry of goods into production through the identification of goods to palletizing, storing, bin transfer and the removal of goods from the warehouse. Nowadays, the identification of goods in most warehouses is carried out through barcodes; in this article we examine how the processes described above can instead be handled through RFID technology, with a focus on utilizing AMP middleware in WMS processes. All results are verified by measurements in our AIDC laboratory at the University of Žilina and in the Laboratory of Automatic Identification of Goods and Services at GS1 Slovakia. The results of our research bring a new point of view and indicate ways of using RFID technology in warehouse management systems.

  6. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    OpenAIRE

    Boden, T. A.; M. Krassovski; Yang, B.

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in Nor...

  7. Detection of impulsive sources from an aerostat-based acoustic array data collection system

    Science.gov (United States)

    Prather, Wayne E.; Clark, Robert C.; Strickland, Joshua; Frazier, Wm. Garth; Singleton, Jere

    2009-05-01

    An aerostat based acoustic array data collection system was deployed at the NATO TG-53 "Acoustic Detection of Weapon Firing" Joint Field Experiment conducted in Bourges, France during the final two weeks of June 2008. A variety of impulsive sources including mortar, artillery, gunfire, RPG, and explosive devices were fired during the test. Results from the aerostat acoustic array will be presented against the entire range of sources.

  8. A mobile field-work data collection system for the wireless era of health surveillance

    Directory of Open Access Journals (Sweden)

    Marianne Forsell

    2011-02-01

    Full Text Available In many countries or regions the capacity of health-care resources is below the needs of the population, and new approaches to health surveillance are needed. Innovative projects utilizing wireless communication technology contribute reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated into system functions. The system architecture was based on field-work experience and administrative requirements. The smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components and a SQL Server 2005 database on the Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field workers considered the applications described here user-friendly and almost self-instructing, and the office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT support system provides short lead times from field-work data registration to analysis and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  9. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy)

    Science.gov (United States)

    Tagliaferri, Luca; Kovács, György; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E.; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo

    2016-01-01

    Purpose The aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. Material and methods The GEC-ESTRO (Groupe Européen de Curiethérapie – European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data set) and the necessary COBRA software services, as well as in the peer review of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task group. Results Eleven centers from 6 countries signed the agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA Storage System (C-SS) is not time-consuming because, thanks to the use of "brokers", data can be extracted directly from each center's storage system through a connection with a structured query language database (SQL-DB), Microsoft Access®, FileMaker Pro®, or Microsoft Excel®. The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of "on-purpose data projection". The C-SS architecture protects privacy because it never makes visible data that could identify an individual patient. The C-SS can also benefit from so-called "distributed learning" approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Conclusions Setting up a consortium is a feasible and practicable tool in the creation of an international, multi-system data-sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence the center's own data storing

  10. Digital signal processing for CdTe detectors using VXIbus data collection systems

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Daiji; Takahashi, Hiroyuki; Kurahashi, Tomohiko; Iguchi, Tetsuo; Nakazawa, Masaharu

    1996-07-01

    Recently, fast signal digitizing techniques have been developed, and signal waveforms over very short time periods can be obtained. In this paper, we analyzed each measured pulse digitized by such an apparatus and tried to improve the energy resolution of a CdTe semiconductor detector. The resulting energy resolution for the {sup 137}Cs 662 keV photopeak was 13 keV. We also developed a fast data collection system based on the VXIbus standard; the counting rate obtained with this system was about 50 counts per second. (author)
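
As an illustration of the kind of per-pulse analysis such a digitizer enables (a simple baseline-and-peak sketch, not the authors' actual pulse-processing algorithm), the pulse height can be estimated from each digitized waveform:

```python
def pulse_height(samples, n_baseline=8):
    """Estimate a pulse's amplitude from its digitized waveform:
    average the pre-trigger samples as the baseline, then take the
    peak excursion above that baseline."""
    baseline = sum(samples[:n_baseline]) / n_baseline
    return max(samples) - baseline
```

Histogramming these per-pulse heights over many events yields the energy spectrum from which a photopeak resolution is measured; more elaborate digital filters (e.g. trapezoidal shaping) refine the same idea.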

  11. Data network, collection, and analysis in the Diamond Valley flow system, central Nevada

    Science.gov (United States)

    Knochenmus, Lari A.; Berger, David L.; Moreo, Michael T.; Smith, J. LaRue

    2011-01-01

    Future groundwater development and its effect on future municipal, irrigation, and alternative energy uses in the Diamond Valley flow system are of concern for officials in Eureka County, Nevada. To provide a better understanding of the groundwater resources, the U.S. Geological Survey, in cooperation with Eureka County, commenced a multi-phase study of the Diamond Valley flow system in 2005. Groundwater development primarily in southern Diamond Valley has resulted in water-level declines since the 1960s ranging from less than 5 to 100 feet. Groundwater resources in the Diamond Valley flow system outside of southern Diamond Valley have been relatively undeveloped. Data collected during phase 2 of the study (2006-09) included micrometeorological data at 4 evapotranspiration stations, 3 located in natural vegetation and 1 located in an agricultural field; groundwater levels in 95 wells; water-quality constituents in aquifers and springs at 21 locations; lithologic information from 7 recently drilled wells; and geophysical logs from 3 well sites. This report describes what was accomplished during phase 2 of the study, provides the data collected, and presents the approaches to strengthen relations between evapotranspiration rates measured at micrometeorological stations and spatially distributed groundwater discharge. This report also presents the approach to improve delineation of areas of groundwater discharge and describes the current methodology used to improve the accuracy of spatially distributed groundwater discharge rates in the Diamond Valley flow system.

  12. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    Energy Technology Data Exchange (ETDEWEB)

    Boden, Thomas A [ORNL]; Krassovski, Misha B [ORNL]; Yang, Bai [ORNL]

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interests in climate change data and information worldwide have grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification). CDIAC has provided data management

  13. Creating an iPhone Application for Collecting Continuous ABC Data

    Science.gov (United States)

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  14. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    Energy Technology Data Exchange (ETDEWEB)

    Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Yamada, Yusuke; Chavas, Leonard M. G. [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); Wakatsuki, Soichi [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan); SLAC National Accelerator Laboratory, 2575 Sand Hill Road, MS 69, Menlo Park, CA 94025-7015 (United States); Stanford University, Beckman Center B105, Stanford, CA 94305-5126 (United States); Matsugaki, Naohiro [High Energy Accelerator Research Organization, 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2013-11-01

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes. Therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. An investigation confirmed that four or more cassettes were indeed being used at AR-NE3A. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. In order to check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the novel system are shown to be stable.

  15. Geophysical data collection using an interactive personal computer system. Part 1. ; Experimental monitoring of Suwanosejima volcano

    Energy Technology Data Exchange (ETDEWEB)

    Iguchi, M. (Kyoto University, Kyoto (Japan). Disaster Prevention Research Institute)

    1991-10-15

    In this article, a computer-communication system was developed in order to collect geophysical data from remote volcanoes via a public telephone network. This system is composed of a host personal computer at an observatory and several personal computers as terminals at remote stations. Each terminal acquires geophysical data, such as seismic, infrasonic, and ground deformation data. These data are stored in the terminals temporarily and transmitted to the host computer upon command from the host computer. Experimental monitoring was conducted between Sakurajima Volcanological Observatory and several stations in the Satsunan Islands and southern Kyushu. The seismic and eruptive activities of Suwanosejima volcano were monitored by this system. Consequently, earthquakes and air-shocks accompanying the explosive activity were observed. B-type earthquakes occurred prior to the relatively prolonged eruptive activity. Intermittent occurrences of volcanic tremors were also clearly recognized from the change in mean amplitudes of seismic waves. 7 refs., 10 figs., 2 tabs.

  16. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One technology that has seen considerable development is the solar powered automatic shrimp feeding system. Solar power, a renewable energy, can be an alternative solution to the energy crisis while also reducing manpower when used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer, set to intervals preferred by the user, that undergoes a continuous process. A magnetic contactor acts as a switch connected to the 10 hour timer and controls the activation or termination of electrical loads; the system is powered by a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  17. Electronic thermal sensor and Data Collection Platform technology: Part 5 in Thermal surveillance of active volcanoes using the Landsat-1 Data Collection System

    Science.gov (United States)

    Preble, Duane M.; Friedman, Jules D.; Frank, David

    1976-01-01

    Five Data Collection Platforms (DCP) were integrated electronically with thermal sensing systems, emplaced and operated in an analog mode at selected thermally significant volcanic and geothermal sites. The DCP's transmitted 3260 messages comprising 26,080 ambient, surface, and near-surface temperature records at an accuracy of ±1.15 °C for 1121 instrument days between November 14, 1972 and April 17, 1974. In harsh, windy, high-altitude volcanic environments the DCP functioned best with a small dipole antenna. Sixteen kg of alkaline batteries provided a viable power supply for the DCP systems, operated at a low-duty cycle, for 5 to 8 months. A proposed solar power supply system would lengthen the period of unattended operation of the system considerably. Special methods of data handling, such as data storage via a proposed memory system, would increase the significance of the twice-daily data reception, enabling the DCP's to record full diurnal-temperature cycles at volcanic or geothermal sites. Refinements in the temperature-monitoring system designed and operated in experiment SR 251 included a backup system consisting of a multipoint temperature scanner, a servo mechanism and an analog-to-digital recorder. Improvements were made in temperature-probe design and in construction of corrosion-resistant seals by use of a hydrofluoric-acid-etching technique.

  18. Creating an iPhone application for collecting continuous ABC data.

    Science.gov (United States)

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.
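
The record structure described in the abstract (client, antecedent, behavior, consequence, plus a timestamp, exported for e-mail) can be sketched in a few lines. This is a hypothetical Python illustration of the data model only, not the authors' Xcode implementation; all names are assumptions:

```python
import csv
import io
from datetime import datetime

def record_abc(log, client, antecedent, behavior, consequence):
    """Append one timestamped ABC observation to an in-memory log."""
    log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "client": client,
        "antecedent": antecedent,
        "behavior": behavior,
        "consequence": consequence,
    })

def export_csv(log):
    """Render the log as CSV text, e.g. for the body of an e-mail."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(log[0]))
    writer.writeheader()
    writer.writerows(log)
    return buf.getvalue()

log = []
record_abc(log, "Client A", "demand placed", "aggression", "attention")
print(export_csv(log).splitlines()[0])  # header row
```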

  19. Using global positioning systems in health research a practical approach to data collection and processing

    DEFF Research Database (Denmark)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-01-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality, allow more-meaningful comparisons across studies and populations, and advance this field more rapidly. This paper aims to take researchers who are considering using GPS devices in their research through device-selection criteria, device settings, participant data collection, data cleaning, data processing, and integration of data into GIS. Recommendations are outlined for each stage of data collection and analysis, and challenges that should be considered are indicated. This paper highlights the benefits of collecting GPS data over traditional self-report or estimated exposure measures. Information presented here will allow researchers to make…
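
As one concrete example of the data-cleaning stage the abstract mentions, a common rule is to drop GPS fixes that imply an implausible travel speed between consecutive points. This sketch is a generic illustration, not the paper's method; the speed threshold and the (lat, lon, time) record format are assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def drop_speed_outliers(points, max_ms=40.0):
    """Remove fixes that imply an implausible speed (m/s) relative to
    the last retained fix. Points are (lat, lon, unix_time) tuples;
    the first fix is assumed good."""
    cleaned = [points[0]]
    for lat, lon, t in points[1:]:
        plat, plon, pt = cleaned[-1]
        dt = t - pt
        if dt > 0 and haversine_m(plat, plon, lat, lon) / dt <= max_ms:
            cleaned.append((lat, lon, t))
    return cleaned

track = [(0, 0, 0), (0, 0.001, 10), (10, 10, 20), (0, 0.002, 30)]
print(len(drop_speed_outliers(track)))  # the (10, 10) jump is discarded
```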

  20. Data collection systems in ART must follow the pace of change in clinical practice.

    Science.gov (United States)

    De Geyter, Ch; Wyns, C; Mocanu, E; de Mouzon, J; Calhaz-Jorge, C

    2016-10-01

    In assisted reproductive technology (ART), quality control necessitates the collection of outcome data and of any complications that occur. Traditional quality assurance is based on data derived from single ART centres and, more recently, from national registries, both recording outcome parameters during well-defined observation periods. Nowadays, ART is moving towards much more diverse approaches, with sequential activities including short- or long-term freezing of gametes, gonadal tissues and embryos, and cross-border reproductive care. Hence, long-term cumulative treatment rates and an international approach are becoming a necessity. We suggest the initiation of an easy-access European Reproductive Coding System, through which each ART recipient is allocated a unique reproductive care code. This code would identify individuals (and reproductive material) during case-to-case data reporting to national ART data collecting institutions and to a central European ART monitoring agency. For confidentiality reasons, the identity of the individuals should remain with the local ART provider. This way, cumulative and fully reliable reproductive outcome data can be constructed with follow-up over prolonged time periods.

  1. Automatic Recommender System Based on Data Mining%采用数据挖掘的自动化推荐技术的研究

    Institute of Scientific and Technical Information of China (English)

    陈庆章; 汤仲喆; 王凯; 姚敏; 裴玉洁

    2012-01-01

    With the rapid development of the Internet, data of various types have become huge and scattered, and traditional keyword search is increasingly time-consuming. The automatic recommender system has therefore emerged to reduce users' search time and provide them with more appropriate information. By combining ART (Adaptive Resonance Theory) neural networks and data mining technology, this study builds a typical online recommendation system that can automatically cluster population characteristics and mine associated features. At the same time, MART, a modified ART clustering algorithm, is proposed, which produces more reasonable and flexible clustering results.

  2. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, and wrote a script in a scripting programming langu...

  3. Automatic gray scale correction of video data

    Science.gov (United States)

    Chochia, Pavel A.

    1995-01-01

    Automatic gray scale correction of captured video data (both still and moving images) is one of the least researched questions in the image processing area, even though the question is touched on in almost every book concerned with image processing. Classically it is related to image enhancement, and it is frequently classified among histogram modification techniques. Traditionally used algorithms, based on analysis of the image histogram, are not able to solve the problem properly. The difficulty lies in the absence of a formal quantitative estimate of image quality -- till now the most often used criteria are human visual perception and experience. Hence, the problem of finding measurable properties of real images which might be the basis for automatically building a gray scale correction function (sometimes also identified as a gamma-correction function) is still unsolved. In the paper we try to discern some common properties of real images that could help us to evaluate gray scale image distortion and, finally, to construct an appropriate correction function to enhance an image. Such a method might be used in automatic image processing procedures, such as enhancement of medical images, reproduction of pictures in the publishing industry, correction of remote sensing images, preprocessing of captured data in computer vision, and many other applications. The question of the complexity of the analysis procedure becomes important when an algorithm is realized in real time (for example in video input devices such as video cameras).
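
A minimal sketch of an automatic gamma correction of the kind discussed above: choose the exponent so the image's mean intensity reaches a mid-gray target. This is one simple heuristic under the assumption that intensities lie in (0, 1), not Chochia's algorithm:

```python
import numpy as np

def auto_gamma(image, target_mean=0.5):
    """Gamma-correct so the mean intensity approaches target_mean.
    Solves mean**gamma = target_mean, i.e. gamma = log(target)/log(mean).
    Assumes intensities in (0, 1) with 0 < mean < 1."""
    mean = image.mean()
    gamma = np.log(target_mean) / np.log(max(mean, 1e-6))
    return np.clip(image, 0, 1) ** gamma

dark = np.full((4, 4), 0.25)       # an underexposed image
corrected = auto_gamma(dark)
print(round(float(corrected.mean()), 3))  # mean pulled up to ~0.5
```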

  4. GIS: Geographic Information System An application for socio-economical data collection for rural area

    CERN Document Server

    Nayak, S K; Kalyankar, N V

    2010-01-01

    India carries out its planning through the Planning Commission, on the basis of information collected by traditional, tedious and manual methods that are too slow to sustain. We are now in the 21st century. The last few decades have seen information technology progress by leaps and bounds, completely changing the way of life in the developed nations. The internet has changed established working practices, opened new vistas and provided a platform to connect, giving the opportunity for a collaborative work space that goes beyond global boundaries. We are living in the global economy, and India is moving towards a Liberalized Market Oriented Economy (LMOE). Considering these things, and focusing on GIS, we propose a system for the collection of socio-economic data and water resource management information for rural areas via the internet.

  5. System Supporting Automatic Generation of Finite Element Using Image Information

    Institute of Scientific and Technical Information of China (English)

    J; Fukuda

    2002-01-01

    A mesh generating system has been developed in order to prepare the large amounts of input data needed for easy implementation of a finite element analysis. This system consists of a Pre-Mesh Generator, an Automatic Mesh Generator and a Mesh Modifier. The Pre-Mesh Generator produces the shape and sub-block information as input data for the Automatic Mesh Generator by carrying out various image processing with respect to the image information of the drawing input using a scanner. The Automatic Mesh Generato...

  6. Overview of the SOFIA Data Processing System: A generalized system for manual and automatic data processing at the SOFIA Science Center

    CERN Document Server

    Shuping, R Y; Vacca, W D; Charcos-Llorens, M; Reach, W T; Alles, R; Clarke, M; Melchiorri, R; Radomski, J; Shenoy, S; Sandel, D; Omelian, E B

    2014-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprised of a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. During routine operations, several instruments will be available to the astronomical community including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both auto...

  7. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi

  8. Feasibility Study for Ballet E-Learning: Automatic Composition System for Ballet "Enchainement" with Online 3D Motion Data Archive

    Science.gov (United States)

    Umino, Bin; Longstaff, Jeffrey Scott; Soga, Asako

    2009-01-01

    This paper reports on "Web3D dance composer" for ballet e-learning. Elementary "petit allegro" ballet steps were enumerated in collaboration with ballet teachers, digitally acquired through 3D motion capture systems, and categorised into families and sub-families. Digital data was manipulated into virtual reality modelling language (VRML) and fit…

  9. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
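
The Fourier-descriptor shape features used by the recognition system above can be illustrated generically: treat a closed 2-D contour as complex points, take its FFT, and normalize the harmonic magnitudes for translation and scale invariance. This is a textbook formulation offered as a sketch; the paper's exact descriptor construction may differ:

```python
import numpy as np

def fourier_descriptors(contour, n=8):
    """Translation- and scale-invariant shape features from a closed
    2-D contour given as complex points x + iy (assumes the first
    harmonic is nonzero)."""
    z = np.asarray(contour, dtype=complex)
    F = np.fft.fft(z)
    F[0] = 0                # drop the DC term -> translation invariance
    mags = np.abs(F)
    mags /= mags[1]         # normalize by first harmonic -> scale invariance
    return mags[1:n + 1]

# A circle has all its energy in the first harmonic:
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.cos(t) + 1j * np.sin(t)
d = fourier_descriptors(circle)
```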

  10. Digital data collection in paleoanthropology.

    Science.gov (United States)

    Reed, Denné; Barr, W Andrew; Mcpherron, Shannon P; Bobe, René; Geraads, Denis; Wynn, Jonathan G; Alemseged, Zeresenay

    2015-01-01

    Understanding patterns of human evolution across space and time requires synthesizing data collected by independent research teams, and this effort is part of a larger trend to develop cyber infrastructure and e-science initiatives. At present, paleoanthropology cannot easily answer basic questions about the total number of fossils and artifacts that have been discovered, or exactly how those items were collected. In this paper, we examine the methodological challenges to data integration, with the hope that mitigating the technical obstacles will further promote data sharing. At a minimum, data integration efforts must document what data exist and how the data were collected (discovery), after which we can begin standardizing data collection practices with the aim of achieving combined analyses (synthesis). This paper outlines a digital data collection system for paleoanthropology. We review the relevant data management principles for a general audience and supplement this with technical details drawn from over 15 years of paleontological and archeological field experience in Africa and Europe. The system outlined here emphasizes free open-source software (FOSS) solutions that work on multiple computer platforms; it builds on recent advances in open-source geospatial software and mobile computing.

  11. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    Science.gov (United States)

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network for geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard, for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate further such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.

  12. DataCollection Prototyping

    CERN Multimedia

    Beck, H.P.

    DataCollection is a subsystem of the Trigger, DAQ & DCS project responsible for the movement of event data from the ROS to the High Level Triggers. This includes data from Regions of Interest (RoIs) for Level 2, building complete events for the Event Filter and finally transferring accepted events to Mass Storage. It also handles passing the LVL1 RoI pointers and the allocation of Level 2 processors and load balancing of Event Building. During the last 18 months DataCollection has developed a common architecture for the hardware and software required. This involved a radical redesign integrating ideas from separate parts of earlier TDAQ work. An important milestone for this work, now achieved, has been to demonstrate this subsystem in the so-called Phase 2A Integrated Prototype. This prototype comprises the various TDAQ hardware and software components (ROSs, LVL2, etc.) under the control of the TDAQ Online software. The basic functionality has been demonstrated on small testbeds (~8-10 processing nodes)...

  13. Tracing where IoT data are collected and aggregated

    OpenAIRE

    Bodei, Chiara; Degano, Pierpaolo; Ferrari, Gian-Luigi; Galletta, Letterio

    2016-01-01

    The Internet of Things (IoT) offers the infrastructure of the information society. It hosts smart objects that automatically collect and exchange data of various kinds, directly gathered from sensors or generated by aggregations. Suitable coordination primitives and analysis mechanisms are in order to design and reason about IoT systems, and to intercept the implied technological shifts. We address these issues from a foundational point of view. To study them, we define IoT-LySa, a process ca...

  14. Automatic Age Estimation System for Face Images

    OpenAIRE

    Chin-Teng Lin; Dong-Lin Li; Jian-Hao Lai; Ming-Feng Han; Jyh-Yeong Chang

    2012-01-01

    Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in rea...

  15. Steam System Balancing and Tuning for Multifamily Residential Buildings in Chicagoland - Second Year of Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.; Ludwig, P.; Brand, L.

    2013-08-01

    Steam heated buildings often suffer from uneven heating as a result of poor control of the amount of steam entering each radiator. In order to satisfy the heating load to the coldest units, other units are overheated. As a result, some tenants complain of being too hot and open their windows in the middle of winter, while others complain of being too cold and are compelled to use supplemental heat sources. Building on previous research, CNT Energy identified 10 test buildings in Chicago and conducted a study to identify best practices for the methodology, typical costs, and energy savings associated with steam system balancing. A package of common steam balancing measures was assembled and data were collected on the buildings before and after these retrofits were installed to investigate the process, challenges, and the cost effectiveness of improving steam systems through improved venting and control systems. The test buildings that received venting upgrades and new control systems showed 10.2% savings on their natural gas heating load, with a simple payback of 5.1 years. The methodologies for and findings from this study are presented in detail in this report. This report has been updated from a version published in August 2012 to include natural gas usage information from the 2012 heating season and updated natural gas savings calculations.
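
The 5.1-year figure reported above is a simple payback: retrofit cost divided by annual savings. The dollar amounts in this sketch are hypothetical, chosen only to reproduce the arithmetic:

```python
def simple_payback(retrofit_cost, annual_savings):
    """Simple payback period in years: cost / annual savings
    (ignores discounting and energy-price escalation)."""
    return retrofit_cost / annual_savings

# Hypothetical numbers consistent with a 5.1-year payback:
cost, yearly_savings = 10200.0, 2000.0
print(simple_payback(cost, yearly_savings))  # -> 5.1
```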

  16. Work Zone Data Collection Trailer

    Data.gov (United States)

    Federal Laboratory Consortium — The Work Zone Data Collection Trailer was designed and constructed to enhance data collection and analysis capabilities for the "Evaluating Roadway Construction Work...

  17. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  18. Automatic registration method for mobile LiDAR data

    Science.gov (United States)

    Wang, Ruisheng; Ferrie, Frank P.

    2015-01-01

    We present an automatic mutual information (MI) registration method for mobile LiDAR and panoramas collected from a driving vehicle. The suitability of MI for registration of aerial LiDAR and aerial oblique images has been demonstrated under an assumption that minimization of joint entropy (JE) is a sufficient approximation of maximization of MI. We show that this assumption is invalid for the ground-level data. The entropy of a LiDAR image cannot be regarded as approximately constant for small perturbations. Instead of minimizing the JE, we directly maximize MI to estimate corrections of camera poses. Our method automatically registers mobile LiDAR with spherical panoramas over an approximate 4-km drive, and is the first example we are aware of that tests MI registration in a large-scale context.
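
The mutual information maximized in the registration above can be estimated from a joint intensity histogram. This is a generic plug-in estimator offered as a sketch; the bin count and the paper's exact estimator are not specified in the abstract:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Mutual information (nats) between two equally-sized images,
    estimated from their joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()                 # joint distribution
    px = p.sum(axis=1, keepdims=True)     # marginal of a
    py = p.sum(axis=0, keepdims=True)     # marginal of b
    nz = p > 0                            # avoid log(0)
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a, b = rng.random((32, 32)), rng.random((32, 32))
# An image shares far more information with itself than with an
# unrelated image -- the signal a registration search climbs.
print(mutual_information(a, a) > mutual_information(a, b))
```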

  19. An Automatic KANSEI Fuzzy Rule Creating System Using Thesaurus

    Science.gov (United States)

    Hotta, Hajime; Hagiwara, Masafumi

    In this paper, we propose an automatic Kansei fuzzy rule creating system using a thesaurus. In general, there are many words that express impressions. However, conventional approaches in Kansei engineering are not suited to using many impression words because it is difficult to collect enough data. The proposed system is an enhanced algorithm of the conventional method that the authors proposed before. The proposed system extracts fuzzy rules for the many words defined in the thesaurus dictionary, while the conventional one can extract rules only for words the user specified. The flow of the system consists of 3 steps: (1) construction of thesaurus networks; (2) data collection by web questionnaire sheets; (3) extraction of fuzzy rules. In order to extract Kansei fuzzy rules, the system employs an enhanced GRNN (general regression neural network) which can treat related words in the thesaurus network. Using a Japanese thesaurus dictionary in the experiments, sets of fuzzy rules for 1,195 impression words were extracted, and the fuzzy rules extracted by the proposed system obtained higher accuracy than those extracted by the conventional one.
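
The GRNN at the core of the rule-extraction step is, in its standard form, a Gaussian-kernel weighted average of training targets. A one-dimensional sketch of that standard GRNN (the smoothing parameter sigma is an arbitrary choice, and the paper's enhanced variant adds thesaurus handling not shown here):

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.5):
    """Standard GRNN prediction: Gaussian-kernel weighted average
    of the training targets, weighted by distance to the query."""
    d2 = (x_train - x_query) ** 2
    w = np.exp(-d2 / (2 * sigma ** 2))
    return float((w * y_train).sum() / w.sum())

x = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 2.0])
# Querying at a training point surrounded symmetrically returns its target:
print(grnn_predict(x, y, 1.0))
```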

  20. 49 CFR Appendix H to Part 40 - DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form

    Science.gov (United States)

    2010-10-01

    49 CFR Part 40, Appendix H—DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form (49 Transportation, 2010-10-01). Office of the...

  1. Robust indexing for automatic data collection

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2003-12-09

    We present improved methods for indexing diffraction patterns from macromolecular crystals. The novel procedures include a more robust way to verify the position of the incident X-ray beam on the detector, an algorithm to verify that the deduced lattice basis is consistent with the observations, and an alternative approach to identify the metric symmetry of the lattice. These methods help to correct failures commonly experienced during indexing, and increase the overall success rate of the process. Rapid indexing, without the need for visual inspection, will play an important role as beamlines at synchrotron sources prepare for high-throughput automation.
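
The step of verifying "that the deduced lattice basis is consistent with the observations" can be illustrated by checking what fraction of observed reciprocal-space vectors index to near-integer Miller triples under the candidate basis. This is a toy sketch of the consistency idea, not the authors' algorithm; the tolerance and example vectors are assumptions:

```python
import numpy as np

def fraction_indexed(basis, spots, tol=0.1):
    """Fraction of observed reciprocal-lattice vectors whose Miller
    indices (solving basis @ hkl = spot) are close to integers."""
    hkl = np.linalg.solve(basis, spots.T).T
    ok = np.all(np.abs(hkl - np.round(hkl)) < tol, axis=1)
    return float(ok.mean())

basis = np.diag([1.0, 2.0, 3.0])                # candidate lattice basis
good = np.array([[1, 0, 0], [0, 2, 0], [1, 2, 3]], float)  # index cleanly
bad = np.array([[0.5, 1.0, 1.6]])               # inconsistent observation
print(fraction_indexed(basis, np.vstack([good, bad])))  # -> 0.75
```

A low fraction signals the kind of indexing failure the improved procedures are designed to catch.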

  2. Overcoming Language and Literacy Barriers: Using Student Response System Technology to Collect Quality Program Evaluation Data from Immigrant Participants

    Science.gov (United States)

    Walker, Susan K.; Mao, Dung

    2016-01-01

    Student response system technology was employed for parenting education program evaluation data collection with Karen adults. The technology, with translation and use of an interpreter, provided an efficient and secure method that respected oral language and collective learning preferences and accommodated literacy needs. The method was popular…

  3. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available Equipment manufacturing is a strategic industry for any country, and its core component is the CNC machine tool. Strengthening independent research on CNC machine technology, especially open CNC systems, is therefore of great significance. This paper presents key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First, the integrated system architecture and the flexible, efficient workflow for performing the intelligent automatic turning process are illustrated. Second, innovative methods for workpiece feature recognition and expression and for NC machining process planning are put forward. Third, automatic cutting-tool selection and cutting-parameter optimization are generated by an integrated inference combining rule-based and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system shows that the presented solutions are valid, practical and efficient.

  4. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit-television optical automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera with 60,512 individual pixels as the light-sensing elements. Through conventional scanning, each pixel in the focal plane is sequentially read, the light-level signal digitized, and an 8-bit word transmitted to scratchpad memory. From memory, the microprocessor analyzes the digital signal and computes the tracking error. Finally, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to close the feedback tracking loop. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
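
    The error computation described above — scan a line of pixels, locate the seam, output a cross-seam correction — can be illustrated with a minimal sketch (the darkest-pixel heuristic and the millimetre-per-pixel scale are assumptions for illustration, not the system's actual algorithm):

```python
def seam_error(scanline, mm_per_pixel=0.05):
    """Estimate cross-seam tracking error from one digitized scanline:
    the seam gap shows up as the darkest pixel, and the error is its
    offset from the scanline centre, converted to millimetres."""
    seam = min(range(len(scanline)), key=scanline.__getitem__)
    centre = (len(scanline) - 1) / 2
    return (seam - centre) * mm_per_pixel

# 8-bit light levels from one scan; the dip at index 4 marks the seam.
line = [200, 198, 190, 185, 60, 199, 201]
print(seam_error(line))  # 0.05 mm off centre
```

    In the real system this error would be fed to the cross-seam actuator's motor controller each scan cycle to close the loop.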

  5. Revivification of Intelligence Data Collection System Based on SCM%基于单片机的智能数据采集系统

    Institute of Scientific and Technical Information of China (English)

    徐淑彦; 李世雄; 苏亦白

    2011-01-01

    With the rapid development of the economy and continuous advances in science and technology, intelligent data collection systems are widely used in industrial production and scientific research. In the information age, data and information have become important resources, and data collection systems further enable man-machine interaction and automatic detection and control of equipment, providing convenience for modern industrial production. This article examines the necessity of research on intelligent data collection systems based on SCM (single-chip microcomputer), and expounds the key design points and concrete methods, so as to contribute to the reform and innovation of such systems.

  6. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 Aeronautics and Space 1 2010-01-01. § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can—(1) Be quickly and...

  7. Soft design for automatic detection system of flight test data%飞行试验数据自检测系统软件设计

    Institute of Scientific and Technical Information of China (English)

    许应康; 彭国金; 刘威

    2015-01-01

    During flight tests, the lack of rapid checking and fault localization for the huge volumes of raw flight-test data and preprocessed result data sharply increases the difficulty of data analysis for flight-test engineers and hampers the efficient execution of flight-test missions. To address this problem, automatic detection of flight-test data was studied, and software for an automatic detection system for raw data and preprocessed result data, based on an expert database of parameter information, was designed. The software checks raw data automatically and, according to the user-defined criteria in the expert database, automatically checks and processes the preprocessed result data. Testing and application show that the software effectively handles data exceptions and errors in raw and preprocessed data and improves the efficiency with which flight-test engineers analyze massive flight-test data.

  8. Automatic Age Estimation System for Face Images

    Directory of Open Access Journals (Sweden)

    Chin-Teng Lin

    2012-11-01

    Full Text Available Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in real‐time. This means that the proposed system has more potential in applications compared to other semi‐automatic systems. The results obtained from this novel approach could provide clearer insight for operators in the field of age estimation to develop real‐world applications.

  9. Refinements to the Boolean approach to automatic data editing

    Energy Technology Data Exchange (ETDEWEB)

    Liepins, G.E.

    1980-09-01

    Automatic data editing consists of three components: identification of erroneous records, identification of the most likely erroneous fields within an erroneous record (the fields to impute), and assignment of acceptable values to failing records. The types of data considered fall naturally into three categories: coded (categorical) data, continuous data, and mixed data (both coded and continuous). For coded data, a natural way to approach automatic editing is commonly referred to as the Boolean approach, first developed by Fellegi and Holt. Central to the operation of the Fellegi-Holt approach to the fields-to-impute problem is the explicit recognition of certain implied edits. Fellegi and Holt originally required a complete set of edits, and their algorithm for generating this complete set has occasionally had the distinct disadvantage of failing to converge in reasonable time. The primary result of this paper is an algorithm that significantly prunes the Fellegi-Holt edit generation process yet nonetheless generates a collection of implied edits sufficient for solving the fields-to-impute problem. 3 figures.
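
    The implied-edit generation at the heart of the Fellegi-Holt approach can be sketched for categorical edits in normal form (field → set of failing values); the fields and edits below are hypothetical toy data:

```python
def implied_edit(e1, e2, g, domains):
    """Generate the edit implied by edits e1 and e2 on generating field g.
    An edit fails a record when, for every field f, record[f] is in the
    edit's value set for f. Returns None if no non-vacuous edit results."""
    if e1[g] | e2[g] != domains[g]:
        return None          # field g does not drop out: no implied edit
    new = {}
    for f in domains:
        if f == g:
            new[f] = domains[f]          # g becomes unconstrained
        else:
            new[f] = e1[f] & e2[f]
            if not new[f]:
                return None  # empty entry: the implied edit is vacuous
    return new

# Hypothetical edits: "child & married" fails, "adult & married" fails,
# so "married" fails regardless of age.
domains = {"age": {"child", "adult"}, "status": {"single", "married"}}
e1 = {"age": {"child"}, "status": {"married"}}
e2 = {"age": {"adult"}, "status": {"married"}}
print(implied_edit(e1, e2, "age", domains))
```

    The paper's contribution is pruning which pairs (e1, e2, g) this generation step must be applied to while keeping the resulting edit set sufficient.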

  10. A Bottom-Up Approach for Automatically Grouping Sensor Data Layers by their Observed Property

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-01-01

    Full Text Available The Sensor Web is a growing phenomenon where an increasing number of sensors are collecting data in the physical world, to be made available over the Internet. To help realize the Sensor Web, the Open Geospatial Consortium (OGC has developed open standards to standardize the communication protocols for sharing sensor data. Spatial Data Infrastructures (SDIs are systems that have been developed to access, process, and visualize geospatial data from heterogeneous sources, and SDIs can be designed specifically for the Sensor Web. However, there are problems with interoperability associated with a lack of standardized naming, even with data collected using the same open standard. The objective of this research is to automatically group similar sensor data layers. We propose a methodology to automatically group similar sensor data layers based on the phenomenon they measure. Our methodology is based on a unique bottom-up approach that uses text processing, approximate string matching, and semantic string matching of data layers. We use WordNet as a lexical database to compute word pair similarities and derive a set-based dissimilarity function using those scores. Two approaches are taken to group data layers: mapping is defined between all the data layers, and clustering is performed to group similar data layers. We evaluate the results of our methodology.
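
    The set-based dissimilarity derived from word-pair similarity scores can be sketched as follows (the similarity table stands in for WordNet-derived scores, and the symmetric best-match averaging is an assumption for illustration):

```python
def set_dissimilarity(words_a, words_b, sim):
    """Set-based dissimilarity between two data-layer name token sets:
    1 minus the mean best-match similarity, averaged in both directions."""
    def best_match_mean(src, dst):
        return sum(max(sim(w, v) for v in dst) for w in src) / len(src)
    score = 0.5 * (best_match_mean(words_a, words_b)
                   + best_match_mean(words_b, words_a))
    return 1.0 - score

# Hypothetical pairwise scores standing in for WordNet similarities.
SIM = {("air", "atmosphere"): 0.9, ("temperature", "temp"): 1.0}
def sim(a, b):
    if a == b:
        return 1.0
    return SIM.get((a, b)) or SIM.get((b, a)) or 0.0

layer1 = {"air", "temperature"}
layer2 = {"atmosphere", "temp"}
print(set_dissimilarity(layer1, layer2, sim))  # close to 0 => same phenomenon
```

    Clustering the layers then amounts to running any standard algorithm over the resulting dissimilarity matrix.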

  11. Automatic Data Normalization and Parameterization for Optical Motion Tracking

    Directory of Open Access Journals (Sweden)

    Leif Kobbelt

    2006-09-01

    Full Text Available Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capturing especially for dynamic VR-environments with real time requirements. To solve these problems, we present two additional, fast and automatic processing stages based on our motion capture pipeline presented in [ HSK05 ]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method does not restrict the placement of marker bodies nor the recording setup, and only requires a short calibration phase.

  12. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  13. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple t...

  14. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    2004-01-01

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitte

  15. Feedback Improvement in Automatic Program Evaluation Systems

    Science.gov (United States)

    Skupas, Bronius

    2010-01-01

    Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams and contest systems. However, use of automated program evaluation encounters problems: some evaluations are not clear for the students and the system messages do not show reasons for lost points.…

  16. Automatic contour welder incorporates speed control system

    Science.gov (United States)

    Wall, W. A., Jr.

    1968-01-01

    Speed control system maintains the welding torch of an automatic welder at a substantially constant speed. The system is particularly useful when welding contoured or unusually shaped surfaces, which cause the distance from the work surface to the weld carriage to vary in a random manner.

  17. Automatic solar lamp intensity control system

    Science.gov (United States)

    Leverone, H.; Mandell, N.

    1968-01-01

    A system was developed that places solar cells directly in the path of the radiation incident on the test volume and uses a dc bridge-null circuit. A solar cell is affixed to a heat sink mounted on each of three arms for each solar lamp. Control of the radiation from the solar lamps is automatic.

  18. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Full Text Available Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. With the development of speech technology, there is now increased interest in expert speaker verification systems that offer high reliability and low labor intensity thanks to automated data processing for expert analysis. System Description. We present a novel system that analyzes the similarity or distinction of speaker voices by comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed fusion-based system is the weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. An advantage of the system is that recordings can be analyzed rapidly, since data preprocessing and decision making are automated. We describe the individual methods as well as the fusion scheme that combines their decisions. Main Results. We tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results show that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.

  19. Tightly integrated single- and multi-crystal data collection strategy calculation and parallelized data processing in JBluIce beamline control system.

    Science.gov (United States)

    Pothineni, Sudhir Babu; Venugopalan, Nagarajan; Ogata, Craig M; Hilgart, Mark C; Stepanov, Sergey; Sanishvili, Ruslan; Becker, Michael; Winter, Graeme; Sauter, Nicholas K; Smith, Janet L; Fischetti, Robert F

    2014-12-01

    The calculation of single- and multi-crystal data collection strategies and a data processing pipeline have been tightly integrated into the macromolecular crystallographic data acquisition and beamline control software JBluIce. Both tasks employ wrapper scripts around existing crystallographic software. JBluIce executes scripts through a distributed resource management system to make efficient use of all available computing resources through parallel processing. The JBluIce single-crystal data collection strategy feature uses a choice of strategy programs to help users rank sample crystals and collect data. The strategy results can be conveniently exported to a data collection run. The JBluIce multi-crystal strategy feature calculates a collection strategy to optimize coverage of reciprocal space in cases where incomplete data are available from previous samples. The JBluIce data processing runs simultaneously with data collection using a choice of data reduction wrappers for integration and scaling of newly collected data, with an option for merging with pre-existing data. Data are processed separately if collected from multiple sites on a crystal or from multiple crystals, then scaled and merged. Results from all strategy and processing calculations are displayed in relevant tabs of JBluIce.

  20. Hindi Digits Recognition System on Speech Data Collected in Different Natural Noise Environments

    OpenAIRE

    2015-01-01

    This paper presents a baseline digits speech recognizer for the Hindi language. The recording environment is different for each speaker, since the data is collected in their respective homes. The different environments include vehicle horn noises in road-facing rooms, internal background noises in some rooms such as opening doors, silence in other rooms, etc. All these recordings are used for training acoustic m...

  1. Algorithm of Automatic Recommender System Based on Data Mining%基于数据挖掘的自动化推荐系统算法

    Institute of Scientific and Technical Information of China (English)

    朱文忠

    2012-01-01

    A typical online recommendation system is described. Using an ART (adaptive resonance theory) neural network together with data mining techniques, it can automatically cluster population characteristics and mine associated features. For recommendation systems deployed on the web, the paper discusses how to effectively use data mining techniques to extract complete knowledge from large databases and recommend appropriate information to users, helping them find genuinely needed and useful documents or information in the vast information flow. A scheme integrating the ART neural network with data mining is put forward, and a new modified ART algorithm (MART) tailored to the characteristics of recommendation systems is proposed. An example verifies the effectiveness of the algorithm.
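
    The ART-style clustering underlying such a recommender can be sketched with a minimal ART1 implementation for binary feature vectors (this is the textbook algorithm, not the paper's modified MART variant; parameters and data are illustrative):

```python
def art1_cluster(patterns, rho=0.7, beta=0.5):
    """Minimal ART1 clustering for binary vectors: each input either
    resonates with an existing prototype (passing the vigilance test)
    or starts a new category."""
    prototypes = []   # learned binary prototype per category
    labels = []
    for p in patterns:
        # Rank categories by the choice function |p AND w| / (beta + |w|).
        order = sorted(
            range(len(prototypes)),
            key=lambda j: -sum(a & b for a, b in zip(p, prototypes[j]))
                          / (beta + sum(prototypes[j])),
        )
        for j in order:
            overlap = [a & b for a, b in zip(p, prototypes[j])]
            # Vigilance test: the match must cover enough of the input.
            if sum(overlap) / max(sum(p), 1) >= rho:
                prototypes[j] = overlap      # fast learning: w <- p AND w
                labels.append(j)
                break
        else:
            prototypes.append(list(p))       # no resonance: new category
            labels.append(len(prototypes) - 1)
    return labels, prototypes

# Toy user-preference vectors: the first two cluster together.
data = [[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1]]
labels, protos = art1_cluster(data, rho=0.6)
print(labels)  # [0, 0, 1]
```

    The vigilance parameter `rho` controls cluster granularity: raising it splits users into more, tighter preference groups.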

  2. From Automatic to Adaptive Data Acquisition

    DEFF Research Database (Denmark)

    Chang, Marcus

    2009-01-01

    Sensornets have been used for ecological monitoring for the past decade, yet the main driving force behind these deployments is still computer scientists. The denser sampling and added modalities offered by sensornets could drive these fields in new directions, but not until the domain scientists become familiar with sensornets and use them as any other instrument in their toolbox. We explore three different directions in which sensornets can become easier to deploy, collect data of higher quality, and offer more flexibility, and we postulate that sensornets should be instruments for domain scientists... To increase the flexibility of sensornets and reduce the complexity for the domain scientist, we developed an AI-based controller to act as a proxy between the scientist and the sensornet. This controller is driven by the scientist's requirements on the collected data, and uses adaptive sampling in order to reach these goals.

  3. Water quality data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP), 1996 - 1998 (NCEI Accession 0000789)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality data in 22 reserves in the United States and...

  4. Water quality, meteorological, and nutrient data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP), 1994 - 2005 (NCEI Accession 0019215)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 25...

  5. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850...

  6. 75 FR 27001 - Comment Request for Information Collection for the SCSEP Data Collection System, OMB Control No...

    Science.gov (United States)

    2010-05-13

    From the Federal Register Online via the Government Publishing Office, DEPARTMENT OF LABOR... (Pub. L. 109-365) and the Jobs for Veterans Act of 2002 (Pub. L. 107-288); changes in overall burden for... the proposed information collection request (ICR) can be obtained by contacting the office...

  7. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems for agricultural robots used in field work. The existing solutions in this area are analyzed. The article proposes an original solution which is easy to implement and is characterized by high-accuracy positioning.

  8. Automatic Road Sign Inventory Using Mobile Mapping Systems

    Science.gov (United States)

    Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.

    2016-06-01

    The periodic inspection of certain infrastructure features plays a key role for road network safety and preservation, and for developing optimal maintenance planning that minimize the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology in order to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As point cloud resolution is insufficient, RGB imagery is used projecting the 3D points in the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95%, and F-score greater than 90%. In this way, inventory data is obtained in a fast, reliable manner, and it can be applied to improve the maintenance planning of the road network, or to feed a Spatial Information System (SIS), thus, road sign information can be available to be used in a Smart City context.
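
    Projecting the detected 3-D sign points into the synchronized RGB imagery to define a bounding box, as the methodology describes, reduces to a camera projection; a minimal pinhole-model sketch (focal length, principal point and corner coordinates are hypothetical):

```python
def project_bbox(points, f, cx=0.0, cy=0.0):
    """Project 3-D points (camera frame, z pointing forward) through a
    pinhole model and return the 2-D bounding box of the projections."""
    pix = [(f * x / z + cx, f * y / z + cy) for x, y, z in points]
    us = [u for u, _ in pix]
    vs = [v for _, v in pix]
    return (min(us), min(vs), max(us), max(vs))

# Hypothetical sign corners 5-10 m in front of a camera with f = 100.
corners = [(1.0, 1.0, 10.0), (-1.0, -1.0, 10.0), (0.5, -0.5, 5.0)]
print(project_bbox(corners, f=100.0))  # (-10.0, -10.0, 10.0, 10.0)
```

    The RGB pixels inside the returned box are then analysed to assign semantic content to the detected sign; a real MMS would additionally apply lens distortion and the scanner-to-camera extrinsic calibration.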

  9. Automatic multidiagnosis system for slit lamp

    Science.gov (United States)

    Ventura, Liliane; Chiaradia, Caio; Vieira Messias, Andre M.; Faria de Sousa, Sidney J.; Isaac, Flavio; Caetano, Cesar A. C.; Rosa Filho, Andre B.

    2001-06-01

    We have developed a system providing several automatic diagnostic functions for the slit lamp, adding four measurements to the biomicroscope: (1) counting of the endothelial cells of donated corneas; (2) automatic keratometry; (3) corneal ulcer evaluation; (4) measurement of linear distances and areas in the ocular image. The system consists of a slit lamp, a beam splitter, optical components, a CCD detector, a frame grabber and a PC. The optical components attached to the beam splitter are the same for all functions except (1). For function (1), we developed an optical system that magnifies the image 290X and software that counts the cells interactively and automatically; results are in good agreement with commercial specular microscopes (correlation coefficient 0.98081). The automatic keratometry function can measure cylinders over 30 D as well as irregular astigmatisms. It projects a light ring onto the patient's cornea, and analysis of the deformation of the ring provides the radius of curvature as well as the axis of the astigmatism. The nominal precision is 0.005 mm for the curvature radius and 2 degrees for the axis component; results are in good agreement with commercial systems (correlation coefficient 0.99347). For function (3), the ulcer is isolated in the usual clinical way and the image of the green area is automatically detected by the developed software in order to evaluate the evolution of the disease. Function (4) simply allows the clinician to make any linear or area measurement in the ocular image. The system is low-cost multi-evaluation equipment and is being used in a public hospital in Brazil.

  10. Automatic systems win; Siegeszug der Automaten

    Energy Technology Data Exchange (ETDEWEB)

    Sorg, M

    2001-07-01

    This short article presents figures on the increasing use of modern, automatic wood-fired heating systems in Switzerland that are not only replacing older installations but also starting to replace other forms of heating. The increase of the number of wood-based heating systems installed and the amount of wood used in them is discussed, as are developments in the market for large-scale wood-based heating systems.

  11. 29 CFR 42.21 - Data collection.

    Science.gov (United States)

    2010-07-01

    29 Labor 1 2010-07-01. § 42.21 Data collection (Office of the Secretary of Labor, Coordinated Enforcement). (a) For each protective statute, ESA... completed based on complaints. (g) The National Committee shall review the data collection systems of...

  12. Precision laser automatic tracking system.

    Science.gov (United States)

    Lucy, R F; Peters, C J; McGann, E J; Lang, K T

    1966-04-01

    A precision laser tracker has been constructed and tested that is capable of tracking a low-acceleration target to an accuracy of about 25 microrad root mean square. In tracking high-acceleration targets, the error is directly proportional to the angular acceleration. For an angular acceleration of 0.6 rad/sec(2), the measured tracking error was about 0.1 mrad. The basic components in this tracker, similar in configuration to a heliostat, are a laser and an image dissector, which are mounted on a stationary frame, and a servocontrolled tracking mirror. The daytime sensitivity of this system is approximately 3 x 10(-10) W/m(2); the ultimate nighttime sensitivity is approximately 3 x 10(-14) W/m(2). Experimental tests were performed to evaluate both dynamic characteristics of this system and the system sensitivity. Dynamic performance of the system was obtained, using a small rocket covered with retroreflective material launched at an acceleration of about 13 g at a point 204 m from the tracker. The daytime sensitivity of the system was checked, using an efficient retroreflector mounted on a light aircraft. This aircraft was tracked out to a maximum range of 15 km, which checked the daytime sensitivity of the system measured by other means. The system also has been used to track passively stars and the Echo I satellite. Also, the system tracked passively a +7.5 magnitude star, and the signal-to-noise ratio in this experiment indicates that it should be possible to track a + 12.5 magnitude star.

  13. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides the basis for a pre-operational automatic pollen season monitoring campaign performed in parallel with manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method for total pollen, and the automatic device provides real-time data reliably (one week of interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling rate of the automatic device makes it possible to go beyond the commonly presented daily values, and we obtain statistically significant hourly concentrations. Finally, we discuss the remaining challenges in obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.

  15. [Automatic analysis pipeline of next-generation sequencing data].

    Science.gov (United States)

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although much software exists for analyzing next-generation sequencing data, most tools are designed for one specific function (e.g., alignment, variant calling or annotation). It is therefore necessary to combine them for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (FASTQ format) as input, calls standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can analyze further. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking through the graphical interface. Our work will facilitate research projects using sequencing technology.
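
    A pipeline of this shape — align, sort, call variants, annotate — can be sketched by assembling the tool invocations in order (the exact flags vary by tool version, the file names are hypothetical, and the commands here are only built, not executed):

```python
def build_pipeline(sample, ref="ref.fa"):
    """Assemble the ordered shell commands for one sample:
    alignment (BWA), sorting (Samtools), variant calling (GATK),
    and annotation (Annovar)."""
    bam = f"{sample}.sorted.bam"
    vcf = f"{sample}.vcf"
    return [
        f"bwa mem {ref} {sample}.fastq > {sample}.sam",
        f"samtools sort -o {bam} {sample}.sam",
        f"gatk HaplotypeCaller -R {ref} -I {bam} -O {vcf}",
        f"table_annovar.pl {vcf} humandb/ -out {sample}",
    ]

for cmd in build_pipeline("sampleA"):
    print(cmd)
```

    A driver would submit each stage to SGE (e.g. via `qsub`) with dependencies between stages, which is where the parallel computation across samples comes from.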

  16. Taxing the cloud: introducing a new taxation system on data collection?

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2013-05-01

    Full Text Available Cloud computing services are increasingly hosted on international servers and distributed amongst multiple data centres. Given their global scope, it is often easier for large multinational corporations to effectively circumvent old taxation schemes designed around the concept of territorial jurisdiction and geographical settings. In view of obtaining tax revenues from these online operators whose business is partially carried out in France, the French government recently issued a report emphasising the need for new taxation rules that would better comply with the way value is generated in the digital economy: at the international level, it is suggested that taxation should be calculated according to the place of interaction with end-users; at the national level, the report suggests introducing a transitory tax on data collection in order to promote innovation and encourage good online practices.

  17. Gamma-ray spectrometry data collection and reduction by simple computing systems.

    Science.gov (United States)

    Op de Beeck, J

    1975-12-01

    The review summarizes the present state of the involvement of relatively small computing devices in the collection and processing of gamma-ray spectrum data. An economic and utilitarian point of view has been chosen with regard to data collection in order to arrive at practically valuable conclusions in terms of feasibility of possible configurations with respect to their eventual application. A unified point of view has been adopted with regard to data processing by developing an information theoretical approach on a more or less intuitive level in an attempt to remove the largest part of the virtual disparity between the several processing methods described in the literature. A synoptical introduction to the most important mathematical methods has been incorporated, together with a detailed theoretical description of the concept gamma-ray spectrum. In accordance with modern requirements, the discussions are mainly oriented towards high-resolution semiconductor detector-type spectra. The critical evaluation of the processing methods reviewed is done with respect to a set of predefined criteria. Smoothing, peak detection, peak intensity determination, overlapping peak resolving and detection and upper limits are discussed in great detail. A preferred spectrum analysis method combining powerful data reduction properties with extreme simplicity and speed of operation is suggested. The general discussion is heavily oriented towards activation analysis application, but other disciplines making use of gamma-ray spectrometry will find the material presented equally useful. Final conclusions are given pointing to future developments and shifting their centre of gravity towards improving the quality of the measurements rather than expanding the use of tedious and sophisticated mathematical techniques requiring the limits of available computational power.
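
The smoothing and peak-detection steps discussed above can be illustrated with a deliberately simple sketch: a centered moving average followed by a local-maximum test. Production spectrum analysis uses far more careful statistics; the window and threshold here are arbitrary assumptions.

```python
# Illustrative sketch (not a method from the review): smooth a gamma-ray
# spectrum with a moving average, then flag channels that are local maxima
# of the smoothed spectrum above a simple count threshold.

def smooth(counts, window=3):
    """Centered moving average; `window` should be odd."""
    half = window // 2
    out = []
    for i in range(len(counts)):
        lo, hi = max(0, i - half), min(len(counts), i + half + 1)
        out.append(sum(counts[lo:hi]) / (hi - lo))
    return out

def find_peaks(counts, threshold=5.0):
    """Return channel indices of smoothed local maxima above `threshold`."""
    s = smooth(counts)
    return [i for i in range(1, len(s) - 1)
            if s[i] > s[i - 1] and s[i] >= s[i + 1] and s[i] > threshold]

spectrum = [1, 2, 1, 2, 30, 40, 30, 2, 1, 2, 1]
peaks = find_peaks(spectrum)   # one photopeak around channel 5
```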

  18. 34 CFR 303.540 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.540 Section 303.540 Education... DISABILITIES State Administration Reporting Requirements § 303.540 Data collection. (a) Each system must include the procedures that the State uses to compile data on the statewide system. The procedures...

  19. Automatic acquisition and classification system for agricultural network information based on Web data%基于Web数据的农业网络信息自动采集与分类系统

    Institute of Scientific and Technical Information of China (English)

    段青玲; 魏芳芳; 张磊; 肖晓琰

    2016-01-01

    The purpose of this study is to obtain agricultural web information efficiently and to provide users with personalized services through the integration of agricultural resources scattered across different sites and the fusion of heterogeneous environmental data. The research in this paper improves several key information technologies: agricultural web data acquisition and extraction, text classification based on support vector machines (SVM), and heterogeneous data collection based on the Internet of things (IOT). We first add quality target seed sites into the system and record each website's URL (uniform resource locator) and category information. The web crawler program saves the original pages; de-noised web pages are then obtained through an HTML parser and regular expressions that create custom Node Filter objects. The system builds a document object model (DOM) tree before digging out the data area. According to filtering rules, the target data area can be identified among multiple data regions with repeated patterns, and the structured data can be extracted after property segmentation. Secondly, we construct a linear SVM classification model to classify agricultural text automatically. The procedure includes four steps: use the segmentation tool ICTCLAS to carry out word segmentation and part-of-speech (POS) tagging; choose feature words by combining an agricultural keyword dictionary with a document-frequency adjustment rule; build a feature vector and calculate the inverse document frequency (IDF) weight for each feature word; and design an adaptive SVM classifier. Finally, the perception data of different formats collected by the sensors are transmitted over a wireless sensor network to a designated server as source data. A relational database complying with the specified acquisition frequency can be populated through data conversion and data filtering. The key step of
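
The IDF weighting step mentioned above can be sketched as follows. The toy corpus and the plain log(N/df) formula are illustrative assumptions; word segmentation and the SVM itself are out of scope, so word lists stand in for segmented text.

```python
import math

# Hedged sketch of inverse-document-frequency weighting: map each feature
# word to log(N / df), where df counts the documents containing it.

def idf(corpus):
    """corpus: list of token lists. Returns {word: idf weight}."""
    n = len(corpus)
    df = {}
    for doc in corpus:
        for word in set(doc):          # count each word once per document
            df[word] = df.get(word, 0) + 1
    return {w: math.log(n / c) for w, c in df.items()}

docs = [["wheat", "price"], ["wheat", "pest"], ["pest", "control"]]
weights = idf(docs)
```

Words appearing in fewer documents get larger weights, which is why IDF helps the classifier discount ubiquitous terms.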

  20. Video Analytics Algorithm for Automatic Vehicle Classification (Intelligent Transport System

    Directory of Open Access Journals (Sweden)

    Arta Iftikhar

    2013-04-01

    Full Text Available Automated vehicle detection and classification is an important component of intelligent transport systems. Owing to its significance in fields such as traffic-accident avoidance, toll collection, congestion avoidance, monitoring of terrorist activities, and security and surveillance systems, the intelligent transport system has become an important field of study. Various technologies have been used for detecting and classifying vehicles automatically. Automated vehicle detection is broadly divided into two types: hardware-based and software-based detection. Various algorithms have been implemented to classify different vehicles from videos. In this paper an efficient and economical solution for automatic vehicle detection and classification is proposed. The proposed system first isolates the object through background subtraction, followed by vehicle detection using ontology. Vehicle detection is based on low-level features such as shape, size, and spatial location. Finally, the system classifies each vehicle into one of the known vehicle classes based on size.
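
A minimal sketch of the two stages named above: background subtraction followed by size-based classification. The pixel threshold, area cut-offs and class names are invented for illustration, not taken from the paper.

```python
# Simplified sketch: background subtraction by absolute differencing on a
# grayscale grid, then labeling by foreground area. Real systems also use
# shape and spatial location, which this toy version ignores.

def foreground_area(background, frame, threshold=30):
    """Count pixels whose intensity changed by more than `threshold`."""
    return sum(1 for bg_row, fr_row in zip(background, frame)
                 for bg, fr in zip(bg_row, fr_row)
                 if abs(fr - bg) > threshold)

def classify(area):
    """Map foreground area (pixels) to an illustrative vehicle class."""
    if area < 50:
        return "motorcycle"
    if area < 200:
        return "car"
    return "truck"

background = [[10] * 20 for _ in range(20)]
frame = [row[:] for row in background]
for r in range(5, 15):            # a 10x12 bright blob enters the scene
    for c in range(4, 16):
        frame[r][c] = 200

label = classify(foreground_area(background, frame))
```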

  1. AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA

    Science.gov (United States)

    Cheeseman, P. C.

    1994-01-01

    The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5
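
The class-membership idea can be illustrated with a one-attribute toy example: given class priors and Gaussian attribute models, normalize the joint probabilities to obtain each object's membership probability in each class. The numbers are invented, and this is not AUTOCLASS's actual model function.

```python
import math

# Toy Bayesian membership calculation: classes are described by probability
# distributions over an attribute, and an object gets a (soft) membership
# probability in each class rather than an absolute partition.

def gaussian(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def memberships(x, classes):
    """classes: list of (prior, mean, sd). Returns normalized posteriors."""
    joint = [prior * gaussian(x, mean, sd) for prior, mean, sd in classes]
    total = sum(joint)
    return [j / total for j in joint]

# An observation at x=1.0 with two equally likely classes centered at 0 and 4:
post = memberships(1.0, [(0.5, 0.0, 1.0), (0.5, 4.0, 1.0)])
```

The soft posteriors (rather than a hard assignment) are what makes this style of classification "more intuitive" than absolute partitioning.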

  2. Automatic oscillator frequency control system

    Science.gov (United States)

    Smith, S. F. (Inventor)

    1985-01-01

    A frequency control system makes an initial correction of the frequency of its own timing circuit after comparison against a frequency of known accuracy and then sequentially checks and corrects the frequencies of several voltage controlled local oscillator circuits. The timing circuit initiates the machine cycles of a central processing unit which applies a frequency index to an input register in a modulo-sum frequency divider stage and enables a multiplexer to clock an accumulator register in the divider stage with a cyclical signal derived from the oscillator circuit being checked. Upon expiration of the interval, the processing unit compares the remainder held as the contents of the accumulator against a stored zero error constant and applies an appropriate correction word to a correction stage to shift the frequency of the oscillator being checked. A signal from the accumulator register may be used to drive a phase plane ROM and, with periodic shifts in the applied frequency index, to provide frequency shift keying of the resultant output signal. Interposition of a phase adder between the accumulator register and phase plane ROM permits phase shift keying of the output signal by periodic variation in the value of a phase index applied to one input of the phase adder.
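
The modulo-sum divider stage described above behaves like a standard phase accumulator: each machine cycle adds the frequency index, and each wrap of the modulus marks one output cycle, so the output frequency is f_clock x index / modulus. Below is a sketch under that reading, not the patent's exact circuit.

```python
# Phase-accumulator sketch of a modulo-sum frequency divider: the accumulator
# register adds the frequency index every clock tick; a wrap past the modulus
# counts as one output cycle.

def count_output_cycles(index, modulus, ticks):
    acc, cycles = 0, 0
    for _ in range(ticks):
        acc += index
        if acc >= modulus:       # wrap-around marks one output cycle
            acc -= modulus
            cycles += 1
    return cycles

# index/modulus = 3/10, so 100 clock ticks yield 30 output cycles.
cycles = count_output_cycles(index=3, modulus=10, ticks=100)
```

The residual accumulator value at the end of an interval is the "remainder" the processing unit compares against its zero-error constant to derive a frequency correction.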

  3. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which steers the robot to navigate along the inspection path precisely. We expect our system to contribute to reduced inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be practically used for inspection work after field tests. (author)

  4. Ground-penetrating radar and differential global positioning system data collected from Long Beach Island, New Jersey, April 2015

    Science.gov (United States)

    Zaremba, Nicholas J.; Smith, Kathryn E.L.; Bishop, James M.; Smith, Christopher G.

    2016-08-04

    Scientists from the United States Geological Survey, St. Petersburg Coastal and Marine Science Center, U.S. Geological Survey Pacific Coastal and Marine Science Center, and students from the University of Hawaii at Manoa collected sediment cores, sediment surface grab samples, ground-penetrating radar (GPR) and Differential Global Positioning System (DGPS) data from within the Edwin B. Forsythe National Wildlife Refuge–Holgate Unit located on the southern end of Long Beach Island, New Jersey, in April 2015 (FAN 2015-611-FA). The study’s objective was to identify washover deposits in the stratigraphic record to aid in understanding barrier island evolution. This report is an archive of GPR and DGPS data collected from Long Beach Island in 2015. Data products, including raw GPR and processed DGPS data, elevation corrected GPR profiles, and accompanying Federal Geographic Data Committee metadata can be downloaded from the Data Downloads page.

  5. DATA INFORMATION SYSTEM TO PROMOTE THE ORGANIZATION OF DATA COLLECTIONS – MODELING CONSIDERATIONS BY THE UNIFIED MODELING LANGUAGE (UML)

    Directory of Open Access Journals (Sweden)

    Eduardo Batista de Moraes Barbosa

    2011-05-01

    Full Text Available It can be argued that technological developments (e.g., measuring instruments, software, satellites and computers), as well as the cheapening of storage media, allow organizations to produce and acquire a great amount of data in a short time. Due to the data volume, research organizations become potentially vulnerable to the impacts of the information explosion. One adopted solution is the use of information system tools to assist data documentation, retrieval and analysis. In the scientific scope, these tools are developed to store different metadata (data about data) patterns. During the development of these tools, the adoption of standards such as the Unified Modeling Language (UML) stands out, whose diagrams assist the different scopes of software modeling. The objective of this study is to present an information system tool that assists organizations in data documentation through the use of metadata, and to highlight the software modeling process through UML. The Standard for Digital Geospatial Metadata, widely used for dataset cataloging by scientific organizations around the world, will be discussed, along with dynamic and static UML diagrams such as use cases, sequence diagrams and class diagrams. The development of information system tools can be a way to promote the organization and dissemination of scientific data. However, the modeling process requires special attention during the development of the interfaces that will stimulate the use of these tools.

  6. Automatic TLI recognition system, user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report describes how to use an automatic target recognition system (version 14). In separate volumes are a general description of the ATR system, Automatic TLI Recognition System, General Description, and a programmer's manual, Automatic TLI Recognition System, Programmer's Guide.

  7. 远程自动抄表系统及其效益分析%LONG-DISTANCE AUTOMATIC METER-READING SYSTEM AND ITS BENEFIT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    覃达

    2000-01-01

    With social and economic development, electricity use by residents has become ever more widespread and consumption is growing rapidly; the traditional practice of collecting meter readings and electricity fees manually can no longer meet the demands of modern management. We therefore developed a long-distance automatic meter-reading system based on the technical principle of joining long-distance data communication with power-line carrier communication within the station area. The system is full-featured, convenient to operate, and easy to install. After more than a year of operation it has proven stable, with timely, accurate and reliable billing, bringing remarkable economic benefits. This paper introduces the design concept of the system, the working principle of each unit, and its technical performance and characteristics.

  8. Intelligent Management System of Power Network Information Collection Under Big Data Storage

    Directory of Open Access Journals (Sweden)

    Qin Yingying

    2017-01-01

    Full Text Available With the development of economy and society, big data storage in enterprise management has become a problem that cannot be ignored. How to manage data and optimize the allocation of tasks better is an important factor in the sustainable development of an enterprise. Intelligent management of enterprise information has become a hot spot of management mode and concept in the information age, presenting information to business managers in a more efficient, lower-cost, and global form. The system uses the SG-UAP development tools based on the Eclipse development environment, runs on the Windows operating system, uses Oracle as the database development platform, and Tomcat as the application server for network information services. The system adopts an SOA service-oriented architecture, provides RESTful-style services, uses HTTP(S) as the communication protocol, and JSON as the data format. The system is divided into two parts, the front-end and the back-end, and implements functions such as user login, registration, password retrieval, internal personnel information management, and internal data display.

  9. A Framework for Detecting Fraudulent Activities in EDO State Tax Collection System Using Investigative Data Mining

    Directory of Open Access Journals (Sweden)

    Okoro F. M

    2016-05-01

    Full Text Available Edo State Inland Revenue Services is overwhelmed with gigabytes of disk capacity containing data about taxpayers in the state. The data stored on the database increases in size at an alarming rate. This has resulted in a data-rich but information-poor situation, with a widening gap between the explosive growth of data and its types and the ability to analyze and interpret it effectively; hence the need for a new generation of automated and intelligent tools and techniques, known as investigative data mining, to look for patterns in data. These patterns can lead to new insights, competitive advantages for business, and tangible benefits for the State Revenue Services. This research work focuses on designing an effective fraud detection and deterrence architecture using investigative data mining techniques. The proposed system architecture is designed to reason using an Artificial Neural Network and machine learning algorithms in order to detect and deter fraudulent activities. We recommend that the architectural framework be developed using object-oriented and agent-oriented programming languages.

  10. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    Energy Technology Data Exchange (ETDEWEB)

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
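
One minimal reading of "extracting a data pattern" is recognizing that the offsets written by the collective processes form a constant stride, which can then be stored compactly instead of as a full offset list. The sketch below shows only that reduced case and is not the patented method.

```python
# Illustrative pattern extraction: if the write offsets are uniformly strided,
# represent the whole set compactly as (start, stride, count); otherwise fall
# back to keeping the explicit list (here signaled by None).

def extract_pattern(offsets):
    """Return (start, stride, count) for uniformly strided offsets, else None."""
    if len(offsets) < 2:
        return None
    stride = offsets[1] - offsets[0]
    if all(b - a == stride for a, b in zip(offsets, offsets[1:])):
        return (offsets[0], stride, len(offsets))
    return None

pattern = extract_pattern([0, 4096, 8192, 12288])   # four 4 KiB-spaced writes
```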

  11. PLC Based Automatic Multistoried Car Parking System

    OpenAIRE

    2014-01-01

    This project work presents the study and design of a PLC-based Automatic Multistoried Car Parking System. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Very high-technology instruments are required to realize such an arrangement in practice. In this project a prototype of such a model is made, accommodating twelve cars at a time. Availability of the space for parking is detecte...

  12. Kerman Photovoltaic Power Plant R&D data collection computer system operations and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, P.B.

    1994-06-01

    The Supervisory Control and Data Acquisition (SCADA) system at the Kerman PV Plant monitors 52 analog, 44 status, 13 control, and 4 accumulator data points in real-time. A Remote Terminal Unit (RTU) polls 7 peripheral data acquisition units that are distributed throughout the plant once every second, and stores all analog, status, and accumulator points that have changed since the last scan. The R&D Computer, which is connected to the SCADA RTU via a RS-232 serial link, polls the RTU once every 5-7 seconds and records any values that have changed since the last scan. A SCADA software package called RealFlex runs on the R&D computer and stores all updated data values taken from the RTU, along with a time-stamp for each, in a historical real-time database. From this database, averages of all analog data points and snapshots of all status points are generated every 10 minutes and appended to a daily file. These files are downloaded via modem by PVUSA/Davis staff every day, and the data is placed into the PVUSA database.
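
The 10-minute averaging step can be sketched as straightforward time-binning. The field layout and epoch-second timestamps are assumptions; only the 600-second bin width is carried over from the description above.

```python
# Sketch of the 10-minute averaging of analog points: group time-stamped
# samples into 600-second bins and average each bin (this mirrors the step,
# not RealFlex's actual schema or storage format).

def ten_minute_averages(samples):
    """samples: list of (epoch_seconds, value). Returns {bin_start: mean}."""
    bins = {}
    for ts, value in samples:
        bins.setdefault(ts - ts % 600, []).append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(bins.items())}

avgs = ten_minute_averages([(0, 1.0), (599, 3.0), (600, 10.0)])
```

Appending one such dictionary of averages per interval to a daily file gives exactly the download-ready product the abstract describes.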

  13. A fully automatic system for acid-base coulometric titrations

    OpenAIRE

    1990-01-01

    An automatic system for acid-base titrations by electrogeneration of H+ and OH- ions, with potentiometric end-point detection, was developed. The system includes a PC-compatible computer for instrumental control, data acquisition and processing, which allows up to 13 samples to be analysed sequentially with no human intervention. The system performance was tested on the titration of standard solutions, which it carried out with low errors and RSD. It was subsequently applied to the analysis o...

  14. 29 CFR 1910.159 - Automatic sprinkler systems.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 5 2010-07-01 2010-07-01 false Automatic sprinkler systems. 1910.159 Section 1910.159... § 1910.159 Automatic sprinkler systems. (a) Scope and application. (1) The requirements of this section apply to all automatic sprinkler systems installed to meet a particular OSHA standard. (2) For...

  15. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 29.1329 Section 29... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and...

  16. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 27.1329 Section 27... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and...

  17. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
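
The first exercise's calculations can be mirrored in a few lines: a least-squares calibration line plus the determination coefficient R². This is only the underlying math, not the Goodle GMS code, and the sample data are invented.

```python
# Least-squares fit of a calibration line y = slope*x + intercept, plus R^2,
# as a student would compute for an instrumental-analysis calibration.

def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Invented calibration data: concentration vs. instrument signal.
slope, intercept, r2 = linear_fit([0, 1, 2, 3], [0.1, 2.0, 4.1, 6.0])
```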

  18. Sistema de aquisição automatica de dados para o gerenciamento de operações mecanizadas Automatic data acquisition system for mechanization management

    Directory of Open Access Journals (Sweden)

    Gastão Moraes da Silveira

    2005-01-01

    Full Text Available To disseminate and popularize information-management techniques in agriculture, efforts are needed to provide users with tools for analysis and operational evaluation. The versatility of the agricultural tractor makes it an enormous source of information which, if properly obtained, analyzed and interpreted, can support the operational management of the farm. This work presents the development of an automatic data-acquisition system designed for gathering field information. The system was conceived to determine the position of the vehicle with the global positioning system (GPS), together with fuel consumption, engine speed and forward speed. Experiments were carried out to verify the operation of the system by comparison with other techniques for determining fuel consumption, engine speed and forward speed. It was concluded that the fuel-consumption measurement requires no data correction, while the engine-speed measurement requires calibration. This data set permits statistical control of the operational parameters and generates reports with the main management indicators, such as field capacity and work conditions. Its spatial treatment makes possible the creation of a database that associated to

  19. A Study on Automatic Scoring for Machine Translation Systems

    Institute of Scientific and Technical Information of China (English)

    Yao Jianmin(姚建民); Zhang Jing; Zhao Tiejun; Li Sheng

    2004-01-01

    String similarity measures based on edit distance, cosine correlation and the Dice coefficient are adopted to evaluate machine translation results. One experiment shows that the evaluation method distinguishes well between "good" and "bad" translations; another shows consistency between human and automatic scorings of 6 general-purpose MT systems. Equational analysis validates the experimental results. Although the data and graphs are very promising, correlation-coefficient and significance tests at the 0.01 level were performed to ensure the reliability of the results. Linear regression is used to map the automatic scores onto human scorings.
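
Two of the measures named above are easy to state concretely: word-level edit distance and the Dice coefficient over word sets. The example sentences are invented; note how the set-based Dice score ignores the word repetition that edit distance catches.

```python
# Word-level Levenshtein edit distance (insert/delete/substitute, unit costs)
# and the Dice coefficient over word sets, two of the similarity measures
# named in the abstract.

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        cur = [i]
        for j, wb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # delete
                           cur[-1] + 1,              # insert
                           prev[j - 1] + (wa != wb)))  # substitute/match
        prev = cur
    return prev[-1]

def dice(a, b):
    sa, sb = set(a), set(b)
    return 2 * len(sa & sb) / (len(sa) + len(sb))

hyp = "the cat sat on mat".split()
ref = "the cat sat on the mat".split()
d = edit_distance(hyp, ref)   # one missing word
```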

  20. Applying dynamic data collection to improve dry electrode system performance for a P300-based brain-computer interface

    Science.gov (United States)

    Clements, J. M.; Sellers, E. W.; Ryan, D. B.; Caves, K.; Collins, L. M.; Throckmorton, C. S.

    2016-12-01

    Objective. Dry electrodes have an advantage over gel-based ‘wet’ electrodes by providing quicker set-up time for electroencephalography recording; however, the potentially poorer contact can result in noisier recordings. We examine the impact that this may have on brain-computer interface communication and potential approaches for mitigation. Approach. We present a performance comparison of wet and dry electrodes for use with the P300 speller system in both healthy participants and participants with communication disabilities (ALS and PLS), and investigate the potential for a data-driven dynamic data collection algorithm to compensate for the lower signal-to-noise ratio (SNR) in dry systems. Main results. Performance results from sixteen healthy participants obtained in the standard static data collection environment demonstrate a substantial loss in accuracy with the dry system. Using a dynamic stopping algorithm, performance may have been improved by collecting more data in the dry system for ten healthy participants and eight participants with communication disabilities; however, the algorithm did not fully compensate for the lower SNR of the dry system. An analysis of the wet and dry system recordings revealed that delta and theta frequency band power (0.1-4 Hz and 4-8 Hz, respectively) are consistently higher in dry system recordings across participants, indicating that transient and drift artifacts may be an issue for dry systems. Significance. Using dry electrodes is desirable for reduced set-up time; however, this study demonstrates that online performance is significantly poorer than for wet electrodes for users with and without disabilities. We test a new application of dynamic stopping algorithms to compensate for poorer SNR. Dynamic stopping improved dry system performance; however, further signal processing efforts are likely necessary for full mitigation.
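
A conceptual sketch of dynamic stopping (not the authors' exact algorithm): keep averaging per-candidate response scores across stimulus repetitions and stop early once the leader's margin over the runner-up exceeds a confidence threshold, instead of always using a fixed number of repetitions. The scores, margin and candidates below are invented.

```python
# Data-driven dynamic stopping sketch: noisier (dry-electrode) data simply
# fails the margin test for longer and so accumulates more repetitions.

def dynamic_stop(score_sequences, margin=1.0, max_reps=10):
    """score_sequences: {candidate: per-repetition scores}.
    Returns (chosen_candidate, repetitions_used)."""
    for rep in range(1, max_reps + 1):
        means = {c: sum(s[:rep]) / rep for c, s in score_sequences.items()}
        ranked = sorted(means.values(), reverse=True)
        if len(ranked) < 2 or ranked[0] - ranked[1] >= margin:
            return max(means, key=means.get), rep
    return max(means, key=means.get), max_reps

# A clear winner is selected after a single repetition:
choice, reps = dynamic_stop({"A": [3.0, 3.2, 3.1], "B": [1.0, 1.1, 0.9]},
                            margin=1.5, max_reps=3)
```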

  1. Recent advances in automatic alignment system for the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, K; Awwal, A; Kalantar, D; Leach, R; Lowe-Webb, R; McGuigan, D; Kamm, V

    2010-12-08

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at the target chamber in less than 50 minutes. The system automatically commands 9,000 stepping motors to adjust mirrors and other optics based upon images acquired from high-resolution digital cameras viewing the beams at various locations. Forty-five control loops per beamline request image processing services running on a LINUX cluster to analyze these images of the beams and references, and automatically steer the beams toward the target. This paper discusses the upgrades to the NIF automatic alignment system to handle new alignment needs and evolving requirements related to the various types of experiments performed. As NIF becomes a continuously operated system and more experiments are performed, performance monitoring is increasingly important for maintenance and commissioning work. Data collected during operations is analyzed for tuning of the laser and for targeting maintenance work. Handling evolving alignment and maintenance needs is expected for the planned 30-year operational life of NIF.

  2. An Automatic Indirect Immunofluorescence Cell Segmentation System

    Directory of Open Access Journals (Sweden)

    Yung-Kuan Chan

    2014-01-01

    Full Text Available Indirect immunofluorescence (IIF) with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA) in systemic autoimmune diseases. ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection of the fluorescence patterns in an IIF image can assist physicians without relevant experience in making a correct diagnosis. Segmenting the cells in an IIF image is essential to developing an automatic inspection system for ANA testing. This paper focuses on cell detection and segmentation; an efficient method is proposed for automatically detecting cells with a fluorescence pattern in an IIF image. Cell culture is a process in which cells grow under control. Cell counting technology plays an important role in measuring the cell density in a culture tank. Moreover, assessing medium suitability, determining population doubling times, and monitoring cell growth in cultures all require a means of quantifying the cell population. The proposed method can also be used to count the cells in an image taken under a fluorescence microscope.

  3. Automatic-Control System for Safer Brazing

    Science.gov (United States)

    Stein, J. A.; Vanasse, M. A.

    1986-01-01

    Automatic-control system for radio-frequency (RF) induction brazing of metal tubing reduces probability of operator errors, increases safety, and ensures high-quality brazed joints. Unit combines functions of gas control and electric-power control. Minimizes unnecessary flow of argon gas into work area and prevents electrical shocks from RF terminals. Controller will not allow power to flow from RF generator to brazing head unless work has been firmly attached to head and has actuated micro-switch. Potential shock hazard eliminated. Flow of argon for purging and cooling must be turned on and adjusted before brazing power applied. Provision ensures power not applied prematurely, causing damaged work or poor-quality joints. Controller automatically turns off argon flow at conclusion of brazing so potentially suffocating gas does not accumulate in confined areas.

  4. 24 CFR 902.60 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Data collection. 902.60 Section 902... PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.60 Data collection. (a) Fiscal Year reporting period... transmission of the data. (c) Financial condition information. Year-end financial information to conduct...

  5. A comprehensive collection of systems biology data characterizing the host response to viral infection.

    Science.gov (United States)

    Aevermann, Brian D; Pickett, Brett E; Kumar, Sanjeev; Klem, Edward B; Agnihothram, Sudhakar; Askovich, Peter S; Bankhead, Armand; Bolles, Meagen; Carter, Victoria; Chang, Jean; Clauss, Therese R W; Dash, Pradyot; Diercks, Alan H; Eisfeld, Amie J; Ellis, Amy; Fan, Shufang; Ferris, Martin T; Gralinski, Lisa E; Green, Richard R; Gritsenko, Marina A; Hatta, Masato; Heegel, Robert A; Jacobs, Jon M; Jeng, Sophia; Josset, Laurence; Kaiser, Shari M; Kelly, Sara; Law, G Lynn; Li, Chengjun; Li, Jiangning; Long, Casey; Luna, Maria L; Matzke, Melissa; McDermott, Jason; Menachery, Vineet; Metz, Thomas O; Mitchell, Hugh; Monroe, Matthew E; Navarro, Garnet; Neumann, Gabriele; Podyminogin, Rebecca L; Purvine, Samuel O; Rosenberger, Carrie M; Sanders, Catherine J; Schepmoes, Athena A; Shukla, Anil K; Sims, Amy; Sova, Pavel; Tam, Vincent C; Tchitchek, Nicolas; Thomas, Paul G; Tilton, Susan C; Totura, Allison; Wang, Jing; Webb-Robertson, Bobbie-Jo; Wen, Ji; Weiss, Jeffrey M; Yang, Feng; Yount, Boyd; Zhang, Qibin; McWeeney, Shannon; Smith, Richard D; Waters, Katrina M; Kawaoka, Yoshihiro; Baric, Ralph; Aderem, Alan; Katze, Michael G; Scheuermann, Richard H

    2014-01-01

    The Systems Biology for Infectious Diseases Research program was established by the U.S. National Institute of Allergy and Infectious Diseases to investigate host-pathogen interactions at a systems level. This program generated 47 transcriptomic and proteomic datasets from 30 studies that investigate in vivo and in vitro host responses to viral infections. Human pathogens in the Orthomyxoviridae and Coronaviridae families, especially pandemic H1N1 and avian H5N1 influenza A viruses and severe acute respiratory syndrome coronavirus (SARS-CoV), were investigated. Study validation was demonstrated via experimental quality control measures and meta-analysis of independent experiments performed under similar conditions. Primary assay results are archived at the GEO and PeptideAtlas public repositories, while processed statistical results together with standardized metadata are publicly available at the Influenza Research Database (www.fludb.org) and the Virus Pathogen Resource (www.viprbrc.org). By comparing data from mutant versus wild-type virus and host strains, RNA versus protein differential expression, and infection with genetically similar strains, these data can be used to further investigate genetic and physiological determinants of host responses to viral infection.

  6. Collective Analysis of Qualitative Data

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Friberg, Karin

    2014-01-01

    What. Many students and practitioners do not know how to systematically process qualitative data once it is gathered—at least not as a collective effort. This chapter presents two workshop techniques, affinity diagramming and diagnostic mapping, that support collective analysis of large amounts of qualitative data. Affinity diagramming is used to make collective analysis and interpretations of qualitative data to identify core problems that need to be addressed in the design process. Diagnostic mapping supports collective interpretation and description of these problems and how to intervene in them. In particular, collective analysis can be used to identify, understand, and act on complex design problems that emerge, for example, after the introduction of new technologies. Such problems might be hard to clarify, and the basis for the analysis often involves large amounts of unstructured qualitative data…

  7. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  8. Quality assessment of automatically extracted data from GPs' EPR.

    Science.gov (United States)

    de Clercq, Etienne; Moreels, Sarah; Van Casteren, Viviane; Bossuyt, Nathalie; Goderis, Geert; Bartholomeeusen, Stefaan

    2012-01-01

    There are many secondary benefits to collecting routine primary care data, but we first need to understand some of the properties of this data. In this paper we describe the method used to assess the PPV and sensitivity of data extracted from Belgian GPs' EPR (diagnoses, drug prescriptions, referrals, and certain parameters), using data collected through an electronic questionnaire as a gold standard. We describe the results of the ResoPrim phase 2 project, which involved 4 software systems and 43 practices (10,307 patients). This method of assessment could also be applied to other research networks.

  9. AUTOMATIC ROAD SIGN INVENTORY USING MOBILE MAPPING SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Soilán

    2016-06-01

    Full Text Available The periodic inspection of certain infrastructure features plays a key role in road network safety and preservation, and in developing optimal maintenance planning that minimizes the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from the point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As the point cloud resolution is insufficient for this task, RGB imagery is used, projecting the 3D points into the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95% and F-scores greater than 90%. In this way, inventory data are obtained in a fast, reliable manner, and can be applied to improve the maintenance planning of the road network or to feed a Spatial Information System (SIS), so that road sign information is available for use in a Smart City context.
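    The final step described, projecting detected 3D sign points into the synchronized images and analysing the pixels inside their bounding box, can be sketched with a simple pinhole camera model. The intrinsic matrix and point coordinates below are hypothetical, not LYNX calibration values.

```python
import numpy as np

def project_points(points_3d, K):
    """Project Nx3 camera-frame points with intrinsic matrix K (pinhole model)."""
    uvw = (K @ points_3d.T).T          # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]    # divide by depth to get pixels

def bounding_box(pixels):
    """Axis-aligned bounding box (u_min, v_min, u_max, v_max) of projected points."""
    u_min, v_min = pixels.min(axis=0)
    u_max, v_max = pixels.max(axis=0)
    return u_min, v_min, u_max, v_max

# Hypothetical intrinsics and the corners of a 0.8 m sign 10 m ahead.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 480.0],
              [   0.0,    0.0,   1.0]])
sign = np.array([[-0.4, -0.4, 10.0],
                 [ 0.4, -0.4, 10.0],
                 [ 0.4,  0.4, 10.0],
                 [-0.4,  0.4, 10.0]])
box = bounding_box(project_points(sign, K))
```

    The RGB pixels inside `box` would then feed the sign classification step.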

  10. Web Service Interface for Data Collection

    Directory of Open Access Journals (Sweden)

    Ruchika

    2012-05-01

    Full Text Available Data collection is a key component of an information system. The widespread penetration of ICT tools in organizations and institutions has resulted in a shift in the way data is collected. Data may be collected in printed form, by e-mail, on a compact disk, or by direct upload to the management information system. Since web services are platform-independent, they can access data stored in the XML format from any platform. In this paper, we present an interface which uses web services for data collection. It requires interaction between a web service deployed for the purpose of data collection and the web address where the data is stored. Our interface requires that the web service has prior knowledge of the address from which the data is to be collected. Also, the data to be accessed must be stored in XML format. Since our interface uses computer-supported interaction on both sides, it eases the task of regular and ongoing data collection. We apply our framework to the Education Management Information System, which collects data from schools spread across the country.

  11. Automatic Railway Power Line Extraction Using Mobile Laser Scanning Data

    Science.gov (United States)

    Zhang, Shanxin; Wang, Cheng; Yang, Zhuang; Chen, Yiping; Li, Jonathan

    2016-06-01

    Research on power line extraction from mobile laser point clouds has important practical significance for railway power line patrol work. In this paper, we present a new method for automatically extracting railway power lines from MLS (Mobile Laser Scanning) data. Firstly, according to the spatial structure characteristics of the power line and the trajectory, the significant data are segmented piecewise. Then, a self-adaptive spatial region-growing method is used to extract power lines parallel with the rails. Finally, PCA (Principal Components Analysis) combined with information entropy theory is used to judge whether a section of the power line is a junction and, if so, which type of junction it belongs to. The least squares fitting algorithm is introduced to model the power line. An evaluation of the proposed method over a complicated railway point cloud acquired by a RIEGL VMX450 MLS system shows that the proposed method is promising.
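    The PCA-plus-entropy idea can be illustrated as follows: the eigenvalues of a point section's covariance matrix measure how many spatial directions carry variance, and the Shannon entropy of the normalized eigenvalues separates straight wire sections (one dominant direction, low entropy) from junctions (several directions, higher entropy). This is a hedged sketch of the principle with synthetic points, not the paper's exact criterion.

```python
import numpy as np

def eigen_entropy(points):
    """PCA eigenvalues of an Nx3 point section and their Shannon entropy."""
    cov = np.cov(points.T)
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending
    p = evals / evals.sum()
    p = p[p > 0]
    return evals, -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
t = rng.uniform(0, 10, 200)
noise = lambda: 0.01 * rng.standard_normal(200)

# A straight wire section along x, and a junction with a branch along y.
line = np.c_[t, noise(), noise()]
branch = np.vstack([line, np.c_[noise(), t, noise()]])

_, h_line = eigen_entropy(line)      # low entropy: one dominant direction
_, h_branch = eigen_entropy(branch)  # higher entropy: two directions
```

    Thresholding such an entropy value is one plausible way to flag junction sections before type classification.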

  12. Automatic focusing system of BSST in Antarctic

    Science.gov (United States)

    Tang, Peng-Yi; Liu, Jia-Jing; Zhang, Guang-yu; Wang, Jian

    2015-10-01

    Automatic focusing (AF) technology plays an important role in modern astronomical telescopes. Based on the focusing requirements of BSST (Bright Star Survey Telescope) in Antarctica, an AF system was set up. In this design, OpenCV functions are used to find stars, and a selectable focus metric is computed from the star area, HFD, or FWHM. As the camera moves, curve fitting is used to find the focus position. The design is suitable for an unattended small telescope.
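    The curve-fitting step can be illustrated by fitting a parabola to a focus metric (e.g. FWHM) sampled at several focuser positions and reading the vertex off the fit. The positions and the V-shaped focus curve below are synthetic, not BSST data.

```python
import numpy as np

def best_focus(positions, fwhm):
    """Fit FWHM(position) with a parabola and return the vertex position."""
    a, b, _c = np.polyfit(positions, fwhm, 2)
    return -b / (2 * a)  # vertex of a*x^2 + b*x + c

# Synthetic focus sweep: FWHM is smallest at focuser position 12.0.
pos = np.linspace(0, 24, 9)
fwhm = 0.05 * (pos - 12.0) ** 2 + 2.0
focus = best_focus(pos, fwhm)
```

    In practice the measured curve is noisy, so the fit is typically restricted to samples near the minimum before the vertex is taken as the focus command.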

  13. Distributed privacy preserving data collection

    KAUST Repository

    Xue, Mingqiang

    2011-01-01

    We study the distributed privacy preserving data collection problem: an untrusted data collector (e.g., a medical research institute) wishes to collect data (e.g., medical records) from a group of respondents (e.g., patients). Each respondent owns a multi-attributed record which contains both non-sensitive (e.g., quasi-identifiers) and sensitive information (e.g., a particular disease), and submits it to the data collector. Assuming T is the table formed by all the respondent data records, we say that the data collection process is privacy preserving if it allows the data collector to obtain a k-anonymized or l-diversified version of T without revealing the original records to the adversary. We propose a distributed data collection protocol that outputs an anonymized table by generalization of quasi-identifier attributes. The protocol employs cryptographic techniques such as homomorphic encryption, private information retrieval and secure multiparty computation to ensure the privacy goal in the process of data collection. Meanwhile, the protocol is designed to leak limited but non-critical information to achieve practicability and efficiency. Experiments show that the utility of the anonymized table derived by our protocol is on par with the utility achieved by traditional anonymization techniques. © 2011 Springer-Verlag.
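    The privacy target, k-anonymity via generalization of quasi-identifiers, can be illustrated with a toy centralized sketch: coarsen zip codes and age bands until every quasi-identifier combination occurs at least k times. This deliberately ignores the paper's actual contribution (the cryptographic protocol that reaches this state without revealing records); the records and generalization ladder are invented for illustration.

```python
from collections import Counter

def is_k_anonymous(rows, k):
    """True if every quasi-identifier combination occurs at least k times."""
    return all(c >= k for c in Counter(rows).values())

def generalize(zip_code, age, level):
    """Coarsen quasi-identifiers: mask trailing zip digits, widen age bands."""
    masked_zip = zip_code[:5 - level] + "*" * level
    band = 5 * (2 ** level)            # 5-, 10-, 20-, ... year-wide bands
    lo = (age // band) * band
    return masked_zip, f"{lo}-{lo + band - 1}"

# Hypothetical respondent records: (zip code, age).
records = [("53711", 34), ("53715", 36), ("53703", 33), ("53706", 38)]

# Raise the generalization level until the table is 2-anonymous.
level = 0
qi = [generalize(z, a, level) for z, a in records]
while not is_k_anonymous(qi, k=2) and level < 5:
    level += 1
    qi = [generalize(z, a, level) for z, a in records]
```

    For these records one generalization step suffices; the protocol's job is to find such a generalization collaboratively while the individual records stay encrypted.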

  14. Water Column Sonar Data Collection

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The collection and analysis of water column sonar data is a relatively new avenue of research into the marine environment. Primary uses include assessing biological...

  15. Application of Data Transmission Technology in the Automatic Fire Alarm System%数据通信传输技术在火灾自动报警系统中的运用

    Institute of Scientific and Technical Information of China (English)

    梁鸿

    2011-01-01

    At present, the degree of fire damage and the scope of the hazard are on the rise. To reduce losses, countries around the world are actively developing automatic fire alarm systems. Data transmission technology is a very important technique in an automatic fire alarm system. In the context of automatic fire alarm systems, this article introduces the technology in terms of data transmission modes, serial/parallel communication, simplex/duplex data transmission, data synchronization, and digital coding methods.

  16. Modeling of a data exchange process in the Automatic Process Control System on the base of the universal SCADA-system

    Science.gov (United States)

    Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.

    2016-04-01

    In the present paper the authors discuss some ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one such way is the integrated modernization of the power engineering objects of mechanical engineering companies, aimed at increasing the efficiency of energy supply control and improving the commercial accounting of electric energy. The authors propose the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we built a mathematical model of the data exchange process between the measuring transformers and a universal SCADA system. The modeling results show that the equipment meets the requirements of the Standard and that using a universal SCADA system for these purposes is preferable and economically reasonable. In the modeling, the authors used the following software: MasterScada, Master OPC_DI_61850, OPNET.

  17. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    Full Text Available The watershed, as the basic territorial unit for planning and management of water resources, requires proper delimitation of its catchment or drainage area. Given the lack of geographic information on the micro watersheds of the Casacay river hydrographic unit, this research aimed at the automatic delimitation of micro watersheds using Geographic Information Systems (GIS) techniques and data from the Shuttle Radar Topographic Mission (SRTM) at 30 meters spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro watersheds were obtained with their respective codification, allowing continuity with the watershed standardization adopted by Ecuador's Water Secretariat. With the results of this investigation, watershed information will be updated in greater detail, promoting the execution of tasks and activities related to the integrated management of the studied hydrographic unit

  18. Quality assurance for screening mammography data collection systems in 22 countries.

    NARCIS (Netherlands)

    Klabunde, C.N.; Sancho-Garnier, H.; Broeders, M.E.A.C.; Thoresen, S.; Rodrigues, V.J.; Ballard-Barbash, R.

    2001-01-01

    OBJECTIVES: To document the mammography data that are gathered by the organized screening programs participating in the International Breast Cancer Screening Network (IBSN), the nature of their procedures for data quality assurance, and the measures used to assess program performance and impact.

  19. Platform attitude data acquisition system

    Digital Repository Service at National Institute of Oceanography (India)

    Afzulpurkar, S.

    A system for automatic acquisition of underwater platform attitude data has been designed, developed and tested in the laboratory. This is a microcontroller-based system interfacing a dual-axis inclinometer and a high-resolution digital compass...

  20. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

    A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a computer. Thus much time can be saved and the risk of wrong data entry is lowered. The program was easy to use, and no significant problems were encountered. The necessary hardware is a standard IBM-compatible desktop computer, Mitutoyo Digimatic (TM) calipers and a Mitutoyo Digimatic MUX-10 Multiplexer (TM).

  1. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    Science.gov (United States)

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  2. Time Synchronization Module for Automatic Identification System

    Institute of Scientific and Technical Information of China (English)

    Choi Il-heung; Oh Sang-heon; Choi Dae-soo; Park Chan-sik; Hwang Dong-hwan; Lee Sang-jeong

    2003-01-01

    This paper proposes a design and implementation procedure for the Time Synchronization Module (TSM) of the Automatic Identification System (AIS). The proposed TSM uses a Temperature Compensated Crystal Oscillator (TCXO) as a local reference clock, and consists of a Digitally Controlled Oscillator (DCO), a divider, a phase discriminator, and register blocks. The TSM measures the time difference between the 1 PPS from the Global Navigation Satellite System (GNSS) receiver and the generated transmitter clock. The measured time difference is compensated by controlling the DCO, and the transmit clock is synchronized to the Universal Time Coordinated (UTC). The designed TSM can also be synchronized to the reference time derived from the received message. The proposed module was tested using an experimental AIS transponder set. The experimental results show that the proposed module satisfies the functional and timing specifications of the AIS technical standard, ITU-R M.1371.

  3. Automatic system for corneal ulcer diagnostic: II

    Science.gov (United States)

    Ventura, Liliane; Chiaradia, Caio; Faria de Sousa, Sidney J.

    1998-06-01

    Corneal ulcer is a de-epithelization of the cornea and a very common disease in agricultural countries. The parameter most used by clinicians to identify a favorable ulcer evolution is the regression of the affected area. However, this kind of evaluation is subjective, since just the horizontal and vertical axes are measured against a graduated scale and the affected area is estimated. Also, the disease is registered by photographs. In order to overcome this subjectivity and to register the images in a more accessible way (hard disks, floppy disks, etc.), we have developed an automatic system to evaluate the affected area (the ulcer). An optical system is implemented in a Slit Lamp (SL) and connected to a CCD detector. The image is displayed on a PC monitor by a commercial frame grabber, and dedicated software for determining the area of the ulcer (precision of 20 mm) has been developed.

  4. Next Generation Data Collection System for One-Pass Detection and Discrimination

    Science.gov (United States)

    2011-12-01

    necessary synchronization. Positional data was acquired by a NovAtel DL-4 GPS receiver in Real Time Kinematic ( RTK ) mode that utilized real time...of the Novatel SPAN in RTK mode (for navigation) required the establishment of a GPS base-station that transmitted real-time corrections to the rover...all three orthogonal transmitters). GPS and INS data, including RTK corrected and raw, were stored directly on the flash-card within the Novatel SPAN

  5. Data collection system. Volume 1, Overview and operators manual; Volume 2, Maintenance manual; Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, R.B.; Bauder, M.E.; Boyer, W.B.; French, R.E.; Isidoro, R.J.; Kaestner, P.C.; Perkins, W.G.

    1993-09-01

    Sandia National Laboratories (SNL) Instrumentation Development Department was tasked by the Defense Nuclear Agency (DNA) to record data on Tektronix RTD720 Digitizers on the HUNTERS TROPHY field test conducted at the Nevada Test Site (NTS) on September 18, 1992. This report contains an overview and description of the computer hardware and software that were used to acquire, reduce, and display the data. The document is divided into two volumes: an overview and operators manual (Volume 1) and a maintenance manual (Volume 2).

  6. Evaluation of a Teleform-based data collection system: A multi-center obesity research case study

    Science.gov (United States)

    Jenkins, Todd M.; Boyce, Tawny Wilson; Akers, Rachel; Andringa, Jennifer; Liu, Yanhong; Miller, Rosemary; Powers, Carolyn; Buncher, C. Ralph

    2016-01-01

    Utilizing electronic data capture (EDC) systems in data collection and management allows automated validation programs to preemptively identify and correct data errors. For our multi-center, prospective study we chose to use TeleForm, a paper-based data capture software that uses recognition technology to create case report forms (CRFs) with similar functionality to EDC, including custom scripts to identify entry errors. We quantified the accuracy of the optimized system through a data audit of CRFs and the study database, examining selected critical variables for all subjects in the study, as well as an audit of all variables for 25 randomly selected subjects. Overall we found 6.7 errors per 10,000 fields, with similar estimates for critical (6.9/10,000) and non-critical (6.5/10,000) variables – values that fall below the acceptable quality threshold of 50 errors per 10,000 established by the Society for Clinical Data Management. However, error rates were found to widely vary by type of data field, with the highest rate observed with open text fields. PMID:24709056
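    The quality metric used in this audit is simple arithmetic: errors per 10,000 audited fields, compared against the SCDM threshold of 50. The sketch below reproduces the reported overall rate; the raw counts are hypothetical values chosen only to match the 6.7/10,000 figure, not the study's actual audit totals.

```python
def error_rate_per_10k(n_errors, n_fields):
    """Errors per 10,000 audited fields."""
    return 10_000 * n_errors / n_fields

# Hypothetical counts consistent with the reported overall rate.
rate = error_rate_per_10k(67, 100_000)
acceptable = rate <= 50  # SCDM quality threshold cited in the abstract
```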

  7. A comprehensive collection of systems biology data characterizing the host response to viral infection

    OpenAIRE

    Aevermann, Brian D.; Pickett, Brett E.; Kumar, Sanjeev; Klem, Edward B.; Agnihothram, Sudhakar; Peter S Askovich; III, Armand Bankhead; Bolles, Meagen; Carter, Victoria; Chang, Jean; Clauss, Therese R.W.; Dash, Pradyot; Diercks, Alan H.; Eisfeld, Amie J.; Ellis, Amy

    2014-01-01

    The Systems Biology for Infectious Diseases Research program was established by the U.S. National Institute of Allergy and Infectious Diseases to investigate host-pathogen interactions at a systems level. This program generated 47 transcriptomic and proteomic datasets from 30 studies that investigate in vivo and in vitro host responses to viral infections. Human pathogens in the Orthomyxoviridae and Coronaviridae families, especially pandemic H1N1 and avian H5N1 influenza A viruses and severe...

  8. 76 FR 58301 - Proposed Extension of Existing Information Collection; Automatic Fire Sensor and Warning Device...

    Science.gov (United States)

    2011-09-20

    ... Sensor and Warning Device Systems; Examination and Test Requirements ACTION: Notice of request for public... Coal Mining. OMB 1219-0145 has been renamed Automatic Fire Sensor and Warning Device Systems... to a task in July 2011; OMB 1219-0073 subsumed Sec. 75.1103-5(a)(2)(ii) Automatic fire sensor...

  9. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
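    The core Memops idea, generating validity-checked data-access code from an abstract model rather than writing it by hand, can be shown with a deliberately tiny sketch. Memops itself works from UML models and emits Python, C and Java APIs; the dictionary model, class name, and accessor naming below are invented for illustration.

```python
# Toy model: class name -> {attribute name: expected Python type}.
MODEL = {"Molecule": {"name": str, "mass": float}}

def generate_class(class_name, attrs):
    """Emit Python source for a class with validity-checked accessors."""
    lines = [f"class {class_name}:"]
    for attr, typ in attrs.items():
        lines += [
            f"    def set_{attr}(self, value):",
            f"        if not isinstance(value, {typ.__name__}):",
            f"            raise TypeError('{attr} must be {typ.__name__}')",
            f"        self._{attr} = value",
            f"    def get_{attr}(self):",
            f"        return self._{attr}",
        ]
    return "\n".join(lines)

# Generate and compile the API, then use it like hand-written code.
source = generate_class("Molecule", MODEL["Molecule"])
namespace = {}
exec(source, namespace)
Molecule = namespace["Molecule"]

m = Molecule()
m.set_name("water")
m.set_mass(18.015)
```

    Because every accessor comes from the same generator, the input parsing, validity checking and storage code stay mutually consistent by construction, which is the maintenance benefit the abstract describes.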

  10. Advanced Data Collection for Inventory Management

    Science.gov (United States)

    Opresko, G. A.; Leet, J. H.; Mcgrath, D. F.; Eidson, J.

    1987-01-01

    Bar-coding, radio-frequency, and voice-operated systems selected. Report discusses study of state-of-the-art in automated collection of data for management of large inventories. Study included comprehensive search of literature on data collection and inventory management, visits to existing automated inventory systems, and tours of selected supply and transportation facilities at Kennedy Space Center. Information collected analyzed in view of needs of conceptual inventory-management systems for Kennedy Space Center and for manned space station and other future space projects.

  11. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will...

  12. Method and systems for collecting data from multiple fields of view

    Science.gov (United States)

    Schwemmer, Geary K. (Inventor)

    2002-01-01

    Systems and methods for processing light from multiple fields (48, 54, 55) of view without excessive machinery for scanning optical elements. In an exemplary embodiment of the invention, multiple holographic optical elements (41, 42, 43, 44, 45), integrated on a common film (4), diffract and project light from respective fields of view.

  13. 40 CFR 141.533 - What data must my system collect to calculate a disinfection profile?

    Science.gov (United States)

    2010-07-01

    ... system uses chlorine, the pH of the disinfected water at each residual disinfectant concentration... calculate a disinfection profile? 141.533 Section 141.533 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS...

  14. Improving the automatic wavelength calibration of EMIR spectroscopic data

    Science.gov (United States)

    Cardiel, N.; Pascual, S.; Picazo, P.; Gallego, J.; Garzón, F.; Castro-Rodríguez, N.; González-Fernández, C.; Hammersley, P.; Insausti, M.; Manjavacas, E.; Miluzio, M.

    2017-03-01

    EMIR, the near-infrared camera-spectrograph operating in the near-infrared (NIR) wavelengths 0.9-2.5μm, is being commissioned at the Nasmyth focus of the Gran Telescopio CANARIAS. One of the most outstanding capabilities of EMIR will be its multi-object spectroscopic mode which, with the help of a robotic reconfigurable slit system, will allow around 53 spectra to be taken simultaneously. A data reduction pipeline, PyEmir, based on Python, is being developed in order to facilitate the automatic reduction of EMIR data taken in both imaging and spectroscopy modes. Focusing on the reduction of spectroscopic data, some critical manipulations include the geometric distortion correction and the wavelength calibration. Although these reduction steps are usually carried out separately, it is important to realise that these kinds of manipulations involve data rebinning and interpolation, which unavoidably increase error correlation and degrade resolution. In order to minimise these effects, it is possible to incorporate those data manipulations as a single geometric transformation. This approach is being used in the development of PyEmir. For this purpose, the geometric transformations available in the Python package Scikit-image are being used. This work was funded by the Spanish Programa Nacional de Astronomía y Astrofísica under grant AYA2013-46724-P.
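    The single-transformation idea rests on composing the individual pixel-coordinate mappings before resampling, so the data are interpolated once instead of once per step. With affine mappings this is just matrix multiplication of homogeneous 3x3 matrices; the two matrices below are invented stand-ins for a distortion correction and a wavelength linearization, not PyEmir's actual transforms (which use scikit-image transform objects).

```python
import numpy as np

def affine(a, b, c, d, tx, ty):
    """3x3 homogeneous matrix for an affine pixel-coordinate mapping."""
    return np.array([[a, b, tx],
                     [c, d, ty],
                     [0.0, 0.0, 1.0]])

# Hypothetical steps: a shear-like distortion correction, then a
# scale-and-shift wavelength linearization along the dispersion axis.
distortion = affine(1.0, 0.01, 0.0, 1.0, -2.0, 0.5)
wavecal = affine(0.98, 0.0, 0.0, 1.0, 15.0, 0.0)

# Compose first, resample once: identical coordinates, one interpolation.
combined = wavecal @ distortion

pixel = np.array([100.0, 200.0, 1.0])
two_step = wavecal @ (distortion @ pixel)
one_step = combined @ pixel
```

    The composed mapping lands each pixel in exactly the same place as the sequential one, but the image data pass through the interpolator a single time, limiting error correlation and resolution loss.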

  15. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy)

    OpenAIRE

    2016-01-01

    Purpose: The aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. Material and methods: The GEC-ESTRO (Groupe Européen de Curiethérapie – European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set) and the necessary COBRA software services as well as the peer ...

  16. Developing Automatic System Monitoring Solution for Accanto Systems Customer Care

    OpenAIRE

    Mikkola, Markku

    2015-01-01

    The goal of the development work was to document the requirements and to develop and deploy an automatic system monitoring solution for Accanto Systems Customer Care. This final report describes Icinga Core as the backbone of the monitoring solution and presents the actual use case implemented for Accanto Systems. The client for this work was the Accanto Systems Customer Care department, which had long been suffering from a high workload due to increased basic system monitoring tas...

  17. QuaDoSta - a freely configurable system which facilitates multi-centric data collection for healthcare and medical research

    Directory of Open Access Journals (Sweden)

    Albrecht, Ulrike

    2007-07-01

    Full Text Available This article describes QuaDoSta (quality assurance, documentation and statistics), a flexible documentation system as well as a data collection and networking platform for medical facilities. The user can freely define the required documentation masks, which are easily expandable and can be adapted to individual requirements without the need for additional programming. To avoid duplication, data transfer interfaces can be configured flexibly to external sources such as patient management systems used in surgeries or hospital information systems. The projects EvaMed (Evaluation Anthroposophical Medicine) and the Network Oncology are two scientific research projects which have been successfully established as nationally active networks on the basis of QuaDoSta. The EvaMed-Network serves as a modern pharmacovigilance project for the documentation of adverse drug events. All prescription data are electronically recorded to assess the relative risk of drugs. The Network Oncology was set up as a documentation system in four hospitals and seven specialist oncology practices, where a complete record of all oncological therapies is being carried out to uniform standards on the basis of the 'basic documentation for tumour patients' (BDT) developed by the German Cancer Society. The QuaDoSta solution system made it possible to cater for the specific requirements of the presented projects. The following features of the system proved to be highly advantageous: flexible setup of catalogues, user-friendly customisation and extension, complete dissociation of system setup and documentation content, multi-centre networkability, and configurable data transfer interfaces.

  18. An efficient automatic firearm identification system

    Science.gov (United States)

    Chuan, Zun Liang; Liong, Choong-Yeun; Jemain, Abdul Aziz; Ghani, Nor Azura Md.

    2014-06-01

    Automatic firearm identification systems (AFIS) are highly demanded in forensic ballistics as a replacement for the traditional approach, which uses a comparison microscope and is relatively complex and time consuming. Thus, several AFIS have been developed for commercial and testing purposes. However, those AFIS are still unable to overcome some of the drawbacks of the traditional firearm identification approach. The goal of this study is to introduce another efficient and effective AFIS. A total of 747 firing pin impression images captured from five different pistols of the same make and model are used to evaluate the proposed AFIS. It was demonstrated that the proposed AFIS is capable of producing an identification accuracy rate of over 95.0% with an execution time of less than 0.35 seconds per image.

  19. Automatic charge control system for satellites

    Science.gov (United States)

    Shuman, B. M.; Cohen, H. A.

    1985-01-01

    The SCATHA and the ATS-5 and 6 spacecraft provided insights into the problem of spacecraft charging at geosynchronous altitudes. Reduction of the levels of both absolute and differential charging was indicated by the emission of low energy neutral plasma. It is appropriate to complete the transition from experimental results to the development of a system that will sense the state-of-charge of a spacecraft and, when a predetermined threshold is reached, will respond automatically to reduce it. A development program was initiated utilizing sensors comparable to the proton electrostatic analyzer, the surface potential monitor, and the transient pulse monitor that flew on SCATHA, combining these outputs through a microprocessor controller to operate a rapid-start, low energy plasma source.

  20. Automatic Generation of Symbolic Model for Parameterized Synchronous Systems

    Institute of Scientific and Technical Information of China (English)

    Wei-Wen Xu

    2004-01-01

    With the purpose of making the verification of parameterized systems more general and easier, this paper proposes a new and intuitive language, PSL (Parameterized-system Specification Language), to specify a class of parameterized synchronous systems. From a PSL script, an automatic method is proposed to generate a constraint-based symbolic model. The model can concisely represent the collection of global states symbolically by counting the number of processes in a given state. Moreover, a theorem has been proved that there is a simulation relation between the original system and its symbolic model. Since abstract and symbolic techniques are exploited in the symbolic model, the state-explosion problem of traditional verification methods is efficiently avoided. Based on the proposed symbolic model, a reachability analysis procedure is implemented using ANSI C++ on a UNIX platform. Thus, a complete tool for verifying parameterized synchronous systems is obtained and has been tested on several cases. The experimental results show that the method is satisfactory.
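The counting idea behind such a symbolic model can be sketched as a counter abstraction: a global state of N identical processes is stored as a multiset of local states, so reachability is explored over counters rather than over tuples of individual processes. The local states and transition rules below are invented for illustration; this is not the PSL tool.

```python
from collections import deque

# Counter abstraction sketch: a global state of N identical processes is
# the multiset of their local states, stored as a sorted tuple of
# (state, count) pairs. Transition rules are (from_state, to_state) pairs;
# the state names below are invented.
RULES = [("idle", "trying"), ("trying", "critical"), ("critical", "idle")]

def successors(counts):
    """Yield counter states reachable by moving one process along a rule."""
    d = dict(counts)
    for src, dst in RULES:
        if d.get(src, 0) > 0:
            nd = dict(d)
            nd[src] -= 1
            if nd[src] == 0:
                del nd[src]
            nd[dst] = nd.get(dst, 0) + 1
            yield tuple(sorted(nd.items()))

def reachable(n_processes):
    """Breadth-first reachability over counter states, not process tuples."""
    start = (("idle", n_processes),)
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

states = reachable(5)
```

The counter state space grows polynomially in the number of processes (here, the number of ways to split 5 processes over 3 local states), whereas enumerating process tuples would grow exponentially; this is the essence of how such models sidestep state explosion.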

  1. Automatic warehouse system for roll paper; Makitori jido soko system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-04-20

    This system is a fully automatic warehouse system covering everything from the storing of delivered roll paper to the supplying of that paper to a rotary press on a production line for telephone directories. Main specifications: (1) Handling object: roll base paper; (2) Automatic warehouse: 2 stacker cranes (1.25 t x H 15.55 m); (3) Storage capacity: 496 stacks; (4) Peripheral facilities: receiving facility (table lifter, stopper, slat CV, receiving truck), shipment facility (shipment table, slat CV, stopper), and paper transfer facility (mirror surface/lump peeling unit, preparation unit, AGV). Feature: an automatic paper winding mechanism that feeds the rotary press without any worker, using a stacker crane, the mirror surface/lump peeling unit, the preparation unit, and an AGV. (translated by NEDO)

  2. Human-system Interfaces for Automatic Systems

    Energy Technology Data Exchange (ETDEWEB)

    OHara, J.M.; Higgins, J. (BNL); Fleger, S.; Barnes, V. (NRC)

    2010-11-07

    Automation is ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Automation is applied to a wide range of functions including monitoring and detection, situation assessment, response planning, and response implementation. Automation has become a 'team player' supporting personnel in nearly all aspects of system operation. In light of its increasing use and importance in new and future plants, guidance is needed to conduct safety reviews of the operator's interface with automation. The objective of this research was to develop such guidance. We first characterized the important HFE aspects of automation along six dimensions: levels, functions, processes, modes, flexibility, and reliability. Next, we reviewed literature on the effects of all of these aspects of automation on human performance and on the design of human-system interfaces (HSIs). Then, we used this technical basis established from the literature to identify general principles for human-automation interaction and to develop review guidelines. The guidelines cover the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, our study identified several topics for additional research.

  3. Automatic method for building indoor boundary models from dense point clouds collected by laser scanners.

    Science.gov (United States)

    Valero, Enrique; Adán, Antonio; Cerrada, Carlos

    2012-11-22

    In this paper we present a method that automatically yields Boundary Representation (B-rep) models of indoor environments after processing dense point clouds collected by laser scanners from key locations throughout an existing facility. Our objective is particularly focused on providing single models which contain the shape, location and relationship of primitive structural elements of inhabited scenarios, such as walls, ceilings and floors. We propose a discretization of the space in order to accurately segment the 3D data and generate complete B-rep models of indoors in which faces, edges and vertices are coherently connected. The approach has been tested in real scenarios with data coming from laser scanners, yielding promising results. We have evaluated the results in depth by analyzing how reliably these elements can be detected and how accurately they are modeled.
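A minimal sketch of the space-discretization idea (cell size and data are invented, not the authors' code): binning the cloud into a regular voxel grid makes planar elements such as floors easy to isolate, because a flat element occupies a single layer of voxels along its normal.

```python
import numpy as np

# Space discretization sketch: bin a raw point cloud into a regular voxel
# grid so that planar elements (walls, floors, ceilings) can later be
# segmented voxel by voxel.
def voxelize(points, voxel_size):
    """Return the set of occupied voxel indices for an (N, 3) cloud."""
    idx = np.floor(points / voxel_size).astype(int)
    return set(map(tuple, idx))

# Synthetic "floor": a flat 2 m x 2 m patch of points at z ~ 0.1 m,
# with 5 mm of scanner noise.
rng = np.random.default_rng(0)
floor = np.column_stack([rng.uniform(0, 2, 1000),
                         rng.uniform(0, 2, 1000),
                         rng.normal(0.1, 0.005, 1000)])
occ = voxelize(floor, voxel_size=0.25)

# A flat, horizontal patch should occupy a single layer of voxels in z.
z_layers = {k[2] for k in occ}
```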

  4. Global synthesis and critical evaluation of pharmaceutical data sets collected from river systems.

    Science.gov (United States)

    Hughes, Stephen R; Kay, Paul; Brown, Lee E

    2013-01-15

    Pharmaceuticals have emerged as a major group of environmental contaminants over the past decade but relatively little is known about their occurrence in freshwaters compared to other pollutants. We present a global-scale analysis of the presence of 203 pharmaceuticals across 41 countries and show that contamination is extensive due to widespread consumption and subsequent disposal to rivers. There are clear regional biases in current understanding with little work outside North America, Europe, and China, and no work within Africa. Within individual countries, research is biased around a small number of populated provinces/states and the majority of research effort has focused upon just 14 compounds. Most research has adopted sampling techniques that are unlikely to provide reliable and representative data. This analysis highlights locations where concentrations of antibiotics, cardiovascular drugs, painkillers, contrast media, and antiepileptic drugs have been recorded well above thresholds known to cause toxic effects in aquatic biota. Studies of pharmaceutical occurrence and effects need to be seen as a global research priority due to increasing consumption, particularly among societies with aging populations. Researchers in all fields of environmental management need to work together more effectively to identify high risk compounds, improve the reliability and coverage of future monitoring studies, and develop new mitigation measures.

  5. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  6. System for Automatic Generation of Examination Papers in Discrete Mathematics

    Science.gov (United States)

    Fridenfalk, Mikael

    2013-01-01

    A system was developed for automatic generation of problems and solutions for examinations in a university distance course in discrete mathematics and tested in a pilot experiment involving 200 students. Considering the success of such systems in the past, particularly including automatic assessment, it should not take long before such systems are…

  7. 2013 International Conference on Mechatronics and Automatic Control Systems

    CERN Document Server

    2014-01-01

    This book examines mechatronics and automatic control systems, covering important emerging topics in signal processing, control theory, sensors, mechanical manufacturing systems and automation. It presents papers from the 2013 International Conference on Mechatronics and Automatic Control Systems held in Hangzhou, China on August 10-11, 2013.

  8. The BENTO Box: Development and field-testing of a new satellite-linked data collection system for multiparameter volcano monitoring

    Science.gov (United States)

    Roman, D. C.; Behar, A.; Elkins-Tanton, L. T.

    2014-12-01

    Predicting volcanic activity requires continuous monitoring for signals of magmatic unrest in harsh, often remote environments. BENTO is a next-generation monitoring system, currently in prototype testing, that is highly portable, low-cost, rapidly deployable, and entirely autonomous. Such a system could be used to provide critical monitoring and data collection capabilities during rapid-onset eruptions, or to provide a crude baseline monitor at large numbers of remote volcanoes to 'flag' the onset of unrest so that costlier resources such as specialized instrumentation can be deployed in the appropriate place at the appropriate time. The BENTO 1 (low-rate data) prototype currently comprises off-the-shelf volcanic gas sensors (SO2, CO2, Fl, Cl, and Br), a weather station (temperature, wind speed, wind direction, rainfall, humidity, pressure), and telemetry via Iridium modem. In baseline mode, BENTO 1 takes a measurement from all of its sensors every two hours and automatically sends the measurements through Iridium to a server that posts them to a dedicated and customizable web page. The measurement interval and other sensor parameters (pumping time, sensor constants) can be adjusted directly or remotely (through the Iridium network) as needed. Currently, BENTO 1 is deployed at Mt. Etna, Italy; Telica Volcano, Nicaragua; Hengill Volcano, Iceland; and Hekla Volcano, Iceland. The BENTO 2 (high-rate) system is motivated by a need to avoid having to telemeter raw seismic data, which at 20-100 Hz/channel is far too voluminous for cost- and power-effective transmission through satellite networks such as Iridium. Our solution is to regularly transmit only state-of-health information and descriptions of the seismic data (e.g., 'triggered' seismic event rates and amplitudes), rather than the data itself. The latter can be accomplished through on-board data analysis and reduction at the installation site. Currently, it is possible to request specific time segments of raw

  9. Fast Automatic Precision Tree Models from Terrestrial Laser Scanner Data

    Directory of Open Access Journals (Sweden)

    Mathias Disney

    2013-01-01

    Full Text Available This paper presents a new method for quickly and automatically constructing precision tree models from point clouds of the trunk and branches obtained by terrestrial laser scanning. The input of the method is a point cloud of a single tree scanned from multiple positions. The surface of the visible parts of the tree is robustly reconstructed by making a flexible cylinder model of the tree. The thorough quantitative model also records the topological branching structure. In this paper, every major step of the whole model reconstruction process, from the input to the finished model, is presented in detail. The model is constructed by a local approach in which the point cloud is covered with small sets corresponding to connected surface patches on the tree surface. The neighbor relations and geometrical properties of these cover sets are used to reconstruct the details of the tree and, step by step, the whole tree. The point cloud and the sets are segmented into branches, after which the branches are modeled as collections of cylinders. From the model, the branching structure and size properties, such as volume and branch size distributions, can be approximated for the whole tree or some of its parts. The approach is validated using both measured and modeled terrestrial laser scanner data from real trees and detailed 3D models. The results show that the method allows an easy extraction of various tree attributes from terrestrial or mobile laser scanning point clouds.
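Once the branches are modeled as collections of cylinders, attributes such as total volume and a volume distribution per branch order follow directly from the cylinder parameters. A toy sketch (cylinder radii, lengths and orders are invented illustration values, not output of the authors' method):

```python
import math

# Each fitted cylinder carries a radius and length (metres) and the order
# of the branch it belongs to (0 = trunk, 1 = first-order branch, ...).
cylinders = [
    {"radius": 0.15, "length": 2.0, "branch_order": 0},  # trunk segments
    {"radius": 0.12, "length": 2.0, "branch_order": 0},
    {"radius": 0.05, "length": 1.0, "branch_order": 1},  # branch segments
    {"radius": 0.03, "length": 0.8, "branch_order": 1},
]

def total_volume(cyls):
    """Sum of cylinder volumes pi * r^2 * l, in cubic metres."""
    return sum(math.pi * c["radius"] ** 2 * c["length"] for c in cyls)

def volume_by_order(cyls):
    """Volume aggregated per branch order: a crude size distribution."""
    out = {}
    for c in cyls:
        out[c["branch_order"]] = (out.get(c["branch_order"], 0.0)
                                  + math.pi * c["radius"] ** 2 * c["length"])
    return out

vol = total_volume(cylinders)
dist = volume_by_order(cylinders)
```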

  10. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Science.gov (United States)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the most important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data to extract ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of different derived data for geographical information systems (GIS), mapping, and navigation. Regardless of what the scan data will be used for, an automatic process is greatly required to handle the large amount of data collected, because manual processing is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform and non-uniform surfaces, linear objects, and others. This primary classification is used on the one hand to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and on the other hand to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm, likewise based on topological relationships and height variation analysis, is developed to segment the uniform surfaces into building roofs, roads and ground. The proposed approach has been tested using two areas: the first is a housing complex and the second is a primary school. The proposed approach led to successful classification of the building, vegetation and road classes.
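The height-variation analysis used for the preliminary split between uniform and non-uniform surfaces can be sketched as follows (the cell size, spread threshold, and sample points are invented illustration values, not the paper's parameters):

```python
# Height-variation sketch: rasterize the cloud into XY cells and call a
# cell "uniform" (candidate roof/road/ground) when the spread of z inside
# it is small, "non-uniform" (e.g. vegetation) otherwise.
def classify_cells(points, cell=1.0, max_spread=0.2):
    """points: iterable of (x, y, z); returns {cell_index: label}."""
    cells = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        cells.setdefault(key, []).append(z)
    return {k: ("uniform" if max(v) - min(v) <= max_spread else "non-uniform")
            for k, v in cells.items()}

# A flat "road" cell and a "tree" cell with large vertical spread.
road = [(0.2, 0.3, 10.01), (0.7, 0.6, 10.04), (0.5, 0.9, 10.02)]
tree = [(5.1, 5.2, 10.0), (5.4, 5.6, 13.5), (5.8, 5.3, 12.1)]
labels = classify_cells(road + tree)
```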

  11. Sample collection system for gel electrophoresis

    Science.gov (United States)

    Olivares, Jose A.; Stark, Peter C.; Dunbar, John M.; Hill, Karen K.; Kuske, Cheryl R.; Roybal, Gustavo

    2004-09-21

    An automatic sample collection system for use with an electrophoretic slab gel system is presented. The collection system can be used with a slab gel having one or more lanes. A detector is used to detect particle bands on the slab gel within a detection zone. Such detectors may use a laser to excite fluorescently labeled particles. The fluorescent light emitted from the excited particles is transmitted to low-level light detection electronics. Upon the detection of a particle of interest within the detection zone, a syringe pump is activated, sending a stream of buffer solution across the lane of the slab gel. The buffer solution collects the sample of interest and carries it through a collection port into a sample collection vial.

  12. An automatic system for multidimensional integrated protein chromatography.

    Science.gov (United States)

    Kong, Yingjun; Li, Xiunan; Bai, Gaoying; Ma, Guanghui; Su, Zhiguo

    2010-10-29

    An automatic system for multidimensional integrated protein chromatography was designed for simultaneous separation of multiple proteins from complex mixtures, such as human plasma and tissue lysates. This computer-controlled system integrates several chromatographic columns that work independently or cooperatively with one another to achieve efficient, high throughput. The pipelines can be automatically switched either to another column or to a collection container for each UV-detected elution fraction. Environmental contamination is avoided due to the closed fluid paths and the elimination of manual column changes. This novel system was successfully used for simultaneous preparation of five proteins from the precipitate of human plasma fraction IV (fraction IV). The system involved gel filtration, ion exchange, hydrophobic interaction, and heparin affinity chromatography. Human serum albumin (HSA), transferrin (Tf), antithrombin-III (AT-III), alpha 1-antitrypsin (α1-AT), and haptoglobin (Hp) were purified within 3 h. The following recovery and purity were achieved: 95% (RSD, 2.8%) and 95% for HSA, 80% (RSD, 2.0%) and 99% for Tf, 70% (RSD, 2.1%) and 99% for AT-III, 65% (RSD, 2.0%) and 94% for α1-AT, and 50% (RSD, 1.0%) and 90% for Hp. The results demonstrate that this novel multidimensional integrated chromatography system is capable of simultaneously separating multiple protein products from the same raw material with high yield and purity, and that it has potential for a wide range of multi-step chromatographic separation processes.

  13. Automatic control of biomass gasifiers using fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Sagues, C. [Universidad de Zaragoza (Spain). Dpto. de Informatica e Ingenieria de Sistemas; Garcia-Bacaicoa, P.; Serrano, S. [Universidad de Zaragoza (Spain). Dpto. de Ingenieria Quimica y Medio Ambiente

    2007-03-15

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has proved very useful for testing different combinations of membership functions and rules in the proposed fuzzy controller. The global control scheme is shown, including the elements that generate the set points for the process variables automatically. In this scheme, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated. (author)

  14. Automatic control of biomass gasifiers using fuzzy inference systems.

    Science.gov (United States)

    Sagüés, C; García-Bacaicoa, P; Serrano, S

    2007-03-01

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need models to be tuned, a plant model is proposed which has proved very useful for testing different combinations of membership functions and rules in the proposed fuzzy controller. The global control scheme is shown, including the elements that generate the set points for the process variables automatically. In this scheme, the type of biomass and its moisture content are the only data which need to be introduced to the controller by a human operator at the beginning of operation to make it work autonomously. The advantages and good performance of the fuzzy controller with automatic generation of set points, compared to controllers utilising fixed parameters, are demonstrated.
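As a rough illustration of the inference step (not the authors' controller; all membership boundaries and rule consequents are invented), a temperature error could be fuzzified with triangular sets and defuzzified with a Sugeno-style weighted average:

```python
# Minimal fuzzy-inference sketch: three triangular membership functions on
# a temperature error and three rules driving an air-flow correction.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def airflow_correction(temp_error):
    """Sugeno-style weighted average of the rule consequents."""
    mu_low  = tri(temp_error, -100.0, -50.0,   0.0)  # "temperature too low"
    mu_ok   = tri(temp_error,  -50.0,   0.0,  50.0)  # "temperature on target"
    mu_high = tri(temp_error,    0.0,  50.0, 100.0)  # "temperature too high"
    # Rule consequents: increase air (+10), hold (0), decrease air (-10).
    weights = [mu_low, mu_ok, mu_high]
    outputs = [10.0, 0.0, -10.0]
    s = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / s if s else 0.0

delta = airflow_correction(-25.0)  # halfway between "low" and "ok"
```

Because the rules interpolate smoothly between set points, no explicit plant model is needed at run time, which matches the abstract's point that the model is used only offline to test membership functions and rules.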

  15. Automatic Traffic Monitoring from an Airborne Wide Angle Camera System

    OpenAIRE

    Rosenbaum, Dominik; Charmette, Baptiste; Kurz, Franz; Suri, Sahil; Thomas, Ulrike; Reinartz, Peter

    2008-01-01

    We present an automatic traffic monitoring approach using data from an airborne wide angle camera system. This camera, namely the "3K-Camera", was recently developed at the German Aerospace Center (DLR). It covers 8 km perpendicular to the flight direction at a flight height of 3000 m with a resolution of 45 cm, and is capable of taking images at a frame rate of up to 3 fps. Based on georeferenced images obtained from this camera system, a near real-time processing chain containing roa...

  16. New functional units for coke machine automatic control system

    Energy Technology Data Exchange (ETDEWEB)

    Parfenov, G.I.; Bannikov, L.S.; Vakarenko, I.M.; Grishin, S.P.

    1983-01-01

    A new device used in the control systems of coking plants is discussed. The system is capable of operating in fully automatic, semi-automatic, or manual modes. Examples of the usage of the unit include the stopping of coke machines within limits of +/- 200 mm. It is concluded that the use of the units reduces manufacturing, adjustment, and service costs.

  17. Automatic Battery Swap System for Home Robots

    Directory of Open Access Journals (Sweden)

    Juan Wu

    2012-12-01

    Full Text Available This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off-line recharging and on-line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm-sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap station wirelessly through ZigBee. The influences of battery case deflection and robot docking deflection on the battery swap operations have been investigated. The experimental results show that it takes an average time of 84.2 s to complete the battery swap operations, so the home robot does not have to wait several hours for its batteries to be fully charged. The proposed battery swap system proves efficient in home robot applications that require the robots to work continuously over a long period.

  18. PLC Based Automatic Multistoried Car Parking System

    Directory of Open Access Journals (Sweden)

    Swanand S .Vaze

    2014-12-01

    Full Text Available This project work presents the study and design of a PLC based automatic multistoried car parking system. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Realizing such an arrangement in a real plant requires highly sophisticated instruments; in this project a prototype of such a model is made, accommodating twelve cars at a time. Availability of parking space is detected by an optical proximity sensor placed on the pallet. A motor-controlled elevator is used to lift the cars, and the elevator status is indicated by an LED placed on the ground floor. Control of the platforms and checking of vacancies is handled by the PLC. For unparking a car, a keyboard is interfaced with the model for selection of the required platform. Automation is applied to reduce the space requirement and also to reduce human errors, which in turn results in the highest security and greatest flexibility. Due to these advantages, this system can be used in hotels, railway stations and airports, where car congestion is high.

  19. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical-experimental study of the effects of an automatic shading device on the energy performance of a dimmable lighting system and a cooling plant. Some equations related to fenestration optical and thermal properties are rebuilt, and others are derived, under a theoretical approach. In order to collect field data, the energy demand - among other variables - was measured on two distinct stories, with the same fenestration features, of the Test Tower. New data were gathered after adding an automatic shading device to the window of one story. Comparison of the collected data allows the energy performance of the shading device to be evaluated. (author) 136 refs., 55 figs., 6 tabs.

  20. The automatic calibration of Korean VLBI Network data

    CERN Document Server

    Hodgson, Jeffrey A; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-01-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence, and hence sensitivity, at the higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited to automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against manually reduced VLBI data. We find that the pipelined data using phase transfer produce better results than a manually reduced dataset not using the phase transfer. Additionally, we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.

  1. The Automatic Calibration of Korean VLBI Network Data

    Science.gov (United States)

    Hodgson, Jeffrey A.; Lee, Sang-Sung; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-08-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence, and hence sensitivity, at the higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited to automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against manually reduced VLBI data. We find that the pipelined data using phase transfer produce better results than a manually reduced dataset not using the phase transfer. Additionally, we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.
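The frequency phase transfer rests on the fact that a non-dispersive (tropospheric) delay τ produces a phase 2πντ, so a phase solution obtained at a low band scales to a high band by the frequency ratio. A minimal sketch with illustrative band frequencies and delay (not KVN Pipeline code):

```python
import numpy as np

# Frequency phase transfer sketch: a non-dispersive delay tau gives
# phase = 2*pi*nu*tau, so the low-band phase solution scales by nu_high/nu_low.
def transfer_phase(phi_low_rad, nu_low_ghz, nu_high_ghz):
    """Predict the high-band phase from the low-band phase solution."""
    return phi_low_rad * (nu_high_ghz / nu_low_ghz)

nu_low, nu_high = 21.7, 86.2       # GHz: illustrative low/high bands
tau = 3.0e-12                      # seconds of residual tropospheric delay
phi_low = 2 * np.pi * nu_low * 1e9 * tau
phi_high_pred = transfer_phase(phi_low, nu_low, nu_high)
phi_high_true = 2 * np.pi * nu_high * 1e9 * tau
assert np.isclose(phi_high_pred, phi_high_true)
```

Because the three antennas are identical and the four bands share the same optics, the scaling holds telescope by telescope, which is what makes the transfer well suited to an automated pipeline.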

  2. Data mining based study on quality of water level data of Three Gorges Reservoir Automatic Dispatching System

    Institute of Scientific and Technical Information of China (English)

    杨旭; 刘宇

    2011-01-01

    The Three Gorges Reservoir Automatic Dispatching System is responsible for collecting and analyzing nearly 200 telemetered water level readings, and can currently acquire and transmit data at intervals of 30 s to 10 min. However, abnormal data sometimes appear in the system, and data are occasionally missing, due to equipment and communication problems; these factors affect data quality. Manually searching for erroneous data within such a mass of records is not realistic. To solve this problem, this paper introduces completeness and validity metrics to measure data quality and carries out a feasibility analysis based on data mining techniques, so as to provide a reference for solving similar problems.

  3. Measuring Service Reliability Using Automatic Vehicle Location Data

    Directory of Open Access Journals (Sweden)

    Zhenliang Ma

    2014-01-01

    Full Text Available Bus service reliability has become a major concern for both operators and passengers. Buffer time measures are believed to be appropriate for approximating passengers' experienced reliability in the context of departure planning. Two issues in buffer time estimation are addressed, namely performance disaggregation and capturing passengers' perspectives on reliability. A Gaussian mixture model-based method is applied to disaggregate the performance data. Based on the mixture distribution, a reliability buffer time (RBT) measure is proposed from the passengers' perspective. A set of expected reliability buffer time measures is developed for operators by combining RBTs at different spatial-temporal levels. Average and latest trip duration measures are proposed that passengers can use to choose a service mode and determine their departure time. Using empirical data from the automatic vehicle location system in Brisbane, Australia, the existence of mixed service states is verified and the advantage of the mixture distribution model in fitting travel time profiles is demonstrated. Numerical experiments validate that the proposed reliability measure quantifies service reliability consistently, while conventional measures may yield inconsistent results. Potential applications for operators and passengers are also illustrated, including reliability improvement and trip planning.
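
    The buffer-time idea in this abstract can be illustrated with a minimal sketch. The paper's actual RBT is derived from a fitted Gaussian mixture; here, as a simplification, the buffer is computed directly from the empirical travel-time distribution (an upper percentile minus the median), and all function names and the synthetic two-state data are illustrative assumptions, not the authors' code.

```python
import numpy as np

def reliability_buffer_time(travel_times, percentile=95):
    """Extra time to budget beyond the typical (median) trip.

    Sketch of a buffer-time style measure: the gap between an upper
    percentile and the median of observed travel times (minutes).
    """
    t = np.asarray(travel_times, dtype=float)
    return float(np.percentile(t, percentile) - np.median(t))

# Synthetic example: a mixture of a punctual and a delayed service state,
# mimicking the "mixture service states" the paper identifies.
rng = np.random.default_rng(0)
on_time = rng.normal(30, 2, 800)    # typical trips, ~30 min
delayed = rng.normal(45, 5, 200)    # disrupted trips, ~45 min
rbt = reliability_buffer_time(np.concatenate([on_time, delayed]))
```

A departure planner would add `rbt` minutes on top of the median trip time to arrive on time in roughly 95% of cases.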

  4. Simple Approaches to Improve the Automatic Inventory of Zebra Crossings from MLS Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    City management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling becomes highly relevant when the city has to be managed using geospatial databases or Geographic Information Systems. On the other hand, laser scanning technology has experienced significant growth in recent years; in particular, terrestrial mobile laser scanning platforms are increasingly used for inventory purposes in both city and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, these data are not directly exploitable by management systems, constraining the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossings. The algorithm is divided into three main steps: road segmentation (based on a PCA analysis of the points contained in each cycle of data collected by the mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough Transform and logical constraints for line classification). After evaluating different datasets collected in three cities located in Northwest Spain (comprising 25 strips with 30 visible zebra crossings), a completeness of 83% was achieved.
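
    The rasterization step named in the abstract (point cloud to intensity image) can be sketched as a simple grid-averaging pass. The cell size, function name and toy points below are assumptions for illustration; the authors' implementation details are not given in the abstract.

```python
import numpy as np

def rasterize_intensity(points, intensities, cell=0.05):
    """Average LiDAR intensity of the (x, y) points falling in each grid cell.

    This produces the 2-D raster on which bright zebra stripes can later
    be searched, e.g. with a Hough transform for line detection.
    """
    pts = np.asarray(points, dtype=float)
    vals = np.asarray(intensities, dtype=float)
    ix = ((pts[:, 0] - pts[:, 0].min()) / cell).astype(int)
    iy = ((pts[:, 1] - pts[:, 1].min()) / cell).astype(int)
    total = np.zeros((ix.max() + 1, iy.max() + 1))
    count = np.zeros_like(total)
    np.add.at(total, (ix, iy), vals)   # accumulate per-cell intensity sums
    np.add.at(count, (ix, iy), 1)      # and per-cell point counts
    return total / np.maximum(count, 1)  # empty cells stay 0

# two road points 6 cm apart land in adjacent 5 cm cells
img = rasterize_intensity([[0.0, 0.0], [0.06, 0.0]], [1.0, 3.0])
```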

  5. SABER-School Finance: Data Collection Instrument

    Science.gov (United States)

    King, Elizabeth; Patrinos, Harry; Rogers, Halsey

    2015-01-01

    The aim of the SABER-school finance initiative is to collect, analyze and disseminate comparable data about education finance systems across countries. SABER-school finance assesses education finance systems along six policy goals: (i) ensuring basic conditions for learning; (ii) monitoring learning conditions and outcomes; (iii) overseeing…

  6. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    Wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation and processing of big data arrays all require a high level of productivity together with minimum time for data handling and delivery of results. To achieve the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some basic task scheduling methods for multi-machine process control systems, highlights their advantages and disadvantages, and offers some considerations for their use when developing software for automatic process control systems.
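
    One of the basic multiprocessor scheduling heuristics such a survey typically covers is Longest-Processing-Time-first list scheduling. The sketch below is a generic textbook version, not taken from the paper; the job names are invented for illustration.

```python
import heapq

def lpt_schedule(tasks, n_machines):
    """Longest-Processing-Time-first list scheduling.

    Classic heuristic for minimizing makespan on identical machines:
    sort tasks by decreasing duration, then always assign the next task
    to the least loaded machine. Returns the assignment and the makespan.
    """
    heap = [(0.0, m) for m in range(n_machines)]   # (load, machine id)
    heapq.heapify(heap)
    assignment = {m: [] for m in range(n_machines)}
    for name, dur in sorted(tasks, key=lambda t: -t[1]):
        load, m = heapq.heappop(heap)              # least loaded machine
        assignment[m].append(name)
        heapq.heappush(heap, (load + dur, m))
    makespan = max(load for load, _ in heap)
    return assignment, makespan

# hypothetical processing jobs as (name, duration)
jobs = [("fft", 5), ("log", 2), ("ctl", 4), ("io", 3), ("agg", 3)]
plan, makespan = lpt_schedule(jobs, 2)  # → makespan 9 on two machines
```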

  7. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common in laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0-2.5 micron or 5000-4000 cm-1) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.
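
    A common first step behind absorption-band analysis of the kind this abstract describes is continuum removal followed by band-depth measurement. The sketch below uses a straight-line continuum between the spectrum endpoints, which is a simplification of the paper's curve-fitting method; the synthetic Gaussian band and all names are illustrative assumptions.

```python
import numpy as np

def band_depth(wavelengths, reflectance):
    """Depth and position of an absorption band after continuum removal.

    The continuum is approximated by the straight line joining the two
    spectrum endpoints; depth = 1 - R/continuum at the band minimum.
    """
    w = np.asarray(wavelengths, dtype=float)
    r = np.asarray(reflectance, dtype=float)
    continuum = np.interp(w, [w[0], w[-1]], [r[0], r[-1]])
    removed = r / continuum
    i = int(np.argmin(removed))
    return w[i], 1.0 - removed[i]

# symmetric (Gaussian) band centred at 2.20 microns on a flat continuum,
# in the SWIR region the paper demonstrates (2.0-2.5 micron)
w = np.linspace(2.0, 2.5, 251)
r = 0.8 - 0.3 * np.exp(-((w - 2.20) / 0.02) ** 2)
center, depth = band_depth(w, r)
```

A small shift of `center` between pixels is the kind of signal the paper exploits to map varying white mica chemistry.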

  8. Review of developments in electronic, clinical data collection and documentation systems over the last decade – Are we ready for Big Data in routine health care?

    Directory of Open Access Journals (Sweden)

    Kerstin Anne Kessel

    2016-03-01

    Full Text Available Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors including imaging, molecular or pathological markers, surgical results and the patient's preference. In this context, the term Big Data has evolved also in health care, and the hype is heavily discussed in the literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the three V's: volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or post-processing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the fields of clinical trial and analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important and economically viable field of application.

  9. Review of Developments in Electronic, Clinical Data Collection, and Documentation Systems over the Last Decade - Are We Ready for Big Data in Routine Health Care?

    Science.gov (United States)

    Kessel, Kerstin A; Combs, Stephanie E

    2016-01-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors, including imaging, molecular or pathological markers, surgical results, and patient's preference. In this context, the term "Big Data" evolved also in health care. The "hype" is heavily discussed in literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, but it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the "three V's": volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or postprocessing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important, and economically viable field of application.

  10. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Full Text Available Traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing increasingly complex avionics systems. In this study, we propose a general method for module automatic testing on a PXI bus-based avionics test platform. We apply virtual instrument technology to realize automatic testing and fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. Comprehensive experiments show that the proposed method can effectively accomplish automatic testing and fault reporting of signal performance, greatly improving the generality and reliability of ATE in avionics systems.
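
    The "automatic testing and fault reporting" flow the abstract describes can be sketched language-agnostically as a sequence runner that collects per-check results instead of aborting on the first fault. The paper's implementation is in LabVIEW; this Python sketch, the 32-bit word value, and the odd-parity check are illustrative assumptions (ARINC429 does use an odd parity bit).

```python
def run_test_sequence(tests):
    """Run named checks in order, collecting PASS/FAIL instead of aborting.

    Each check verifies one signal property and raises AssertionError on
    a fault; all results are gathered into a single report.
    """
    report = {}
    for name, check in tests:
        try:
            check()
            report[name] = "PASS"
        except AssertionError as err:
            report[name] = f"FAIL: {err}"
    return report

def check_parity():
    # hypothetical ARINC429 word; the top bit makes the ones-count odd
    word = 0x80000204
    assert bin(word).count("1") % 2 == 1, "parity error"

report = run_test_sequence([("parity", check_parity)])
```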

  11. Monitoring, analysis and classification of vegetation and soil data collected by a small and lightweight hyperspectral imaging system

    Science.gov (United States)

    Mönnig, Carsten

    2014-05-01

    The increasing precision of modern farming systems requires near-real-time monitoring of agricultural crops in order to estimate soil condition, plant health and potential crop yield. For large agricultural plots, satellite imagery or aerial surveys can be used, at considerable cost and with possible time delays of days or even weeks. For small to medium sized plots, however, these monitoring approaches are cost-prohibitive and difficult to assess. Therefore, within the INTERREG IV A-Project SMART INSPECTORS (Smart Aerial Test Rigs with Infrared Spectrometers and Radar), we propose a cost-effective, comparatively simple approach to support farmers with a small and lightweight hyperspectral imaging system that collects remotely sensed data in spectral bands between 400 and 1700 nm. SMART INSPECTORS covers the whole small-scale remote sensing processing chain, from sensor construction and data processing to ground truthing for analysis of the results. The sensors are mounted on a remotely controlled (RC) octocopter, a fixed-wing RC airplane, as well as on a two-seated autogyro for larger plots. The high-resolution images, with a ground resolution of up to 5 cm, include visible light, near and thermal infrared, as well as hyperspectral imagery. The data will be analyzed using remote sensing software and a Geographic Information System (GIS). The soil condition analysis includes soil humidity, temperature and roughness. Furthermore, a radar sensor is envisaged for geomorphologic, drainage and soil-plant roughness investigations. Plant health control includes drought stress, vegetation health, pest control, growth condition and canopy temperature. Different vegetation and soil indices will help to determine and understand soil conditions and plant traits. Additional investigations might include crop yield estimation for certain crops like apples, strawberries, pasture land, etc. The quality of remotely sensed vegetation data will be tested with

  12. Development and field-testing of the BENTO box: A new satellite-linked data collection system for volcano monitoring

    Science.gov (United States)

    Roman, D. C.; Behar, A.; Elkins-Tanton, L. T.; Fouch, M. J.

    2013-12-01

    Currently it is impossible to monitor all of Earth's hazardous volcanoes for precursory eruption signals, and it is particularly difficult to monitor volcanoes in remote regions. The primary constraint is the high cost of deploying monitoring instrumentation (e.g., seismometers, gas sensors), which includes the cost of reliable, high-resolution sensors, the cost of maintenance (including periodic travel to remote areas), and the cost/difficulty of developing remote data telemetry. We are developing an integrated monitoring system, the BENTO (Behar's ENvironmental Telemetry and Observation) box, that will allow identification of restless volcanoes through widespread deployment of robust, lightweight, low-cost, easily deployable monitoring/telemetry systems. Ultimately, we expect that this strategy will lead to more efficient allocation of instrumentation and associated costs. BENTO boxes are portable, autonomous, self-contained data collection systems designed for long-term operation (up to ~12 months) in remote environments. They use low-cost two-way communication through the commercial Iridium satellite network and, depending on data types, can pre-process raw data onboard to obtain useful summary statistics for transmission through Iridium. BENTO boxes can also receive commands through Iridium, allowing, for example, remote adjustment of sampling rates, or requests for segments of raw data in cases where only summary statistics are routinely transmitted. Currently, BENTO boxes can measure weather parameters (e.g., windspeed, wind direction, rainfall, humidity, atmospheric pressure), volcanic gas (CO2, SO2, and halogen) concentrations, and seismicity. In the future, we plan to interface BENTO boxes with additional sensors such as atmospheric pressure/infrasound, tilt, GPS and temperature.
    We are currently field-testing 'BENTO 1' boxes equipped with gas and meteorological sensors at Telica Volcano, Nicaragua; Kilauea Volcano, Hawai

  13. Automatization of hardware configuration for plasma diagnostic system

    Science.gov (United States)

    Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.

    2016-09-01

    Soft X-ray plasma measurement systems are mostly multi-channel, high-performance systems. In the case of a modular construction, it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. In this paper the structure of the modular system designed for tokamak plasma soft X-ray measurements is described, and the concept of system discovery and subsequent automatic configuration is presented. The FCS application (FMC/FPGA Configuration Software) is used to run the sophisticated system setup with automatic verification of proper configuration. In order to provide flexibility for further system configurations (e.g. user setup), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multi-channel measurements are key requirements in terms of SXR diagnostics using GEM detectors.

  14. Automatic reference level control for an antenna pattern recording system

    Science.gov (United States)

    Lipin, R., Jr.

    1971-01-01

    An automatic gain control system keeps recorder reference levels within 0.2 decibels during operation. The system reduces recorder drift during antenna radiation distribution determinations over an eight-hour period.

  15. Review of Developments in Electronic, Clinical Data Collection, and Documentation Systems over the Last Decade – Are We Ready for Big Data in Routine Health Care?

    Science.gov (United States)

    Kessel, Kerstin A.; Combs, Stephanie E.

    2016-01-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors, including imaging, molecular or pathological markers, surgical results, and patient’s preference. In this context, the term “Big Data” evolved also in health care. The “hype” is heavily discussed in literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, but it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data – the “three V’s”: volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or postprocessing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important, and economically viable field of application. PMID:27066456

  16. Requirements to a Norwegian national automatic gamma monitoring system

    DEFF Research Database (Denmark)

    Lauritzen, B.; Jensen, Per Hedemann; Nielsen, F.

    2005-01-01

    An assessment of the overall requirements for a Norwegian gamma-monitoring network is undertaken, with special emphasis on the geographical distribution of automatic gamma monitoring stations, the type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate... large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume may not slip unnoticed in between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require...

  17. Automatic Discovery of Non-Compositional Compounds in Parallel Data

    CERN Document Server

    Melamed, I D

    1997-01-01

    Automatic segmentation of text into minimal content-bearing units is an unsolved problem even for languages like English. Spaces between words offer an easy first approximation, but this approximation is not good enough for machine translation (MT), where many word sequences are not translated word-for-word. This paper presents an efficient automatic method for discovering sequences of words that are translated as a unit. The method proceeds by comparing pairs of statistical translation models induced from parallel texts in two languages. It can discover hundreds of non-compositional compounds on each iteration, and constructs longer compounds out of shorter ones. Objective evaluation on a simple machine translation task has shown the method's potential to improve the quality of MT output. The method makes few assumptions about the data, so it can be applied to parallel data other than parallel texts, such as word spellings and pronunciations.

  18. Automatic removal of outliers in hydrologic time series and quality control of rainfall data: processing a real-time database of the Local System for Flood Monitoring in Klodzko County, Poland

    Science.gov (United States)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kryza, Maciej; Szymanowski, Mariusz

    2013-04-01

    Real-time hydrological forecasting requires the highest quality of both the hydrologic and the meteorological data collected in a given river basin. Large outliers may lead to inaccurate predictions, with substantial departures between observations and prognoses even in the short term. Although both riverflow and rainfall data must be correct, they cannot be processed in the same way to produce a filtered output. Indeed, a hydrologic time series at a given gauge can be interpolated in the time domain after suspicious values have been detected, provided that no outlier has been detected at the upstream sites. For rainfall data, interpolation is not suitable, because potential outliers at a given site cannot be verified against data from other sites, especially in complex terrain: very local convective events may occur, leading to large rainfall peaks over a limited area. Hence, instead of interpolating the data, we perform a flagging procedure that ranks outliers according to their likelihood of occurrence. Following these assumptions, we have developed several modules that serve the purpose of fully automated correction of a database updated in real time every 15 minutes; the main objective of the work was to produce a high-quality database for hydrologic rainfall-runoff modeling and ensemble prediction. The database in question is available courtesy of the County Office in Kłodzko (SW Poland), the institution which owns and maintains the Local System for Flood Monitoring in Kłodzko County. The dedicated prediction system, known as HydroProg, is now being built at the University of Wrocław (Poland). As with the entire prediction system, the correction modules work automatically in real time, are developed in the R language, and are plugged into a larger IT infrastructure. Hydrologic time series, which are water levels recorded every 15 minutes at 22 gauges located in Kłodzko County, are
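
    The "flag and rank, don't edit" idea for rainfall series can be sketched with a Hampel-style robust score: compare each value with the median of its local window and order candidates by deviation in robust-sigma units. The authors' modules are in R and their scoring is not specified in the abstract; this Python sketch, the window size, and the toy series are assumptions for illustration.

```python
import numpy as np

def flag_outliers(series, window=5):
    """Rank suspicious values in a 15-min series instead of editing them.

    Hampel-style test: score each value by its deviation from the local
    window median, in units of the robust sigma (1.4826 * MAD). Returns
    indices ordered most-suspicious-first, plus the scores themselves.
    """
    x = np.asarray(series, dtype=float)
    half = window // 2
    scores = np.zeros_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        med = np.median(x[lo:hi])
        mad = np.median(np.abs(x[lo:hi] - med))
        sigma = 1.4826 * mad if mad > 0 else 1e-9  # guard flat windows
        scores[i] = abs(x[i] - med) / sigma
    return np.argsort(-scores), scores

levels = [10.1, 10.2, 10.1, 55.0, 10.3, 10.2, 10.1]   # one obvious spike
order, scores = flag_outliers(levels)
```

Downstream logic (or an operator) then decides what to do with the top-ranked values, which matches the paper's choice of flagging over interpolation for rainfall.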

  19. Automatic control system design of laser interferometer

    Science.gov (United States)

    Lu, Qingjie; Li, Chunjie; Sun, Hao; Ren, Shaohua; Han, Sen

    2015-10-01

    Traditional optical adjustment in interferometry has many shortcomings, such as low accuracy, time consumption, labor intensity, lack of control and poor repeatability, so we address the problem with a wireless remote control system. Compared to the traditional method, the effects of vibration and air turbulence are avoided. In addition, the system offers low cost, high reliability and easy operation. Furthermore, switching between the two charge-coupled devices (CCDs) used to collect different images can be easily achieved with this wireless remote control system. Wireless transmission is achieved by using a Radio Frequency (RF) module and programming the controller; pulse width modulation (PWM) of the direct current (DC) motor, real-time switching of the relay and high-accuracy displacement control of the FAULHABER motor are available. Verification tests show that the control system has good stability, with a packet loss rate below 5%, high control accuracy and millisecond response speed.

  20. Automatic Multimedia Creation Enriched with Dynamic Conceptual Data

    Directory of Open Access Journals (Sweden)

    Angel Martín

    2012-12-01

    Full Text Available There is a growing gap between multimedia production and context-centric multimedia services. The main problem is under-exploitation of the content creation design. The idea is to support dynamic content generation adapted to the user or display profile. Our work is an implementation of a web platform for the automatic generation of multimedia presentations based on the SMIL (Synchronized Multimedia Integration Language) standard. The system is able to produce rich media with dynamic multimedia content retrieved automatically from different content databases matching the semantic context. For this purpose, we extend the standard interpretation of SMIL tags in order to accomplish a semantic translation of multimedia objects into database queries. This permits services to benefit from the production process to create customized content enhanced with real-time information fed from databases. The described system has been successfully deployed to create advanced context-centric weather forecasts.
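
    The final assembly step of such a platform, turning resolved media items into a SMIL document, can be sketched with a few lines of XML generation. The element set below is a minimal subset of SMIL (a `<seq>` timing container inside `<body>`); the file names and the helper are invented for illustration, and the paper's semantic query layer is not modelled here.

```python
import xml.etree.ElementTree as ET

def build_smil(items):
    """Assemble a minimal SMIL presentation from (media-tag, src) pairs.

    Each pair becomes a media element inside a sequential <seq>
    container, so the items play one after another.
    """
    smil = ET.Element("smil")
    body = ET.SubElement(smil, "body")
    seq = ET.SubElement(body, "seq")
    for tag, src in items:           # tag: "img", "video", "audio", ...
        ET.SubElement(seq, tag, src=src)
    return ET.tostring(smil, encoding="unicode")

# media that a database query for the current weather context might return
doc = build_smil([("img", "forecast_map.png"), ("video", "intro.mp4")])
```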

  1. Automatic latency equalization in VHDL-implemented complex pipelined systems

    Science.gov (United States)

    Zabołotny, Wojciech M.

    2016-09-01

    In pipelined data processing systems it is very important to ensure that parallel paths delay data by the same number of clock cycles. If that condition is not met, the processing blocks receive data that are not properly aligned in time and produce incorrect results. Manual equalization of latencies is tedious and error-prone work. This paper presents an automatic method of latency equalization in systems described in VHDL. The proposed method uses simulation to measure latencies and to verify the introduced correction. The solution is portable between different simulation and synthesis tools. The method does not increase the complexity of the synthesized design compared to a solution based on manual latency adjustment. An example implementation of the proposed methodology, together with a simple design demonstrating its use, is available as an open-source project under the BSD license.
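
    The core computation behind latency equalization is simple once per-path latencies have been measured: pad every faster path up to the slowest one. The sketch below models paths as measured clock-cycle counts; the path names are invented, and the paper's actual simulation-based measurement and VHDL code generation are not reproduced here.

```python
def equalize_latencies(path_latencies):
    """Number of delay cycles to append to each parallel pipeline path.

    Mirrors what an automatic equalizer must emit: given each path's
    measured latency (in clock cycles), pad the faster paths with delay
    registers so that all outputs arrive on the same cycle.
    """
    target = max(path_latencies.values())      # slowest path sets the pace
    return {name: target - lat for name, lat in path_latencies.items()}

# e.g. a 3-cycle multiplier beside a 1-cycle adder and a 0-cycle bypass
pads = equalize_latencies({"mult": 3, "add": 1, "bypass": 0})
# → {"mult": 0, "add": 2, "bypass": 3}
```

In the VHDL setting, each nonzero entry corresponds to a chain of that many registers inserted on the path's output.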

  2. Reconstruction of the sea surface elevation from the analysis of the data collected by a wave radar system

    Science.gov (United States)

    Ludeno, Giovanni; Soldovieri, Francesco; Serafino, Francesco; Lugni, Claudio; Fucile, Fabio; Bulian, Gabriele

    2016-04-01

    An X-band radar system is able to provide information about the direction and intensity of sea surface currents and dominant waves within a range of a few kilometres from the observation point (up to 3 nautical miles). This capability, together with their flexibility and low cost, makes these devices useful tools for sea monitoring in both coastal and off-shore areas. The data collected by a wave radar system can be analyzed using the inversion strategy presented in [1,2] to estimate the following sea parameters: peak wave direction; peak period; peak wavelength; significant wave height; sea surface current and bathymetry. The estimation of the significant wave height represents a limitation of the wave radar system, because the radar backscatter is not directly related to the sea surface elevation. Indeed, substantial research has recently been carried out to estimate significant wave height from radar images, either with or without calibration using in-situ measurements. In this work, we present two alternative approaches for the reconstruction of the sea surface elevation from wave radar images. The first approach is based on an approximated version of the modulation transfer function (MTF), tuned from a series of numerical simulations, following the line of [3]. The second approach is based on the inversion of radar images using a direct regularised least squares technique: assuming a linearised model for the tilt modulation, the sea elevation is reconstructed as a least squares fit of the radar imaging data [4]. References [1] F. Serafino, C. Lugni, and F. Soldovieri, "A novel strategy for the surface current determination from marine X-band radar data," IEEE Geosci. Remote Sens. Lett., vol. 7, no. 2, pp. 231-235, Apr. 2010. [2] Ludeno, G., Brandini, C., Lugni, C., Arturi, D., Natale, A., Soldovieri, F., Serafino, F. (2014). Remocean System for the Detection of the Reflected Waves from the Costa

  3. MAD data collection - current trends.

    Energy Technology Data Exchange (ETDEWEB)

    Dementieva, I.; Evans, G.; Joachimiak, A.; Sanishvili, R.; Walsh, M. A.

    1999-09-20

    The multi-wavelength anomalous diffraction, or MAD, method of determining protein structure is becoming routine in protein crystallography. An increase in the number of tuneable synchrotron beamlines, coupled with the widespread availability of fast-readout position-sensitive X-ray detectors based on charge-coupled devices, has raised MAD structure determination to a new and exciting level. Ultra-fast MAD data collection is now possible. Recognition of the value of selenium for phasing protein structures and improvement of methods for incorporating selenium into proteins in the form of selenomethionine have attracted greater interest in the MAD method. Recent developments in crystallographic software are complementing the above advances, paving the way for rapid protein structure determination. An overview of a typical MAD experiment is described here, with emphasis on the rates and quality of data acquisition now achievable at beamlines developed at third-generation synchrotron sources.

  4. Data collection architecture for big data - A framework for a research agenda

    NARCIS (Netherlands)

    Hofman, W.J.

    2015-01-01

    As big data is expected to contribute largely to economic growth, scalability of solutions becomes apparent for deployment by organisations. This requires automatic collection and processing of large, heterogeneous data sets from a variety of resources, dealing with various aspects like improving quality

  5. Automatic Classification of Variable Stars in Catalogs with missing data

    CERN Document Server

    Pichara, Karim

    2013-01-01

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks, a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilises sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model we use three catalogs with missing data (SAGE, 2MASS and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing-data catalogs is included, how our method compares to traditional missing-data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent, and by 15% for quasar detection, while keeping the computational co...

  6. Generalisation and extension of a web-based data collection system for clinical studies using Java and CORBA.

    Science.gov (United States)

    Eich, H P; Ohmann, C

    1999-01-01

    Inadequate informatics support of multi-centre clinical trials leads to poor quality. In order to support a multi-centre clinical trial, a data collection system via the WWW and Internet based on Java has been developed. In this study a generalisation and extension of this prototype have been performed. The prototype has been applied to another clinical trial, and a knowledge server based on C++ has been integrated via CORBA. The investigation and implementation of security aspects of web-based data collection are now under evaluation.

  7. Automatic digital photo-book making system

    Science.gov (United States)

    Wang, Wiley; Teo, Patrick; Muzzolini, Russ

    2010-02-01

    The diversity of photo products has grown more than ever before. A group of photos is not only printed individually, but can also be arranged in a specific order to tell a story, such as in a photo book, a calendar or a poster collage. As with making a traditional scrapbook, digital photo book tools allow the user to choose a book style/theme, page layouts, backgrounds and the way the pictures are arranged. This process is often time-consuming for users, given the number of images and the choices of layout/background combinations. In this paper, we develop a system to automatically generate photo books with only a few initial selections required. The system utilizes time stamps, color indices, orientations and other image properties to best fit pictures into a final photo book. The common way of telling a story is to lay the pictures out in chronological order. Pictures that are proximate in time coincide with each other and are often logically related, so they naturally cluster along a time line. Breaks between clusters can be used as a guide to separate pages or spreads; thus, pictures that are logically related can stay close on the same page or spread. When people make a photo book, it is helpful to start with chronologically grouped images, but time alone won't be enough to complete the process. Each page is limited by the number of layouts available. Many aesthetic rules also apply, such as emphasis of preferred pictures, consistency of local image density throughout the whole book, matching a background to the content of the images, and variety across adjacent page layouts. We developed an algorithm to group images onto pages under the constraints of aesthetic rules. We also apply content analysis based on the color and blurriness of each picture to match backgrounds and adjust page layouts. Some of our aesthetic rules are fixed and given by designers. Other aesthetic rules are statistical models trained by using
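
    The time-clustering step the abstract describes (breaks in the timeline suggest page or spread boundaries) can be sketched as a simple gap-threshold pass over sorted timestamps. The two-hour threshold and the sample shoot times are assumptions for illustration; the paper does not specify its gap criterion.

```python
from datetime import datetime, timedelta

def cluster_by_time(timestamps, gap=timedelta(hours=2)):
    """Group chronologically ordered photos wherever a large time gap occurs.

    Breaks between clusters suggest page/spread boundaries, so logically
    related shots stay together in the final book.
    """
    ordered = sorted(timestamps)
    clusters = [[ordered[0]]]
    for prev, cur in zip(ordered, ordered[1:]):
        if cur - prev > gap:          # big gap -> start a new page/spread
            clusters.append([])
        clusters[-1].append(cur)
    return clusters

# a morning shoot and an afternoon shoot on the same day
shots = [datetime(2010, 2, 1, h, m) for h, m in
         [(9, 0), (9, 12), (9, 30), (14, 0), (14, 5)]]
pages = cluster_by_time(shots)   # → two clusters: morning and afternoon
```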

  8. Development of a System for Automatic Facial Expression Analysis

    Science.gov (United States)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. While many samples are desirable for accurately estimating a person's feelings (e.g. likeness) about a machine interface, in real-world situations only a small number of samples can be obtained, because of the high cost of collecting emotions from the observed person. This paper proposes a system that solves this problem while conforming to individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved better generalization performance than former HNN and Support Vector Machine (SVM) classifiers, using less learning time than the SVM classifiers.

  9. 15 CFR 990.43 - Data collection.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Data collection. 990.43 Section 990.43... DAMAGE ASSESSMENTS Preassessment Phase § 990.43 Data collection. Trustees may conduct data collection and analyses that are reasonably related to Preassessment Phase activities. Data collection and analysis...

  10. Guidelines for Automatic Data Processing Physical Security and Risk Management. Federal Information Processing Standards Publication 31.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC.

    These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…

  11. Automatic estimation of excavation volume from laser mobile mapping data for mountain road widening

    NARCIS (Netherlands)

    Wang, J.; González-Jorge, H.; Lindenbergh, R.; Arias-Sánchez, P.; Menenti, M.

    2013-01-01

    Roads play an indispensable role as part of the infrastructure of society. In recent years, society has witnessed the rapid development of laser mobile mapping systems (LMMS) which, at high measurement rates, acquire dense and accurate point cloud data. This paper presents a way to automatically estimate excavation volume from such laser mobile mapping data for mountain road widening.

  12. Automatic Data Acquisition System for Automotive Fuel Oil Heater

    Institute of Scientific and Technical Information of China (English)

    张铁壁; 孙士尉; 夏国明; 马晓辉; 张学军

    2013-01-01

    In order to solve problems existing in current data acquisition systems for automotive fuel oil heaters, a data acquisition system based on the RS-485 bus was researched and developed. In this system, personnel information, product serial numbers and various parameters are input and set up using a touch screen; the acquisition module then inputs all heater data to a PLC, and the temperature measurement data are corrected with the least squares method. Operation results show that the system offers easy operation, accurate data and good applicability, and has high promotional value.

  13. Automatic stabilization of velocity for ultrasonic vibration system

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Describes the structure of a current-feedback ultrasonic generation system with such characteristics as velocity stabilization and automatic frequency tracking, discusses the velocity stabilization principle, and points out that successful frequency tracking is a precondition for velocity stabilization.

  14. The Diagnostic System of the A-604 Automatic Transmission

    Directory of Open Access Journals (Sweden)

    Czaban Jaroslaw

    2014-09-01

    Full Text Available The automatic gearbox is gaining increasing popularity in Europe. The limited interest in diagnosing this type of transmission in Poland results from its small share in the overall market of operated cars, so special diagnostic devices are not readily available. These factors make repairs expensive, often involving replacement of a subassembly with a new or aftermarket one. Prophylactic diagnostic tests, which could eliminate future gearbox failures, are conducted only to a small extent. In the paper, a proposed diagnostic system for the popular A-604 gearbox is presented. The authors are exploring the possibility of using such devices for the functional evaluation of gearboxes after renovation. The built system drives the researched object, coupled to a simulated load, while a special controller replacing the original one is responsible for controlling gearbox operation. In this way the state of the mechanical and hydraulic parts is evaluated. Analysis of the signal runs registered during measurements allows conclusions about the correctness of operation, while comparison with stock data verifies the technical state of the automatic gearbox.

  15. ClinData Express--a metadata driven clinical research data management system for secondary use of clinical data.

    Science.gov (United States)

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven, web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, standalone software for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in the m-designer. The web interface for data collection and the specific database for data storage are then automatically generated. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease data collections in Chinese and one form from dbGaP. The flexibility of the system gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress.
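    The metadata-driven idea, in which one field definition drives both the entry form and the storage schema, can be sketched as follows (the field names, types and the HTML/SQL mapping are hypothetical illustrations, not taken from ClinData Express):

```python
# Hypothetical field metadata, loosely modeled on the m-designer concept:
metadata = [
    {"name": "patient_id", "type": "text",   "required": True},
    {"name": "age",        "type": "number", "required": True},
    {"name": "diagnosis",  "type": "text",   "required": False},
]

def render_form(fields):
    """Generate a minimal HTML data-entry form from the field metadata."""
    rows = []
    for f in fields:
        req = " required" if f["required"] else ""
        rows.append('<label>{0}<input name="{0}" type="{1}"{2}></label>'
                    .format(f["name"], f["type"], req))
    return "<form>" + "".join(rows) + "</form>"

def create_table_sql(table, fields):
    """Generate the matching storage table (text -> TEXT, number -> REAL)."""
    sql_type = {"text": "TEXT", "number": "REAL"}
    cols = ", ".join("{0} {1}".format(f["name"], sql_type[f["type"]])
                     for f in fields)
    return "CREATE TABLE {0} ({1});".format(table, cols)

print(create_table_sql("clindata_form", metadata))
# CREATE TABLE clindata_form (patient_id TEXT, age REAL, diagnosis TEXT);
```

    Because both artifacts come from the same metadata, form and database stay consistent by construction, which is what makes this style of system attractive for reuse.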

  16. A formal structure for advanced automatic flight-control systems

    Science.gov (United States)

    Meyer, G.; Cicolani, L. S.

    1975-01-01

    Techniques were developed for the unified design of multimode, variable authority automatic flight-control systems for powered-lift STOL and VTOL aircraft. A structure for such systems is developed to deal with the strong nonlinearities inherent in this class of aircraft, to admit automatic coupling with advanced air traffic control, and to admit a variety of active control tasks. The aircraft being considered is the augmentor wing jet STOL research aircraft.

  17. Automatic respiration monitoring system; Shushin jotai no jido monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This system automatically recognizes the status of a sleeping person, including respiration stops, tossing in bed, and departure from the bed, by performing moving-image processing on camera images of the sleeping person and obtaining respiration waveforms from changes in the images of the chest. The system was developed jointly by the Medical Department of Ehime University and Toshiba Engineering Company under commission from the Silver Service Promotion Association as a two-year project. The system requires no operation by an operator, can monitor respiration during sleep in real time under completely non-restraining conditions, and can be utilized for early discovery of crib death and/or apneic syndrome in aged persons and infants. Its effectiveness was verified in field tests at a special facility for physically and mentally handicapped aged persons. The system was awarded the first grand prize for an image recognition system from the Japan Automatic Recognition System Association. (translated by NEDO)

  18. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  19. Longline Observer Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — LODS, the Hawaii Longline Observer Data System, is a complete suite of tools designed to collect, process, and manage quality fisheries data and information. Guided...

  20. Approximate Sensory Data Collection: A Survey

    Science.gov (United States)

    Cheng, Siyao; Cai, Zhipeng; Li, Jianzhong

    2017-01-01

    With the rapid development of the Internet of Things (IoT), wireless sensor networks (WSNs) and related techniques, the amount of sensory data manifests explosive growth. In some applications of IoT and WSNs, the size of sensory data has already exceeded several petabytes annually, which brings many challenges for data collection, a primary operation in IoT and WSNs. Since exact data collection is not affordable for many WSN and IoT systems due to limitations on bandwidth and energy, many approximate data collection algorithms have been proposed in the last decade. This survey reviews the state of the art of approximate data collection algorithms. We classify them into three categories: model-based ones, compressive sensing based ones, and query-driven ones. For each category of algorithms, the advantages and disadvantages are elaborated, some challenges and unsolved problems are pointed out, and the research prospects are forecasted. PMID:28287440

  1. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  2. Adapting histogram for automatic noise data removal in building interior point cloud data

    Science.gov (United States)

    Shukor, S. A. Abdul; Rushforth, E. J.

    2015-05-01

    3D point cloud data is now preferred by researchers to generate 3D models. These models can be used throughout a variety of applications, including 3D building interior models. The rise of Building Information Modeling (BIM) for Architectural, Engineering, Construction (AEC) applications has recently given 3D interior modelling more attention. To generate a 3D model representing the building interior, a laser scanner is used to collect the point cloud data. However, this data often comes with noise, due to several factors including the surrounding objects, lighting and specifications of the laser scanner. This paper highlights the usage of histograms to remove the noise data. Histograms, used in statistics and probability, are regularly employed in a number of applications like image processing, where a histogram can represent the total number of pixels in an image at each intensity level. Here, histograms represent the number of points recorded at range distance intervals in various projections. As unwanted noise data has a sparser cloud density compared to the required data and is usually situated at a notable distance from the required data, noise data will have lower frequencies in the histogram. By defining the acceptable range using the average frequency, points below this range can be removed. This research has shown that such histograms have the capability to remove unwanted data from 3D point cloud data representing building interiors automatically. This feature will aid the process of data preprocessing in producing an ideal 3D model from the point cloud data.
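    The histogram-based filtering described above can be sketched as follows (the bin width and the mean-frequency threshold follow the abstract's idea, but the exact parameters and one-dimensional range projection are assumptions for illustration):

```python
import numpy as np

def filter_by_range_histogram(ranges, bin_width=0.5):
    """Keep only points whose range bin occurs at least as often as the
    average bin frequency; sparse, distant returns fall below that level."""
    edges = np.arange(ranges.min(), ranges.max() + bin_width, bin_width)
    counts, edges = np.histogram(ranges, bins=edges)
    keep = counts >= counts.mean()                      # acceptable bins
    idx = np.clip(np.digitize(ranges, edges) - 1, 0, len(counts) - 1)
    return ranges[keep[idx]]

# Dense "wall" at ~3 m plus a few stray returns near 20 m:
rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(3.0, 0.2, 1000),
                      np.array([19.5, 20.1, 20.7])])
cleaned = filter_by_range_histogram(pts)
print(cleaned.max() < 10.0)  # stray far-away points are removed
```

    In practice this would be applied per projection, as the abstract notes, so that sparse clusters far from the building surfaces are dropped in each view.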

  3. A Wireless Framework for Lecturers' Attendance System with Automatic Vehicle Identification (AVI) Technology

    Directory of Open Access Journals (Sweden)

    Emammer Khamis Shafter

    2015-10-01

    Full Text Available Automatic Vehicle Identification (AVI) technology is a type of Radio Frequency Identification (RFID) method which can significantly improve the efficiency of a lecturers' attendance system. It provides the capability of automatic data capture for attendance records using a mobile device equipped in users' vehicles. The intent of this article is to propose a framework for an automatic lecturers' attendance system using AVI technology. The first objective of this work involves gathering the requirements for the automatic lecturers' attendance system and representing them using UML diagrams. The second objective is to put forward a framework that provides guidelines for developing the system. A prototype has also been created as a pilot project.

  4. SYSTEM FOR AUTOMATIC GENERALIZATION OF TOPOGRAPHIC MAPS

    Institute of Scientific and Technical Information of China (English)

    YAN Hao-wen; LI Zhi-lin; AI Ting-hua

    2006-01-01

    With the construction of spatial data infrastructure, automated topographic map generalization becomes an indispensable component in the community of cartography and geographic information science. This paper describes a topographic map generalization system recently developed by the authors. The system has the following characteristics: 1) taking advantage of three levels of automation, i.e. fully automated generalization, batch generalization,and interactive generalization, to undertake two types of processes, i.e. intelligent inference process and repetitive operation process in generalization; 2) making use of two kinds of sources for generalizing rule library, i.e. written specifications and cartographers' experiences, to define a six-element structure to describe the rules; 3) employing a hierarchical structure for map databases, logically and physically; 4) employing a grid indexing technique and undo/redo operation to improve database retrieval and object generalization efficiency. Two examples of topographic map generalization are given to demonstrate the system. It reveals that the system works well. In fact, this system has been used for a number of projects and it has been found that a great improvement in efficiency compared with traditional map generalization process can be achieved.

  5. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC-lathe to scan different feeds, speeds and cutting...... depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...

  6. Design and Implementation of Urban Planning and Mapping Results Data Automatic Generation System

    Institute of Scientific and Technical Information of China (English)

    吴凯华; 程相兵; 黄昀鹏; 谢武强

    2015-01-01

    With the development and popularization of computer technology, the informatization of surveying and mapping has become a trend. In light of the way urban planning and mapping results data are organized, and according to the actual needs of production units, the software was coded in C# on the Visual Studio 2013 platform with the SQL Server 2008 database management platform. Using .NET and Office components for secondary development of multiple versions of Microsoft Office Word, an automatic generation system for urban planning and mapping results data was designed and implemented. The software system can automatically generate urban planning surveying and mapping results data; its advancement and practicability were validated through practical application in many aspects of the engineering survey team's production.

  7. 20 CFR 653.109 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... (including field checks), MSFW complaints, and monitoring activities, as directed by ETA. These data shall be collected in accordance with applicable ETA Reports and Guidance Letters. (b) Collect data on the number of... assure accurate reporting of data; (d) Collect and submit to ETA as directed by ETA, data on...

  8. 40 CFR 51.365 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Data collection. 51.365 Section 51.365....365 Data collection. Accurate data collection is essential to the management, evaluation, and enforcement of an I/M program. The program shall gather test data on individual vehicles, as well as...

  9. Automatic Identification of Critical Data Items in a Database to Mitigate the Effects of Malicious Insiders

    Science.gov (United States)

    White, Jonathan; Panda, Brajendra

    A major concern for computer system security is the threat from malicious insiders who target and abuse critical data items in the system. In this paper, we propose a solution to enable automatic identification of critical data items in a database by way of data dependency relationships. This identification of critical data items is necessary because insider threats often target mission-critical data in order to accomplish malicious tasks. Unfortunately, currently available systems fail to address this problem in a comprehensive manner. It is difficult for non-experts to identify these critical data items because of their lack of familiarity and because data systems are constantly changing. By identifying the critical data items automatically, security engineers will be better prepared to protect what is critical to the mission of the organization and will also have the ability to focus their security efforts on these critical data items. We have developed an algorithm that scans the database logs and forms a directed graph showing which items influence a large number of other items and at what frequency this influence occurs. This graph is traversed to reveal the data items which have a large influence throughout the database system, using a novel metric-based formula. These items are critical to the system because if they are maliciously altered or stolen, the malicious alterations will spread throughout the system, delaying recovery and causing a much more malignant effect. As these items have significant influence, they are deemed to be critical and worthy of extra security measures. Our proposal is not intended to replace existing intrusion detection systems, but rather to complement current and future technologies. Our approach has not been attempted before, and our experimental results have shown that it is very effective in revealing critical data items automatically.
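    The log-scanning and graph-traversal idea can be sketched as follows (the log format and the plain reachability count are simplified stand-ins for the paper's frequency-weighted metric):

```python
from collections import defaultdict

# Hypothetical read/write log: (transaction, items read, items written).
# A write that follows reads makes each read item influence the written item.
log = [
    ("t1", ["a"], ["b"]),
    ("t2", ["b"], ["c"]),
    ("t3", ["b"], ["d"]),
    ("t4", ["a"], ["e"]),
]

def influence_counts(log):
    """Build the dependency graph and count, for each item, how many other
    items are reachable from it; high counts mark critical items."""
    graph = defaultdict(set)
    for _, reads, writes in log:
        for r in reads:
            graph[r].update(writes)

    def reachable(node, seen=None):
        seen = seen if seen is not None else set()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                reachable(nxt, seen)
        return seen

    return {n: len(reachable(n)) for n in graph}

print(influence_counts(log))  # 'a' reaches b, c, d, e -> most critical
```

    An item like `a` here would warrant extra protection, since corrupting it would propagate to everything downstream.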

  10. Automatic Generation of OWL Ontology from XML Data Source

    CERN Document Server

    Yahia, Nora; Ahmed, AbdelWahab

    2012-01-01

    The eXtensible Markup Language (XML) can be used as data exchange format in different domains. It allows different parties to exchange data by providing common understanding of the basic concepts in the domain. XML covers the syntactic level, but lacks support for reasoning. Ontology can provide a semantic representation of domain knowledge which supports efficient reasoning and expressive power. One of the most popular ontology languages is the Web Ontology Language (OWL). It can represent domain knowledge using classes, properties, axioms and instances for the use in a distributed environment such as the World Wide Web. This paper presents a new method for automatic generation of OWL ontology from XML data sources.
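    One common element-to-class mapping convention for such a conversion can be sketched as follows (the Turtle-like output and the specific mapping rules are illustrative assumptions, not the authors' method):

```python
import xml.etree.ElementTree as ET

xml_src = "<library><book isbn='123'><title>OWL</title></book></library>"

def xml_to_owl(xml_text):
    """Map XML structure to OWL axioms under a simple convention:
    element -> owl:Class, attribute -> owl:DatatypeProperty,
    nested child element -> owl:ObjectProperty."""
    axioms = set()

    def walk(elem):
        axioms.add(":{0} a owl:Class .".format(elem.tag))
        for attr in elem.attrib:
            axioms.add(":{0} a owl:DatatypeProperty ; rdfs:domain :{1} ."
                       .format(attr, elem.tag))
        for child in elem:
            axioms.add(":has_{0} a owl:ObjectProperty ; rdfs:domain :{1} ; "
                       "rdfs:range :{0} .".format(child.tag, elem.tag))
            walk(child)

    walk(ET.fromstring(xml_text))
    return sorted(axioms)

for axiom in xml_to_owl(xml_src):
    print(axiom)
```

    Real converters add instance (individual) generation and handle XML Schema types, but the class/property skeleton above captures the basic syntactic-to-semantic lift the abstract refers to.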

  11. Automatic Road Centerline Extraction from Imagery Using Road GPS Data

    Directory of Open Access Journals (Sweden)

    Chuqing Cao

    2014-09-01

    Full Text Available Road centerline extraction from imagery constitutes a key element in numerous geospatial applications, which has been addressed through a variety of approaches. However, most of the existing methods are not capable of dealing with challenges such as different road shapes, complex scenes, and variable resolutions. This paper presents a novel method for road centerline extraction from imagery in a fully automatic approach that addresses the aforementioned challenges by exploiting road GPS data. The proposed method combines road color feature with road GPS data to detect road centerline seed points. After global alignment of road GPS data, a novel road centerline extraction algorithm is developed to extract each individual road centerline in local regions. Through road connection, road centerline network is generated as the final output. Extensive experiments demonstrate that our proposed method can rapidly and accurately extract road centerline from remotely sensed imagery.

  12. Building Research Capacity: Results of a Feasibility Study Using a Novel mHealth Epidemiological Data Collection System Within a Gestational Diabetes Population.

    Science.gov (United States)

    McLean, Allen; Osgood, Nathaniel; Newstead-Angel, Jill; Stanley, Kevin; Knowles, Dylan; van der Kamp, William; Qian, Weicheng; Dyck, Roland

    2017-01-01

    Public health researchers have traditionally relied on individual self-reporting when collecting much epidemiological surveillance data. Data acquisition can be costly, difficult to acquire, and the data often notoriously unreliable. An interesting option for the collection of individual health (or indicators of individual health) data is the personal smartphone. Smartphones are ubiquitous, and the required infrastructure is well-developed across Canada, including many remote areas. Researchers and health professionals are asking themselves how they might exploit increasing smartphone uptake for the purposes of data collection, hopefully leading to improved individual and public health. A novel smartphone-based epidemiological data collection and analysis system has been developed by faculty and students from the CEPHIL (Computational Epidemiology and Public Health Informatics) Lab in the Department of Computer Science at the University of Saskatchewan. A pilot feasibility study was then designed to examine possible relationships between smartphone sensor data, surveys and individual clinical data within a population of pregnant women. The study focused on the development of Gestational Diabetes (GDM), a transient condition during pregnancy, but with serious potential post-birth complications for both mother and child. The researchers questioned whether real-time smartphone data could improve the clinical management and outcomes of women at risk for developing GDM, enabling earlier treatment. The initial results from this small study did not show improved prediction of GDM, but did demonstrate that real-time individual health and sensor data may be readily collected and analyzed efficiently while maintaining confidentiality. 
Because the original version of the data collection software could only run on Android phones, study participants were often required to carry two phones; as a result the study phone was often left behind, and therefore data were missed.

  13. Automatized system of precipitation monitoring and recording with use of radiolocation for urban areas

    Science.gov (United States)

    Voronov, Nikolai; Dikinis, Alexandr; Ivanov, Maxim

    2016-04-01

    One of the most important lines of work in increasing the efficiency of urban water disposal systems is the automation of precipitation recording, with the application of new technological tools for measuring and forecasting precipitation. The developed Automatized Information System for Atmospheric Precipitation Recording (AIS «Osadki») includes a network of automatic precipitation stations based on the OTT Pluvio2 precipitation gauge; a Doppler meteorological radar; software for collecting information about precipitation and controlling the work of the precipitation station network; and a specialized database that provides direct access to meteorological information and statistical estimation of precipitation distribution for urban conditions. The main advantage of the system is the use of a Doppler meteorological radar which, in combination with station measurements collected automatically at 5-minute intervals, allows estimation of both the distribution of precipitation over the urban territory and its intensity. As a result, it drastically increases the speed of processing hydrometeorological information and the efficiency of its use for the needs of urban services. This article was prepared within the framework of the Federal Targeted Programme for Research and Development in Priority Areas of Development of the Russian Scientific and Technological Complex for 2014-2020 (agreement № 14.574.21.0088).

  14. ATLAS Offline Data Quality System Upgrade

    CERN Document Server

    Farrell, Steve

    2012-01-01

    The ATLAS data quality software infrastructure provides tools for prompt investigation of and feedback on collected data, and for propagation of these results to analysis users. Both manual and automatic inputs are used in this system. In 2011, we upgraded our framework to record all issues affecting the quality of the data in a manner which allows users to extract as much of the data for their particular analyses as possible. Through improved recording of issues, we are able to reassess the impact of the quality of the data on different physics measurements and adapt accordingly. We have gained significant experience with collision data operations and analysis; we have used this experience to improve the data quality system, particularly in the areas of scaling and user interface. This document describes the experience gained in assessing and recording the data quality of ATLAS and the subsequent benefits to analysis users.

  15. Remanufacturing system based on totally automatic MIG surfacing via robot

    Institute of Scientific and Technical Information of China (English)

    ZHU Sheng; GUO Ying-chun; YANG Pei

    2005-01-01

    The remanufacturing system is a green systems-engineering project which conforms to the national sustainable development strategy. It must offer high adaptability to a variety of waste machined parts, a short product cycle, low machining cost and high product quality. Each step of the remanufacturing system, from the beginning of scanning to the accomplishment of welding, was investigated. Aiming at building a remanufacturing system based on fully automatic MIG surfacing via robot, advanced information technology, remanufacturing technology and management were applied, controlling pretreatment and optimization to minimize remanufacturing time and realize remanufacturing of a variety of end products. The steps mainly include: 1) using the visual sensor installed at the end of the robot to rapidly acquire the outline data of the machined part, and preprocessing the data; 2) rebuilding the curved surface based on the outline data and the integrated CAD object model; 3) building the remanufacturing model based on the CAD object model and planning the remanufacturing process; and 4) accomplishing the remanufacture of the machined part by MIG surfacing.

  16. An automatic redesign approach for restructurable control systems

    Science.gov (United States)

    Looze, D. P.; Weiss, J. L.; Eterno, J. S.; Barrett, N. M.

    1985-01-01

    This paper presents an approach to the automatic redesign of flight control systems for aircraft that have suffered one or more control element failures. The procedure is based on Linear Quadratic design techniques, and produces a control system that maximizes a measure of feedback system performance subject to a bandwidth constraint.

  17. Automatic feed system for ultrasonic machining

    Science.gov (United States)

    Calkins, Noel C.

    1994-01-01

    Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.

  18. The value of data collection within a palliative care program.

    Science.gov (United States)

    Kamal, Arif H; Currow, David C; Ritchie, Christine; Bull, Janet; Wheeler, Jane L; Abernethy, Amy P

    2011-08-01

    Collecting reliable and valid data is an increasing expectation within palliative care. Data remain the crux for demonstrating value and quality of care, which are the critical steps to program sustainability. Parallel goals of conducting research and performing quality assessment and improvement can also ensure program growth, financial health, and viability in an increasingly competitive environment. Mounting expectations by patients, hospitals, and payers and inevitable pay-for-performance paradigms have transitioned data collection procedures from novel projects to expected standard operation within usual palliative care delivery. We present types of data to collect, published guides for data collection, and how data can inform quality, value, and research within a palliative care organization. Our experiences with the Quality Data Collection Tool (QDACT) in the Carolinas Palliative Care Consortium to collect data on quality have led to valuable lessons learned in creating a data collection system. Suggested steps in forming data-sharing collaborations and building data collection procedures are shared.

  19. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    Directory of Open Access Journals (Sweden)

    Wildeman Maarten A

    2011-08-01

    Full Text Available Background Data collection by Electronic Medical Record (EMR) systems has been proven to be helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands a web-based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a Clinical Trial Data Management service (CTDMS) composed of electronic Case Report Forms (eCRF) can result in effective data collection and treatment monitoring. Methods Data items entered were checked for inconsistencies automatically when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both primary and secondary items, over the first five months of the trial. Results In the first five months 51 patients were entered. The primary data error rate was 1.6%, whilst that for secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5% respectively. Conclusion The presented analysis shows that, five months after the introduction of the CTDMS, the primary and secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of the data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols.

  20. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    Science.gov (United States)

    2011-01-01

    Background Data collection by Electronic Medical Record (EMR) systems has been proven to be helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands a web-based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a Clinical Trial Data Management service (CTDMS) composed of electronic Case Report Forms (eCRF) can result in effective data collection and treatment monitoring. Methods Data items entered were checked for inconsistencies automatically when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both primary and secondary items, over the first five months of the trial. Results In the first five months 51 patients were entered. The primary data error rate was 1.6%, whilst that for secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5% respectively. Conclusion The presented analysis shows that, five months after the introduction of the CTDMS, the primary and secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of the data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols. PMID:21824421
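
The error-rate bookkeeping in the two records above is simple arithmetic; a minimal sketch using the published percentages (the underlying item counts are not given in the abstract and are assumptions here):

```python
def error_rate(num_errors, num_items):
    """Fraction of submitted data items flagged as inconsistent."""
    if num_items <= 0:
        raise ValueError("no items submitted")
    return num_errors / num_items

def acceptable(rate, threshold):
    """True when the observed error rate is at or below the level
    deemed acceptable for analysis."""
    return rate <= threshold

# The trial's figures: primary 1.6% vs a 1% target, secondary 2.7% vs 2.5%.
primary_ok = acceptable(0.016, 0.010)
secondary_ok = acceptable(0.027, 0.025)
```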

  1. Innovative Data Collection Strategies in Qualitative Research

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Collins, Kathleen M. T.

    2010-01-01

    This article provides an innovative meta-framework comprising strategies designed to guide qualitative data collection in the 21st century. We present a meta-framework comprising strategies for collecting data from interviews, focus groups, observations, and documents/material culture. We present a template for collecting nonverbal data during…

  2. The design and application of a power outage managing system based on a data automatic matching model

    Institute of Scientific and Technical Information of China (English)

    熊淦辉; 黎沛坚; 徐俊林; 刘斯烟; 刘柳

    2015-01-01

    This paper designs a power outage managing system. By integrating the metering automation system, the marketing management information system, the production system, and the marketing and distribution data center, the system automatically collects the outage times of all dedicated-transformer customers and public-transformer districts. Adopting an enhanced analysis method based on international standards and an adaptive matching model for outage-time statistics, it automates the compilation of customer outage-time statistics and improves their timeliness and accuracy. A trial operation and application at the Dongguan power supply bureau show that the system can raise the accuracy of customer outage-time statistics by more than 50%.

  3. Channel Access Algorithm Design for Automatic Identification System

    Institute of Scientific and Technical Information of China (English)

    Oh Sang-heon; Kim Seung-pum; Hwang Dong-hwan; Park Chan-sik; Lee Sang-jeong

    2003-01-01

    The Automatic Identification System (AIS) is a maritime equipment to allow an efficient exchange of the navigational data between ships and between ships and shore stations. It utilizes a channel access algorithm which can quickly resolve conflicts without any intervention from control stations. In this paper, a design of channel access algorithm for the AIS is presented. The input/output relationship of each access algorithm module is defined by drawing the state transition diagram, dataflow diagram and flowchart based on the technical standard, ITU-R M.1371. In order to verify the designed channel access algorithm, the simulator was developed using the C/C++ programming language. The results show that the proposed channel access algorithm can properly allocate transmission slots and meet the operational performance requirements specified by the technical standard.
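
As a hedged sketch of the slot-allocation idea, the snippet below performs a simplified, SOTDMA-flavoured candidate selection: pick a free slot from the selection interval around a nominal slot. The actual ITU-R M.1371 rules (slot reservation, reuse of the weakest station, timeouts) are considerably more involved.

```python
import random

FRAME_SLOTS = 2250  # one AIS frame = 2250 slots per minute

def select_slot(nominal_slot, interval_half_width, occupied, rng=random):
    """Pick a free transmission slot from the selection interval around
    the nominal slot (simplified SOTDMA-style candidate selection)."""
    lo = nominal_slot - interval_half_width
    hi = nominal_slot + interval_half_width
    candidates = [s % FRAME_SLOTS for s in range(lo, hi + 1)]
    free = [s for s in candidates if s not in occupied]
    if not free:
        return None  # all candidates in use; real AIS would reuse a slot
    return rng.choice(free)

slot = select_slot(nominal_slot=100, interval_half_width=5,
                   occupied={98, 99, 100})
```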

  4. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Full Text Available Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) A user describes requirements of target embedded network systems by logical property-based constraints using SENS. (2) Given SENS specifications, test cases are automatically generated using a SAT-based solver. Filtering mechanisms to select efficient test cases are also available in our tool. (3) In addition, given a testing goal by the user, test sequences are automatically extracted from exhaustive test cases. We have implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in the design and test generation of real embedded air-conditioning network systems.

  5. Truck Roll Stability Data Collection and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, SS

    2001-07-02

    The principal objective of this project was to collect and analyze vehicle and highway data that are relevant to the problem of truck rollover crashes, and in particular to the subset of rollover crashes that are caused by the driver error of entering a curve at a speed too great to allow safe completion of the turn. The data are of two sorts--vehicle dynamic performance data, and highway geometry data as revealed by vehicle behavior in normal driving. Vehicle dynamic performance data are relevant because the roll stability of a tractor trailer depends both on inherent physical characteristics of the vehicle and on the weight and distribution of the particular cargo that is being carried. Highway geometric data are relevant because the set of crashes of primary interest to this study are caused by lateral acceleration demand in a curve that exceeds the instantaneous roll stability of the vehicle. An analysis of data quality requires an evaluation of the equipment used to collect the data because the reliability and accuracy of both the equipment and the data could profoundly affect the safety of the driver and other highway users. Therefore, a concomitant objective was an evaluation of the performance of the set of data-collection equipment on the truck and trailer. The objective concerning evaluation of the equipment was accomplished, but the results were not entirely positive. Significant engineering apparently remains to be done before a reliable system can be fielded. Problems were identified with the trailer to tractor fiber optic connector used for this test. In an over-the-road environment, the communication between the trailer instrumentation and the tractor must be dependable. In addition, the computer in the truck must be able to withstand the rigors of the road. The major objective--data collection and analysis--was also accomplished. Using data collected by instruments on the truck, a "bad-curve" database can be generated. Using

  6. Research in Adaptronic Automatic Control System and Biosensor System Modelling

    Directory of Open Access Journals (Sweden)

    Skopis Vladimir

    2015-07-01

    Full Text Available This paper describes the research on adaptronic systems made by the author and proposes the use of biosensors that can later be inserted into the adaptronic systems. Adaptronic systems are based, on the one hand, on the adaptronic approach, whereby the system is designed not to always meet the worst condition but to change its structure according to the external conditions. On the other hand, they are an extension of common automatic control and adaptive systems. So, in the introduction, first the adaptronic approach and the term biosensor are explained. Adaptive systems, upon which adaptronic ones are based, are also mentioned. Then the construction of the biosensor is described, and some information is given about the classification of biosensors and their main groups. It is also suggested to use lichen indicators in industry to control the concentration of chemical substances in the air. After that, mathematical models and computer experiments for adaptronic system and biosensor analysis are given.

  7. Temporally rendered automatic cloud extraction (TRACE) system

    Science.gov (United States)

    Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.

    1999-10-01

    Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with manual method is included in this paper.
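
TRACE's dynamic background subtraction is not specified in detail in the record above; the sketch below shows one common flavour of the technique (an exponential running-average background model plus a deviation threshold), with all parameter values hypothetical.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Exponential running-average background model; alpha sets how
    quickly the background adapts to slow scene changes."""
    return (1.0 - alpha) * background + alpha * frame

def cloud_mask(background, frame, threshold=10.0):
    """Pixels deviating from the background by more than the threshold
    are labelled as part of the smoke/obscurant cloud."""
    return np.abs(frame.astype(float) - background) > threshold

background = np.zeros((4, 4))
frame = np.zeros((4, 4))
frame[1:3, 1:3] = 50.0          # synthetic cloud in the centre
mask = cloud_mask(background, frame)
background = update_background(background, frame)
```

Cloud extent (area, centroid, travel between frames) can then be computed from the boolean mask.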

  8. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  9. Water quality, meteorological, and nutrient data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) from January 1, 1995 to August 1, 2011 (NCEI Accession 0052765)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 26...

  10. EBT data acquisition and analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Burris, R.D.; Greenwood, D.E.; Stanton, J.S.; Geoffroy, K.A.

    1980-10-01

    This document describes the design and implementation of a data acquisition and analysis system for the EBT fusion experiment. The system includes data acquisition on five computers, automatic transmission of that data to a large, central data base, and a powerful data retrieval system. The system is flexible and easy to use, and it provides a fully documented record of the experiments.

  11. Neuro-fuzzy system modeling based on automatic fuzzy clustering

    Institute of Scientific and Technical Information of China (English)

    Yuangang TANG; Fuchun SUN; Zengqi SUN

    2005-01-01

    A neuro-fuzzy system model based on automatic fuzzy clustering is proposed. A hybrid model identification algorithm is also developed to decide the model structure and model parameters. The algorithm mainly includes three parts: 1) Automatic fuzzy C-means (AFCM), which is applied to generate fuzzy rules automatically and then fix the size of the neuro-fuzzy network, by which the complexity of system design is reduced greatly at the price of some fitting capability; 2) Recursive least squares estimation (RLSE), which is used to update the parameters of the Takagi-Sugeno model employed to describe the behavior of the system; 3) A gradient descent algorithm, proposed for the fuzzy values according to the back-propagation algorithm of neural networks. Finally, modeling the dynamical equation of a two-link manipulator with the proposed approach is illustrated to validate the feasibility of the method.
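
The fuzzy C-means step underlying AFCM alternates membership and centre updates; a plain (non-automatic) numpy sketch, with the number of clusters fixed by hand rather than determined automatically as in the paper:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=50, seed=0):
    """Plain fuzzy C-means: alternate centre and membership updates.
    m > 1 is the fuzzifier; u[k, i] is the membership of sample i in
    cluster k, with memberships summing to 1 per sample."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(X)))
    u /= u.sum(axis=0)
    for _ in range(iters):
        um = u ** m
        centers = (um @ X) / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)               # guard against zero distance
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)
    return centers, u

# Two well-separated 1-D clusters.
X = np.array([[0.0], [0.1], [5.0], [5.1]])
centers, u = fuzzy_c_means(X, c=2)
```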

  12. Science data collection with polarimetric SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Woelders, Kim; Madsen, Søren Nørvang

    1996-01-01

    Discusses examples of the use of polarimetric SAR in a number of Earth science studies. The studies are presently being conducted by the Danish Center for Remote Sensing. A few studies of the European Space Agency's EMAC programme are also discussed. The Earth science objectives are presented, and the potential of polarimetric SAR is discussed and illustrated with data collected by the Danish airborne EMISAR system during a number of experiments in 1994 and 1995. The presentation will include samples of data acquired for the different studies

  13. Evaluation of the SYSTRAN Automatic Translation System. Report No. 5.

    Science.gov (United States)

    Chaumier, Jacques; And Others

    The Commission of the European Communities has acquired an automatic translation system (SYSTRAN), which has been put into operation on an experimental basis. The system covers translation of English into French and comprises a dictionary for food science and technology containing 25,000 words or inflections and 4,500 expressions. This report…

  14. Automatic Dialogue Scoring for a Second Language Learning System

    Science.gov (United States)

    Huang, Jin-Xia; Lee, Kyung-Soon; Kwon, Oh-Woog; Kim, Young-Kil

    2016-01-01

    This paper presents an automatic dialogue scoring approach for a Dialogue-Based Computer-Assisted Language Learning (DB-CALL) system, which helps users learn language via interactive conversations. The system produces overall feedback according to dialogue scoring to help the learner know which parts should be more focused on. The scoring measures…

  15. Information Collection System of Crop Growth Environment Based on the Internet of Things

    Institute of Scientific and Technical Information of China (English)

    Hua; YU; Guangyu; ZHANG; Ningbo; LU

    2013-01-01

    Based on Internet of Things technology, and addressing the problems of large data volumes and difficult real-time transmission in the acquisition of crop growth environment data, this paper designs an information collection system for the crop growth environment. A range-free localization mechanism that determines node positions and the GEAR routing mechanism provide solutions to the problems of node localization, routing protocol application and so on. The system can realize accurate and automatic real-time collection, aggregation and transmission of crop growth environment information, and can support the automation of agricultural production to the maximum extent.

  16. Automatic Identification of Antibodies in the Protein Data Bank

    Institute of Scientific and Technical Information of China (English)

    LI Xun; WANG Renxiao

    2009-01-01

    An automatic method has been developed for identifying antibody entries in the protein data bank (PDB). Our method, called KIAb (Keyword-based Identification of Antibodies), parses PDB-format files to search for particular keywords relevant to antibodies, and makes judgment accordingly. Our method identified 780 entries as antibodies on the entire PDB. Among them, 767 entries were confirmed by manual inspection, indicating a high success rate of 98.3%. Our method recovered basically all of the entries compiled in the Summary of Antibody Crystal Structures (SACS) database. It also identified a number of entries missed by SACS. Our method thus provides a more complete mining of antibody entries in PDB with a very low false positive rate.
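
In the spirit of the KIAb approach, a crude keyword screen over PDB-format header text can be sketched as follows; the keyword list here is hypothetical and much smaller than whatever vocabulary KIAb actually uses.

```python
# Hypothetical keyword list; the actual KIAb vocabulary is not given here.
ANTIBODY_KEYWORDS = ("antibody", "immunoglobulin", "fab fragment", "igg")

def looks_like_antibody(pdb_header_text):
    """Crude keyword-based screen of the TITLE/COMPND text of a
    PDB-format file."""
    text = pdb_header_text.lower()
    return any(kw in text for kw in ANTIBODY_KEYWORDS)

hit = looks_like_antibody("TITLE  CRYSTAL STRUCTURE OF AN IGG FAB FRAGMENT")
miss = looks_like_antibody("TITLE  STRUCTURE OF HEN EGG-WHITE LYSOZYME")
```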

  17. Evolutionary synthesis of automatic classification on astroinformatic big data

    Science.gov (United States)

    Kojecky, Lumir; Zelinka, Ivan; Saloun, Petr

    2016-06-01

    This article describes initial experiments using a new approach to the automatic identification of Be and B[e] star spectra in large archives. With the enormous amount of such data it is no longer feasible to analyze them using classical approaches. We introduce an evolutionary synthesis of the classification by means of analytic programming, one of the methods of symbolic regression. By this method, we synthesize the mathematical formulas that best approximate chosen samples of the stellar spectra. The selected category is then the one whose formula has the lowest difference from the particular spectrum. The results show that classification of stellar spectra by means of analytic programming is able to identify different shapes of the spectra.
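
The selection rule, choosing the category whose synthesized formula differs least from the observed spectrum, can be sketched as below; the formulas here are toy stand-ins, not outputs of analytic programming.

```python
import numpy as np

def classify_spectrum(wavelengths, flux, category_formulas):
    """Pick the category whose formula is closest (in mean squared
    difference) to the observed spectrum."""
    best, best_err = None, float("inf")
    for name, formula in category_formulas.items():
        err = np.mean((formula(wavelengths) - flux) ** 2)
        if err < best_err:
            best, best_err = name, err
    return best

# Toy stand-ins for per-category formulas (wavelengths in nm).
formulas = {
    "flat":     lambda w: np.ones_like(w),
    "emission": lambda w: 1.0 + np.exp(-(w - 656.3) ** 2 / 2.0),
}
w = np.linspace(650.0, 660.0, 101)
observed = 1.0 + np.exp(-(w - 656.3) ** 2 / 2.0)  # emission-like profile
label = classify_spectrum(w, observed, formulas)
```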

  18. Research on Geological Survey Data Management and Automatic Mapping Technology

    Directory of Open Access Journals (Sweden)

    Dong Huang

    2017-01-01

    Full Text Available The data management of a large geological survey is not an easy task. To efficiently store and manage the huge datasets, a database of geological information has been created on the basis of Microsoft Access. Using this database, the large volume of geological information can be stored and managed easily and systematically. Geological drawings such as borehole diagrams, rose diagrams of joint trends, and joint isointensity diagrams are traditionally drawn by hand, which is inefficient and makes modification difficult. Therefore, to solve those problems, an automatic mapping method and associated interfaces have been developed using VS2010 and the geological information database; these developments are presented in this article. This article describes the theoretical basis of the new method in detail and provides a case study of practical engineering to demonstrate its application.
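
As an illustration of one of the automated drawings, the binning behind a rose diagram of joint trends can be sketched as follows; the bin width and the convention of folding azimuths into [0, 180) are assumptions, not details from the paper.

```python
import numpy as np

def rose_bins(azimuths_deg, bin_width=10):
    """Count joint-trend azimuths into sectors for a rose diagram.
    Azimuths are folded into [0, 180) since a trend has no sense."""
    folded = np.asarray(azimuths_deg) % 180
    edges = np.arange(0, 180 + bin_width, bin_width)
    counts, _ = np.histogram(folded, bins=edges)
    return edges, counts

# 185 deg folds to 5 deg, so the first sector collects three trends.
edges, counts = rose_bins([5, 8, 95, 100, 172, 185])
```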

  19. The validity of a monitoring system based on routinely collected dairy cattle health data relative to a standardized herd check.

    Science.gov (United States)

    Brouwer, H; Stegeman, J A; Straatsma, J W; Hooijer, G A; Schaik, G van

    2015-11-01

    Dairy cattle health is often assessed during farm visits. However, farm visits are time consuming and cattle health is assessed at only one point in time. Moreover, farm visits are poorly comparable and/or repeatable when inspection is carried out by many different professionals. Many countries register cattle health parameters such as bulk milk somatic cell count (BMSCC) and mortality in central databases. A great advantage of such routinely available data is that they are uniformly gathered and registered throughout time. This makes comparison between dairy cattle herds possible and could result in opportunities to develop reliable tools for assessing cattle health based on routinely available data. In 2005, a monitoring system for the assessment of cattle health in Dutch dairy herds based on routinely available data was developed. This system had to serve as an alternative for the compulsory quarterly farm visits, which were implemented in 2002. However, before implementation of the alternative system for dairy cows, the validity of the data-based monitoring system and the compulsory quarterly visits relative to the real health status of the herd should be known. The aim of this study was to assess the validity of the data-based monitoring system and the compulsory quarterly visits relative to a standardized herd check for detecting dairy herds with health problems. The results showed that routinely available data can be used to develop an effective screening instrument for detecting herds with poor cattle health. Routinely available data such as cattle mortality and BMSCC that were used in this study had a significant association with animal-based measurements such as the general health impression of the dairy cows (including e.g. rumen fill and body condition). Our study supports the view that cattle health parameters based on routinely available data can serve as a tool for detecting herds with a poor cattle health status which can reduce the number of

  20. Collecting data in real time with postcards

    DEFF Research Database (Denmark)

    2013-01-01

    The success of information technology (IT) in transforming healthcare is often limited by the lack of clear understanding of the context in which the technology is used. Various methods have been proposed to understand the healthcare context better when designing and implementing Health Information Systems. These methods often involve cross-sectional, retrospective data collection. This paper describes the postcard method for prospective real-time data collection, in both paper and electronic formats. This paper then describes the results obtained using postcard techniques in Denmark and Australia. The benefits of this technique are illustrated. There are limitations in using postcard techniques and this paper provides a detailed discussion of these limitations. Postcard techniques provide unique advantages in understanding the real-time healthcare context and it is an important technique...

  1. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power
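
Micro-Analyzer's exact normalization pipeline is not described in the record above; as an illustration of one standard normalization step used in RMA-style microarray preprocessing, here is a numpy sketch of quantile normalization (ties handled naively).

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization: give every array (column) the same
    empirical distribution by replacing each value with the mean of
    the values at its rank across all arrays. Ties handled naively."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    mean_of_sorted = np.sort(X, axis=0).mean(axis=1)
    return mean_of_sorted[ranks]

# Two toy "arrays" (columns) with three probes (rows).
X = np.array([[5.0, 4.0],
              [2.0, 1.0],
              [3.0, 6.0]])
Xn = quantile_normalize(X)  # both columns now share values {1.5, 3.5, 5.5}
```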

  2. A Study of Applications of Multiagent System Specifications and the Key Techniques in Automatic Abstracts System

    Institute of Scientific and Technical Information of China (English)

    HU Shun-geng; ZHONG Yi-xin

    2001-01-01

    In this thesis, multiagent system specifications, multiagent system architectures, agent communication languages and agent communication protocols, and automatic abstracting based on multiagent technologies are studied. Some problems concerning the design and realization of automatic abstracting systems based on multiagent technologies are studied as well. Chapter 1 shows the significance and objectives of the thesis, summarizes its main contents, and presents its innovations. Some basic concepts of agents and multiagent systems are studied in Chapter 2. The definitions of agents and multiagent systems are given, and the theory, technologies and applications of multiagent systems are summarized. Furthermore, some important research trends in multiagent systems are set forward. Multiagent system specifications are studied in Chapter 3. MAS/KIB, a multiagent system specification, is built using mental states such as K (Know), B (Belief) and I (Intention); its syntax and semantics are discussed, axioms and inference rules are given, and some properties are researched. We also compare MAS/KIB with other existing specifications. MAS/KIB has the following characteristics: (1) each agent has its own world outlook; (2) there is no global data in the system; (3) processes of state changes are used as indexes to systems; (4) it has the characteristics of not only temporal logic but also dynamic logic; and (5) interactive actions are included. The architectures of multiagent systems are studied in Chapter 4. First, we review some typical multiagent system architectures: the agent network architecture, the agent federated architecture, the agent blackboard architecture, and the Foundation for Intelligent Physical Agents (FIPA) architecture. For the first time, we set forward and study the layering and partitioning models of the architectures of multiagent systems, the organizing architecture models, and the interoperability architecture model of multiagent systems. Chapter 5 studies agent communication lan

  3. Automatic Extraction of Mangrove Vegetation from Optical Satellite Data

    Science.gov (United States)

    Agrawal, Mayank; Sushma Reddy, Devireddy; Prasad, Ram Chandra

    2016-06-01

    Mangroves, the intertidal halophytic vegetation, form one of the most significant and diverse ecosystems in the world. They protect the coast from sea erosion and other natural disasters like tsunamis and cyclones. In view of their increased destruction and degradation in the current scenario, mapping of this vegetation is a priority. Globally, researchers have mapped mangrove vegetation using visual interpretation, digital classification approaches, or a combination of both (hybrid approaches) with varied spatial and spectral data sets. In the recent past, techniques have been developed to extract this coastal vegetation automatically using varied algorithms. In the current study we delineate mangrove vegetation using LISS III and Landsat 8 data sets for selected locations of the Andaman and Nicobar islands. Towards this we used a segmentation method, which characterizes the mangrove vegetation based on its tone and texture, and a pixel-based classification method, in which mangroves are identified based on their pixel values. The results obtained from both approaches were validated using maps available for the selected region, with good delineation accuracy. The main focus of this paper is the simplicity of the methods and the availability of the data on which they are applied, as these data (Landsat) are readily available for many regions. Our methods are very flexible and can be applied to any region.
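
The record above does not specify the pixel-based rule used; as an assumption-laden illustration of the pixel-based route, a simple NDVI threshold screen over red/near-infrared reflectance bands can be sketched as follows (the threshold value is hypothetical and site-specific).

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red/NIR bands."""
    return (nir - red) / np.fmax(nir + red, 1e-9)

def vegetation_mask(red, nir, threshold=0.3):
    """Simple pixel-based screen: NDVI above a (site-specific)
    threshold is treated as candidate vegetation."""
    return ndvi(red, nir) > threshold

# Toy 2x2 reflectance bands: left column vegetated, right column not.
red = np.array([[0.1, 0.4], [0.1, 0.4]])
nir = np.array([[0.6, 0.5], [0.6, 0.5]])
mask = vegetation_mask(red, nir)
```

A real mangrove map would further constrain candidates to the intertidal zone, e.g. with elevation or coastline data.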

  4. Automatic array alignment in data-parallel programs

    Science.gov (United States)

    Chatterjee, Siddhartha; Gilbert, John R.; Schreiber, Robert; Teng, Shang-Hua

    1993-01-01

    FORTRAN 90 and other data-parallel languages express parallelism in the form of operations on data aggregates such as arrays. Misalignment of the operands of an array operation can reduce program performance on a distributed-memory parallel machine by requiring nonlocal data accesses. Determining array alignments that reduce communication is therefore a key issue in compiling such languages. We present a framework for the automatic determination of array alignments in array-based, data-parallel languages. Our language model handles array sectioning, reductions, spreads, transpositions, and masked operations. We decompose alignment functions into three constituents: axis, stride, and offset. For each of these subproblems, we show how to solve the alignment problem for a basic block of code, possibly containing common subexpressions. Alignments are generated for all array objects in the code, both named program variables and intermediate results. We assign computation to processors by virtue of explicit alignment of all temporaries; the resulting work assignment is in general better than that provided by the 'owner-computes' rule. Finally, we present some ideas for dealing with control flow, replication, and dynamic alignments that depend on loop induction variables.
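    The axis/stride/offset decomposition can be made concrete with a toy model: an alignment maps array index i on some template axis to cell stride·i + offset, and misaligned operand elements are those whose mapped cells differ (hypothetical Python sketch; the names and the communication measure are illustrative, not the paper's framework):

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Alignment:
        """Maps array index i (along one array axis) to template cell (axis, stride*i + offset)."""
        axis: int
        stride: int
        offset: int

        def cell(self, i: int):
            return (self.axis, self.stride * i + self.offset)

    def misaligned_elements(a: Alignment, b: Alignment, n: int) -> int:
        """Count indices 0..n-1 where two operands land in different template cells,
        i.e. elements that would need nonlocal access without realignment."""
        return sum(a.cell(i) != b.cell(i) for i in range(n))

    x = Alignment(axis=0, stride=1, offset=0)
    y = Alignment(axis=0, stride=1, offset=2)   # same axis and stride, shifted by 2
    print(misaligned_elements(x, y, 10))        # 10: every element is misaligned
    print(misaligned_elements(x, x, 10))        # 0: perfectly aligned
    ```

    An alignment-determination pass would choose axis, stride, and offset for each array so that counts like this, weighted by communication cost, are minimized across the basic block.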

  5. Reliability of the TJ-II power supply system: Collection and analysis of the operational experience data

    Energy Technology Data Exchange (ETDEWEB)

    Izquierdo, Jesus [Fusion Energy Engineering Laboratory, Seccio d' Enginyeria Nuclear, Universitat Politecnica de Catalunya, Avda. Diagonal 647, 08028 Barcelona (Spain)], E-mail: jesus.izquierdo@upc.edu; Dies, Javier; Garcia, Jeronimo; Tapia, Carlos [Fusion Energy Engineering Laboratory, Seccio d' Enginyeria Nuclear, Universitat Politecnica de Catalunya, Avda. Diagonal 647, 08028 Barcelona (Spain); Alonso, Javier; Ascasibar, Enrique; Medrano, Mercedes; Mendez, Purificacion; Rodriguez, Lina [Asociacion EURATOM-CIEMAT para la Fusion, Avda. Complutense 22, Madrid (Spain)

    2007-10-15

    During a TJ-II pulse, the provision of magnetic fields requires a total amount of power exceeding 80 MVA. Such power is supplied by a 132 MVA flywheel generator (15 kV output voltage, 80-100 Hz output frequency) and the related motor, transformers, breakers, rectifiers, regulators, protections, busbars, connections, etc. Failure data of these main components have been collected, identified and processed, including information on failure modes and, where possible, causes of the failures. Main statistical values for failure rates for the period from May 1998 to December 2004 have been calculated and are ready to be compared with those of the International Fusion Component Failure Rate Database (FCFR-DB)

  6. [Increasing effectiveness of the use of laboratory data in the therapeutic-diagnostic process through automatization].

    Science.gov (United States)

    Makarovskiĭ, V V; Shcherbatkin, D D; Nazarov, G D

    1989-01-01

    Introduction of complex computer-aided mechanization and automatization into the laboratory process, and their integration with other automated hospital information systems, significantly raises the efficacy of laboratory data application in treatment and diagnosis, thus reducing work losses of the medical staff. The structure of biochemical research for clinical therapeutic and surgical departments is presented along with the main biochemical diagnostic programmes for some diseases.

  7. Analysis on the Influence of Automatic Station Temperature Data on the Sequence Continuity of Historical Meteorological Data

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data which were measured by the automatic meteorological station and the corresponding artificial observation data during January-December in 2001, the monthly average, maximum and minimum temperatures in the automatic station were compared with the corresponding artificial observation temperature data in the parallel observation peri...

  8. Midterm Report on Data Collection

    DEFF Research Database (Denmark)

    Gelsing, Lars; Linde, Lisbeth Tved

    In the MERIPA project this report concerns data availability in order to make future cluster and network analyses in the MERIPA regions. At the same time discussions about methodology are being started.

  9. Design of automatic leveling and centering system of theodolite

    Science.gov (United States)

    Liu, Chun-tong; He, Zhen-Xin; Huang, Xian-xiang; Zhan, Ying

    2012-09-01

    To realize theodolite automation and improve the azimuth angle measurement instrument, a theodolite automatic leveling and centering system with leveling error compensation is designed, covering the system solution, key component selection, the mechanical structure of leveling and centering, and the system software. The redesigned leveling feet are driven by a DC servo motor, and an electronic centering control device is installed. Using high-precision tilt sensors as horizontal skew detection sensors ensures the effectiveness of the leveling error compensation. The aiming round mark center is located using digital image processing through a surface-array CCD, and leveling measurement precision can reach the pixel level, which makes accurate centering of the theodolite possible. Finally, experiments are conducted using the automatic leveling and centering system of the theodolite. The results show the leveling and centering system can realize automatic operation with a high centering accuracy of 0.04 mm. The measurement precision of the orientation angle after leveling error compensation is improved compared with that of the traditional method. The automatic leveling and centering system of the theodolite can satisfy the requirements of measuring precision and automation.

  10. Automatic Identification System modular receiver for academic purposes

    Science.gov (United States)

    Cabrera, F.; Molina, N.; Tichavska, M.; Araña, V.

    2016-07-01

    The Automatic Identification System (AIS) standard is encompassed within the Global Maritime Distress and Safety System (GMDSS), in force since 1999. The GMDSS is a set of procedures, equipment, and communication protocols designed with the aim of increasing the safety of sea crossings, facilitating navigation, and aiding the rescue of vessels in danger. The use of this system is not only increasingly attractive for security purposes but also potentially creates intelligence products through the added-value information that this network can transmit from ships in real time (identification, position, course, speed, dimensions, flag, among others). Within the marine electronics market, commercial receivers implement this standard and allow users to access vessel-broadcasted information when in the range of coverage. In addition to satellite services, users may request actionable information from private or public AIS terrestrial networks, where real-time feeds or historical data can be accessed from their nodes. This paper describes the configuration of an AIS receiver based on a modular design. This modular design facilitates the evaluation of specific modules, a better understanding of the standard, and the possibility of changing hardware modules to improve the performance of the prototype. Thus, the aim of this paper is to describe the system's specifications and its main hardware components, and to present educational didactics on the setup and use of a modular, terrestrial AIS receiver. The latter is for academic purposes in undergraduate studies such as electrical engineering, telecommunications, and maritime studies.

  11. Automatic and Hierarchical Verification for Concurrent Systems

    Institute of Scientific and Technical Information of China (English)

    赵旭东; 冯玉琳

    1990-01-01

    Proving correctness of concurrent systems is quite difficult because of the high level of nondeterminism, especially in large and complex ones. AMC is a model checking system for verifying asynchronous concurrent systems by using branching-time temporal logic. This paper introduces the techniques of the modelling approach, especially how to construct models for large concurrent systems with the concept of hierarchy, which has proved to be effective and practical in verifying large systems without a large growth in cost.

  12. Automatic system for the determination of metals by anodic stripping potentiometry in non-deaerated samples

    OpenAIRE

    1990-01-01

    An automatic system for the determination of Zn, Cd, Pb and Cu by anodic stripping potentiometry using the oxygen dissolved in the sample as oxidant is reported. The system relies on the use of a PC-compatible computer for instrumental control and data acquisition and processing.

  13. On Learning from Collective Data

    Science.gov (United States)

    2013-12-01

    GHz AMD Opteron CPU with 64 GB RAM. We did not use the parallel implementation because it involves distributing a large amount of data and the...

  14. Research Tips: Interview Data Collection

    Science.gov (United States)

    Griffee, Dale T.

    2005-01-01

    Interviewing is a popular way of gathering qualitative research data because it is perceived as "talking," and talking is natural. This column discusses the type of interview most often used in educational evaluation: the semistructured interview. A semistructured interview means questions are predetermined, but the interviewer is free to ask for…

  15. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2012-07-01

    Full Text Available Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods, including a Gaussian Mixture Model (GMM) and Kernel Density Estimation, have been used to define the weight function of the joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate the parameters of the GMM, and in order to reduce the cost of computation, a multi-resolution strategy has been used. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated, and the results obtained are presented.
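    As background to the MI-based approach, plain (unweighted) mutual information can be estimated from the joint intensity histogram of two images; registration methods then search for the transformation that maximizes it (a minimal NumPy sketch, not the GWMI method itself; the bin count is illustrative):

    ```python
    import numpy as np

    def mutual_information(img1, img2, bins=8):
        """MI estimated from the joint intensity histogram of two equally sized images."""
        joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)      # marginal of img1
        py = pxy.sum(axis=0, keepdims=True)      # marginal of img2
        nz = pxy > 0                             # avoid log(0)
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(0)
    a = rng.random((32, 32))
    b = rng.random((32, 32))                     # unrelated image
    # A perfectly registered pair shares far more information than an unrelated one
    print(mutual_information(a, a) > mutual_information(a, b))   # True
    ```

    The GWMI extension in the paper replaces this uniform treatment of the joint probability with a weight function estimated by a GMM or kernel density estimate.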

  16. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and based on the information derived performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management, we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  17. A rapid two-dimensional data collection system for the study of ferroelectric materials under external applied electric fields.

    Science.gov (United States)

    Vergentev, Tikhon; Bronwald, Iurii; Chernyshov, Dmitry; Gorfman, Semen; Ryding, Stephanie H M; Thompson, Paul; Cernik, Robert J

    2016-10-01

    Synchrotron X-rays on the Swiss Norwegian Beamline and BM28 (XMaS) at the ESRF have been used to record the diffraction response of the PMN-PT relaxor piezoelectric 67% Pb(Mg1/3Nb2/3)O3-33% PbTiO3 as a function of externally applied electric field. A DC field in the range 0-18 kV cm(-1) was applied along the [001] pseudo-cubic direction using a specially designed sample cell for in situ single-crystal diffraction experiments. The cell allowed data to be collected on a Pilatus 2M area detector in a large volume of reciprocal space using transmission geometry. The data showed good agreement with a twinned single-phase monoclinic structure model. The results from the area detector were compared with previous Bragg peak mapping using variable electric fields and a single detector where the structural model was ambiguous. The coverage of a significantly larger section of reciprocal space facilitated by the area detector allowed precise phase analysis.

  18. A rapid two-dimensional data collection system for the study of ferroelectric materials under external applied electric fields

    Science.gov (United States)

    Vergentev, Tikhon; Bronwald, Iurii; Chernyshov, Dmitry; Gorfman, Semen; Ryding, Stephanie H. M.; Thompson, Paul; Cernik, Robert J.

    2016-01-01

    Synchrotron X-rays on the Swiss Norwegian Beamline and BM28 (XMaS) at the ESRF have been used to record the diffraction response of the PMN–PT relaxor piezoelectric 67% Pb(Mg1/3Nb2/3)O3–33% PbTiO3 as a function of externally applied electric field. A DC field in the range 0–18 kV cm−1 was applied along the [001] pseudo-cubic direction using a specially designed sample cell for in situ single-crystal diffraction experiments. The cell allowed data to be collected on a Pilatus 2M area detector in a large volume of reciprocal space using transmission geometry. The data showed good agreement with a twinned single-phase monoclinic structure model. The results from the area detector were compared with previous Bragg peak mapping using variable electric fields and a single detector where the structural model was ambiguous. The coverage of a significantly larger section of reciprocal space facilitated by the area detector allowed precise phase analysis. PMID:27738414

  19. NIDDK data repository: a central collection of clinical trial data

    Directory of Open Access Journals (Sweden)

    Hall R David

    2006-04-01

    Full Text Available Abstract Background The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) has established central repositories for the collection of DNA, biological samples, and clinical data to be catalogued at a single site. Here we present an overview of the site that stores the clinical data and links to biospecimens. Description The NIDDK Data Repository is a web-enabled resource cataloguing clinical trial data and supporting information from NIDDK-supported studies. The Data Repository allows for the co-location of multiple electronic datasets that were created as part of clinical investigations. The Data Repository does not serve the role of a Data Coordinating Center, but rather acts as a warehouse for the clinical findings once the trials have been completed. Because both biological and genetic samples are collected in many of the studies, a data management system for the cataloguing and retrieval of samples was developed. Conclusion The Data Repository provides a unique resource for researchers in the clinical areas supported by NIDDK. In addition to providing a warehouse of data, Data Repository staff work with users to educate them on the datasets as well as assist them in the acquisition of multiple datasets for cross-study analysis. Unlike the majority of biological databases, the Data Repository acts both as a catalogue for data, biosamples, and genetic materials and as a central processing point for requests for all biospecimens. Due to regulations on the use of clinical data, the ultimate release of that data is governed by NIDDK data release policies. The Data Repository serves as the conduit for such requests.

  20. Automatic control study of the icing research tunnel refrigeration system

    Science.gov (United States)

    Kieffer, Arthur W.; Soeder, Ronald H.

    1991-01-01

    The Icing Research Tunnel (IRT) at the NASA Lewis Research Center is a subsonic, closed-return atmospheric tunnel. The tunnel includes a heat exchanger and a refrigeration plant to achieve the desired air temperature and a spray system to generate the type of icing conditions that would be encountered by aircraft. At the present time, the tunnel air temperature is controlled by manual adjustment of freon refrigerant flow control valves. An upgrade of this facility calls for these control valves to be adjusted by an automatic controller. The digital computer simulation of the IRT refrigeration plant and the automatic controller that was used in the simulation are discussed.

  1. 5 CFR 890.1307 - Data collection.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Data collection. 890.1307 Section 890... Program Demonstration Project § 890.1307 Data collection. Each carrier will compile, maintain, and when requested by OPM or DoD, report data on its plan's experience necessary to produce reports containing...

  2. 24 CFR 901.100 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Data collection. 901.100 Section... PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.100 Data collection. (a) Information on some of the indicators will be derived by the State/Area Office from existing reporting and data forms. (b) A PHA...

  3. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method to select the most adaptive classifier based on the locations of the human candidates. Human regions are detected from the fish-eye image effectively and are corrected for perspective distortion. An experiment is performed on indoor video sequences with different illumination and crowded conditions, with results demonstrating the efficiency of our algorithm.

  4. Robust Fallback Scheme for the Danish Automatic Voltage Control System

    DEFF Research Database (Denmark)

    Qin, Nan; Dmitrova, Evgenia; Lund, Torsten;

    2015-01-01

    This paper proposes a fallback scheme for the Danish automatic voltage control system. It will be activated in case the local station loses telecommunication with the control center and/or the local station voltage violates the acceptable operational limits. It cuts in/out switchable and tap-abl

  5. Auditory signal design for automatic number plate recognition system

    NARCIS (Netherlands)

    Heydra, C.G.; Jansen, R.J.; Van Egmond, R.

    2014-01-01

    This paper focuses on the design of an auditory signal for the Automatic Number Plate Recognition system of Dutch national police. The auditory signal is designed to alert police officers of suspicious cars in their proximity, communicating priority level and location of the suspicious car and takin

  6. Choosing Actuators for Automatic Control Systems of Thermal Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Gorbunov, A. I., E-mail: gor@tornado.nsk.ru [JSC “Tornado Modular Systems” (Russian Federation); Serdyukov, O. V. [Siberian Branch of the Russian Academy of Sciences, Institute of Automation and Electrometry (Russian Federation)

    2015-03-15

    Two types of actuators for automatic control systems of thermal power plants are analyzed: (i) pulse-controlled actuator and (ii) analog-controlled actuator with positioning function. The actuators are compared in terms of control circuit, control accuracy, reliability, and cost.

  7. State of the art of automatic milking systems

    NARCIS (Netherlands)

    Rossing, W.; Hogewerf, P.H.

    1997-01-01

    Milking cows two or three times a day, 7 days a week, is time-consuming and a heavy load for the farmer. Many high-yielding cows enter the milking parlour with heavy udders. To be able to increase the milking frequency and to decrease the physical labour requirements, automatic milking systems are

  8. Design and Implementation of FAQ Automatic Return System Based on Similarity Computation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    FAQ (frequently asked question) is widely used on the Internet, but most FAQ asking and answering is not automatic. This paper introduces the design and implementation of a FAQ automatic return system based on semantic similarity computation, including computation model selection, FAQ characteristics analysis, formal expression of FAQ data, feature vector indexing, weight computation, and so on. According to FAQ features such as short sentence length, two-way mapping, and strong domain characteristics, a Vector Space Model with special semantic processing was selected for the system, and a corresponding similarity computation algorithm was proposed. Experiments show that the system performs well for high-frequency and common questions.
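    The vector-space similarity step can be sketched as bag-of-words cosine similarity between the user query and the stored questions (a hypothetical toy example; the paper's model adds semantic processing and term weighting not shown here):

    ```python
    import math
    from collections import Counter

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two bag-of-words term-frequency vectors."""
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def best_faq(query: str, faq: dict) -> str:
        """Return the stored question most similar to the query; its answer is returned to the user."""
        qv = Counter(query.lower().split())
        return max(faq, key=lambda q: cosine(qv, Counter(q.lower().split())))

    faq = {
        "how do i reset my password": "Use the reset link on the sign-in page.",
        "where is the login page": "See /login.",
    }
    print(best_faq("reset password help", faq))   # → how do i reset my password
    ```

    A production system would also apply a minimum-similarity cutoff so that questions matching nothing in the FAQ are routed to a human instead.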

  9. Automatic Melody Generation System with Extraction Feature

    Science.gov (United States)

    Ida, Kenichi; Kozuki, Shinichi

    In this paper, we propose a melody generation system based on the analysis of an existing melody. In addition, we introduce a device that takes the user's preferences into account. Melody generation is done by arranging pitches optimally on the given rhythm. The optimality standard is decided by using feature elements extracted from existing music by the proposed method. Moreover, the user's preferences are reflected in the standard by letting users operate on some of the feature elements. A GA then optimizes the pitch array based on the standard, realizing the system.
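    The GA step can be sketched as evolving a pitch array toward a fitness standard derived from extracted features (a toy illustration; TARGET_CONTOUR and all parameters are invented stand-ins for the paper's extracted feature standard):

    ```python
    import random

    # Invented stand-in for the "best standard" extracted from existing music
    TARGET_CONTOUR = [0, 2, 4, 5, 7, 5, 4, 2]

    def fitness(melody):
        """Distance of a pitch array from the feature standard (lower is better)."""
        return sum(abs(p - t) for p, t in zip(melody, TARGET_CONTOUR))

    def mutate(melody, rate=0.3):
        return [p + random.choice([-1, 0, 1]) if random.random() < rate else p
                for p in melody]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def evolve(pop_size=30, generations=200):
        random.seed(0)   # deterministic toy run
        pop = [[random.randrange(12) for _ in range(8)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            elite = pop[: pop_size // 2]                  # keep the better half
            pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                           for _ in range(pop_size - len(elite))]
        return min(pop, key=fitness)

    best = evolve()
    ```

    User preference could be injected by letting the user edit entries of the target contour before the run, which is the role the paper assigns to operating on feature elements.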

  10. Requirements to a Norwegian National Automatic Gamma Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Lauritzen, B.; Hedemann Jensen, P.; Nielsen, F

    2005-04-01

    An assessment of the overall requirements to a Norwegian gamma-monitoring network is undertaken with special emphasis on the geographical distribution of automatic gamma monitoring stations, type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate increments above the natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume may not slip unnoticed in between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require protective intervention measures should be detected by the system. Action levels for possible introduction of sheltering and precautionary foodstuff restrictions are derived in terms of ambient dose equivalent rate. For emergency situations where particulates contribute with only a small fraction of the total ambient dose equivalent rate from the plume, it is concluded that measurements of dose rate are sufficient to determine the need for sheltering; simple dose rate measurements however, are inadequate to determine the need for foodstuff restrictions and spectral measurements are required. (au)

  11. An Efficient Automatic Attendance System using Fingerprint Verification Technique

    OpenAIRE

    Chitresh Saraswat,; Amit Kumar

    2010-01-01

    The main aim of this paper is to develop an accurate, fast and very efficient automatic attendance system using a fingerprint verification technique. We propose a system in which fingerprint verification is done by using minutiae extraction, and which automates the whole process of taking attendance. Taking attendance manually is laborious and troublesome work that wastes a lot of time, and managing and maintaining the records over a period of time is also a burdensome task. For t...

  12. Automatic Translation of Arabic Sign to Arabic Text (ATASAT) System

    OpenAIRE

    Abdelmoty M.Ahmed; Reda Abo Alez; Muhammad Taha; Gamal Tharwat

    2016-01-01

    Sign language continues to be the preferred tool of communication between the deaf and the hearing-impaired. It is a well-structured code of hand gestures, where every gesture has a specific meaning. The goal of this paper is to develop a system for automatic translation of Arabic Sign Language to Arabic Text (ATASAT). This system acts as a translator between deaf and dumb people and normal people to enhance their communication; the...

  13. The future of the perfusion record: automated data collection vs. manual recording.

    Science.gov (United States)

    Ottens, Jane; Baker, Robert A; Newland, Richard F; Mazzone, Annette

    2005-12-01

    The perfusion record, whether manually recorded or computer generated, is a legal representation of the procedure. The handwritten perfusion record has been the most common method of recording events that occur during cardiopulmonary bypass. This record is in significant contrast to the integrated data management systems available that provide continuous collection of data automatically or by means of a few keystrokes. Additionally, an increasing number of monitoring devices are available to assist in the management of patients on bypass. These devices are becoming more complex and provide more data for the perfusionist to monitor and record. Most of the data from these can be downloaded automatically into online data management systems, allowing more time for the perfusionist to concentrate on the patient while simultaneously producing a more accurate record. In this prospective report, we compared 17 cases that were recorded using both manual and electronic data collection techniques. The perfusionist in charge of the case recorded the perfusion using the manual technique while a second perfusionist entered relevant events on the electronic record generated by the Stockert S3 Data Management System/Data Bahn (Munich, Germany). Analysis of the two types of perfusion records showed significant variations in the recorded information. Areas that showed the most inconsistency included measurement of the perfusion pressures, flow, blood temperatures, cardioplegia delivery details, and the recording of events, with the electronic record superior in the integrity of the data. In addition, the limitations of the electronic system were also shown by the lack of electronic gas flow data in our hardware. Our results confirm the importance of accurate methods of recording perfusion events. The use of an automated system provides the opportunity to minimize transcription error and bias. This study highlights the limitation of spot recording of perfusion events in the

  14. Hamster Math: Authentic Experiences in Data Collection.

    Science.gov (United States)

    Jorgensen, Beth

    1996-01-01

    Describes the data collection and interpretation project of primary grade students involving predicting, graphing, estimating, measuring, number problem construction, problem solving, and probability. (MKR)

  15. Automatic calorimetry system monitors RF power

    Science.gov (United States)

    Harness, B. W.; Heiberger, E. C.

    1969-01-01

    Calorimetry system monitors the average power dissipated in a high power RF transmitter. Sensors measure the change in temperature and the flow rate of the coolant, while a multiplier computes the power dissipated in the RF load.

  16. Automatic Tracking Evaluation and Development System (ATEDS)

    Data.gov (United States)

    Federal Laboratory Consortium — The heart of the ATEDS network consists of four SGI Octane computers running the IRIX operating system and equipped with V12 hardware graphics to support synthetic...

  17. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  18. Automatic earthquake confirmation for early warning system

    Science.gov (United States)

    Kuyuk, H. S.; Colombelli, S.; Zollo, A.; Allen, R. M.; Erdik, M. O.

    2015-07-01

    Earthquake early warning studies are shifting real-time seismology in earthquake science. They provide methods to rapidly assess earthquakes to predict damaging ground shaking. Preventing false alarms from these systems is key. Here we developed a simple, robust algorithm, Authorizing GRound shaking for Earthquake Early warning Systems (AGREEs), to reduce falsely issued alarms. This is a network threshold-based algorithm, which differs from existing approaches based on apparent velocity of P and S waves. AGREEs is designed to function as an external module to support existing earthquake early warning systems (EEWSs) and filters out the false events, by evaluating actual shaking near the epicenter. Our retrospective analyses of the 2009 L'Aquila and 2012 Emilia earthquakes show that AGREEs could help an EEWS by confirming the epicentral intensity. Furthermore, AGREEs is able to effectively identify three false events due to a storm, a teleseismic earthquake, and broken sensors in Irpinia Seismic Network, Italy.
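    A network threshold check of this kind can be sketched as: confirm the alert only if enough stations near the reported epicenter actually observe strong shaking (hypothetical Python; the station format, radius, and thresholds are illustrative stand-ins, not the AGREEs parameters):

    ```python
    import math

    def confirm_event(stations, epicenter, radius_km=50.0, pga_threshold=0.02, min_stations=2):
        """Confirm an early-warning alert only if at least `min_stations` stations
        within `radius_km` of the epicenter observe peak ground acceleration (in g)
        above `pga_threshold`."""
        def dist_km(lat1, lon1, lat2, lon2):
            # flat-earth approximation for small regions, ~111 km per degree
            return 111.0 * math.hypot(lat1 - lat2, lon1 - lon2)
        near = [s for s in stations
                if dist_km(s["lat"], s["lon"], epicenter[0], epicenter[1]) <= radius_km]
        return sum(s["pga"] >= pga_threshold for s in near) >= min_stations

    stations = [
        {"lat": 42.30, "lon": 13.40, "pga": 0.15},   # strong shaking near the epicenter
        {"lat": 42.40, "lon": 13.50, "pga": 0.08},
        {"lat": 45.00, "lon": 9.00,  "pga": 0.00},   # distant, quiet station
    ]
    print(confirm_event(stations, epicenter=(42.35, 13.45)))   # True
    ```

    Filtering on observed near-epicenter shaking is what lets such a module reject alerts triggered by storms, teleseisms, or broken sensors, which produce waveforms without matching local ground motion.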

  19. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA

    Directory of Open Access Journals (Sweden)

    Veena G.S

    2013-12-01

Full Text Available The proposed work aims to create a smart camera application that eliminates the need for a human presence to detect unwanted sinister activities, such as theft. Spread across the campus, at arbitrary locations, are certain valuable biometric identification systems. The application monitors these systems (hereafter referred to as "objects") using our smart camera system built on the OpenCV platform. Using OpenCV Haar training, which employs the Viola-Jones algorithm implementation in OpenCV, we teach the machine to identify the object under varying environmental conditions. An added face-recognition feature is based on Principal Component Analysis (PCA) to generate eigenfaces; test images are verified against the eigenfaces using a distance-based algorithm, such as Euclidean distance or the Mahalanobis distance. If the object is misplaced, or an unauthorized user comes into the immediate vicinity of the object, an alarm signal is raised.
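
The eigenface verification step described above can be sketched as follows. This is a minimal illustration with invented data, dimensions and threshold; a real system would train on aligned grayscale face crops and tune the threshold on validation data.

```python
import numpy as np

# Eigenface-style verification sketch: project images onto principal
# components ("eigenfaces") and accept a test image only if its Euclidean
# distance to a known face, in that subspace, falls under a threshold.

rng = np.random.default_rng(0)
train = rng.normal(size=(20, 64))            # 20 flattened training "faces"
mean = train.mean(axis=0)
# Rows of vt are orthonormal principal directions of the centered data
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
eigenfaces = vt[:5]                          # keep the top 5 components

def project(img):
    return eigenfaces @ (img - mean)

def is_match(test_img, known_img, threshold=5.0):
    return np.linalg.norm(project(test_img) - project(known_img)) < threshold
```

The Mahalanobis variant mentioned in the abstract additionally weights each component by its variance before computing the distance.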

  20. Semi-automatic aircraft control system

    Science.gov (United States)

    Gilson, Richard D. (Inventor)

    1978-01-01

    A flight control type system which provides a tactile readout to the hand of a pilot for directing elevator control during both approach to flare-out and departure maneuvers. For altitudes above flare-out, the system sums the instantaneous coefficient of lift signals of a lift transducer with a generated signal representing ideal coefficient of lift for approach to flare-out, i.e., a value of about 30% below stall. Error signals resulting from the summation are read out by the noted tactile device. Below flare altitude, an altitude responsive variation is summed with the signal representing ideal coefficient of lift to provide error signal readout.

  1. Automatic land vehicle navigation using road map data

    Energy Technology Data Exchange (ETDEWEB)

    Schindwolf, R.

    1984-06-01

    A land navigation system has been developed that provides accurate navigation data while it is traveling on mapped roads. The system is autonomous and consists of a simple dead-reckoning navigator that is updated with stored road map data. Simulation and preliminary test results indicate that accuracies on the order of 50 feet can be achieved. Accuracy is independent of time.
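
The update cycle described above can be sketched as follows: position is propagated by dead reckoning (distance travelled plus heading), then periodically snapped to the nearest point of a stored road segment (map matching). The road geometry and sensor values below are invented for illustration.

```python
import math

def dead_reckon(pos, heading_deg, distance):
    """Advance pos by `distance` along a compass heading (0 = north, 90 = east)."""
    x, y = pos
    h = math.radians(heading_deg)
    return (x + distance * math.sin(h), y + distance * math.cos(h))

def snap_to_road(pos, road_p0, road_p1):
    """Project pos onto the road segment road_p0-road_p1 (the map update)."""
    (x, y), (x0, y0), (x1, y1) = pos, road_p0, road_p1
    dx, dy = x1 - x0, y1 - y0
    t = ((x - x0) * dx + (y - y0) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                # clamp to the segment ends
    return (x0 + t * dx, y0 + t * dy)

pos = dead_reckon((0.0, 0.0), 90.0, 100.0)           # drive 100 m due east
pos = snap_to_road(pos, (0.0, 5.0), (200.0, 5.0))    # road runs east, 5 m north
```

Snapping bounds the cross-track error accumulated by the dead-reckoning sensors, which is what keeps the reported accuracy independent of time.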

  2. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    Science.gov (United States)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.

  3. Automatic speed management systems : great safety potential ?

    NARCIS (Netherlands)

    Oei, H.-l.

    1992-01-01

    An account is given of speed management experiments carried out in The Netherlands on four 2-lane rural roads with a speed limit of 80 km/h. The experiment involved an information campaign, warning signs and a radar camera system. Fixed signs advised a speed of between 60 and 80 km/h and an automati

  4. Image Control In Automatic Welding Vision System

    Science.gov (United States)

    Richardson, Richard W.

    1988-01-01

    Orientation and brightness varied to suit welding conditions. Commands from vision-system computer drive servomotors on iris and Dove prism, providing proper light level and image orientation. Optical-fiber bundle carries view of weld area as viewed along axis of welding electrode. Image processing described in companion article, "Processing Welding Images for Robot Control" (MFS-26036).

  5. 48 CFR 26.303 - Data collection and reporting requirements.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Data collection and reporting requirements. 26.303 Section 26.303 Federal Acquisition Regulations System FEDERAL ACQUISITION... and Minority Institutions 26.303 Data collection and reporting requirements. Executive Order...

  6. NW-MILO Acoustic Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Matzner, Shari; Myers, Joshua R.; Maxwell, Adam R.; Jones, Mark E.

    2010-02-17

There is an enduring requirement to improve our ability to detect potential threats and discriminate these from the legitimate commercial and recreational activity ongoing in the nearshore/littoral portion of the maritime domain. The Northwest Maritime Information and Littoral Operations (NW-MILO) Program at PNNL's Coastal Security Institute in Sequim, Washington is establishing a methodology to detect and classify these threats, in part through developing a better understanding of acoustic signatures in a near-shore environment. The purpose of the acoustic data collection described here is to investigate the acoustic signatures of small vessels. The data is being recorded continuously, 24 hours a day, along with radar track data and imagery. The recording began in August 2008, and to date the data contains tens of thousands of signals from small vessels recorded in a variety of environmental conditions. The quantity and variety of this data collection, with the supporting imagery and radar track data, makes it particularly useful for the development of robust acoustic signature models and advanced algorithms for signal classification and information extraction. The underwater acoustic sensing system is part of a multi-modal sensing system operating near the mouth of Sequim Bay. Sequim Bay opens onto the Strait of Juan de Fuca, which contains part of the border between the U.S. and Canada. Table 1 lists the specific components used for the NW-MILO system. The acoustic sensor is a hydrophone permanently deployed at a mean depth of about 3 meters. In addition to the hydrophone, the other sensors in the system are a marine radar, an electro-optical (EO) camera and an infra-red (IR) camera. The radar is integrated with a vessel tracking system (VTS) that provides position, speed and heading information. The data from all the sensors is recorded and saved to a central server. The data has been validated in terms of its usability for characterizing the

  7. Automatic vehicle counting system for traffic monitoring

    Science.gov (United States)

    Crouzil, Alain; Khoudour, Louahdi; Valiere, Paul; Truong Cong, Dung Nghy

    2016-09-01

The article is dedicated to the presentation of a vision-based system for road vehicle counting and classification. The system achieves counting with very good accuracy even in difficult scenarios involving occlusions and/or the presence of shadows. The principle of the system is to use cameras already installed in road networks, without any additional calibration procedure. We propose a robust segmentation algorithm that detects foreground pixels corresponding to moving vehicles. First, the approach models each pixel of the background with an adaptive Gaussian distribution. This model is coupled with a motion detection procedure, which allows moving vehicles to be correctly located in space and time. The nature of the trials carried out, including peak periods and various vehicle types, leads to an increase in occlusions between cars, and between cars and trucks. A specific method for severe occlusion detection, based on the notion of solidity, has been developed and tested. Furthermore, the method developed in this work is capable of managing shadows at high resolution. The related algorithm has been tested and compared to a classical method. Experimental results based on four large datasets show that our method can count and classify vehicles in real time with a high level of performance (>98%) under different environmental situations, thus performing better than conventional inductive loop detectors.
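
The per-pixel adaptive Gaussian background model can be sketched as follows. This is a minimal running-Gaussian variant with invented parameters (alpha, k, frame values), not the authors' exact formulation: each pixel keeps a mean and variance, a pixel is flagged foreground when it deviates by more than k standard deviations, and the model adapts only where the scene still looks like background.

```python
import numpy as np

class RunningGaussianBackground:
    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 15.0 ** 2)   # initial variance
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        d = frame.astype(float) - self.mean
        foreground = np.abs(d) > self.k * np.sqrt(self.var)
        bg = ~foreground                     # adapt background pixels only
        self.mean[bg] += self.alpha * d[bg]
        self.var[bg] += self.alpha * (d[bg] ** 2 - self.var[bg])
        return foreground

frames = [np.full((4, 4), 100.0) for _ in range(10)]   # static street scene
model = RunningGaussianBackground(frames[0])
for f in frames[1:]:
    model.apply(f)
moving = frames[0].copy()
moving[1:3, 1:3] = 250.0                   # a bright "vehicle" enters
mask = model.apply(moving)                 # True only on the vehicle pixels
```

Updating only background pixels prevents a slow-moving vehicle from being absorbed into the model, which is one reason this family of methods pairs well with the motion-detection step the abstract mentions.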

  8. Automatic outdoor monitoring system for photovoltaic panels.

    Science.gov (United States)

    Stefancich, Marco; Simpson, Lin; Chiesa, Matteo

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
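
The paper does not spell out its tracking algorithm, so the following is a hedged sketch of perturb-and-observe (P&O), one common maximum-power-point tracking approach such an instrument might implement. The P-V curve is a toy model, not real panel data.

```python
def panel_power(v):
    """Toy P-V curve with its maximum a little above v = 17 V."""
    return max(0.0, v * (3.0 - 0.09 * (v - 17.0) ** 2))

def track_mpp(v0=12.0, step=0.2, iters=200):
    """Perturb the operating voltage; reverse direction when power drops."""
    v, p_prev, direction = v0, panel_power(v0), 1.0
    for _ in range(iters):
        v += direction * step
        p = panel_power(v)
        if p < p_prev:
            direction = -direction   # overshot the peak: walk back
        p_prev = p
    return v

v_mpp = track_mpp()   # settles into a small oscillation around the peak
```

The persistent small oscillation around the peak is characteristic of P&O, and is exactly the kind of behavior the instrument's periodic current-voltage sweeps would reveal when verifying a tracking approach.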

  10. The anemodata 1-IIE. Automatic system for wind data acquisition; El anemodata 1-IIE. Sistema automatico para la adquisicion de datos de viento

    Energy Technology Data Exchange (ETDEWEB)

    Borja, Marco Antonio; Parkman Cuellar, Pablo A. [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1986-12-31

Wind is an inexhaustible energy source. Studying its behavior in order to develop research projects and apply new technologies for its fullest exploitation is one of the activities carried out at the Instituto de Investigaciones Electricas (IIE). As part of these activities, the Anemodata-1-IIE was designed and built for the acquisition of wind velocity and direction data. The Anemodata-1-IIE is a result of the work that the Departamento de Fuentes no Convencionales (Non-Conventional Energy Sources Department) of the Energy Sources Division carries out on the development of electric equipment for anemometry.

  11. Automatic computer-aided system of simulating solder joint formation

    Science.gov (United States)

    Zhao, Xiujuan; Wang, Chunqing; Zheng, Guanqun; Wang, Gouzhong; Yang, Shiqin

    1999-08-01

One critical aspect of electronic packaging is fatigue/creep-induced failure in solder interconnections, which is found to be highly dependent on the shape of solder joints. Predicting and analyzing the solder joint shape is therefore warranted. In this paper, an automatic computer-aided system is developed to simulate the formation of solder joints and analyze the influence of different process parameters on the solder joint shape. The developed system is capable of visually designing the process parameters and calculating the solder joint shape automatically, without any intervention from the user. The automation achieved enables fast shape estimation as process parameters vary, without time-consuming experiments, and the simulation system provides design and manufacturing engineers with an efficient software tool for designing the soldering process in a design environment. Moreover, a program developed from the system can serve as the preprocessor for a subsequent finite element joint analysis program.

  12. A Reference Implementation of a Generic Automatic Pointing System

    Science.gov (United States)

    Staig, T.; Tobar, R.; Araya, M. A.; Guajardo, C.; von Brand, H. H.

    2009-09-01

Correcting every existent observation error is impossible. Nevertheless, the approach taken should be the best possible one. Even though there are a huge number of problems to solve, if one knows how much they affect the observation under given conditions, then it is possible to observe as desired by counteracting these deviations. Automatic pointing adjustments help us do this by providing mathematical support to model the perturbations, and therefore the deviations. This paper presents a generic open-source pointing system developed by the ALMA-UTFSM team, intended to work with the gTCS. The pointing system includes several physical terms, terms with spherical harmonics, and user-customised terms, which allow the generation of pointing models in a generic way. Accurate results have been obtained with test data. Graphical support is also included in our work and helps show the variation between experimental and theoretical values of several variables in relation to different coordinates. Because it is open source, the system can be easily integrated into a TCS, automating the pointing calibration process for a given telescope and enabling the interesting, previously unseen functionality of changing the pointing model while observing.

  13. Audio watermarking technologies for automatic cue sheet generation systems

    Science.gov (United States)

    Caccia, Giuseppe; Lancini, Rosa C.; Pascarella, Annalisa; Tubaro, Stefano; Vicario, Elena

    2001-08-01

A watermark is usually used as a way of hiding information in digital media. The watermarked information may be used for copyright protection or for user and media identification. In this paper we propose a watermarking scheme for digital audio signals that allows automatic identification of musical pieces transmitted in TV broadcasting programs. In our application the watermark must obviously be imperceptible to the users, should be robust to standard TV and radio editing, and must have very low complexity. This last requirement is essential to allow a software real-time implementation of watermark insertion and detection using only a minimal amount of the computational power of a modern PC. In the proposed method the input audio sequence is subdivided into frames. For each frame a watermark spread-spectrum sequence is added to the original data. A two-step filtering procedure is used to generate the watermark from a pseudo-noise (PN) sequence. The filters approximate, respectively, the threshold masking and the frequency masking of the Human Auditory System (HAS). In the paper we discuss first the watermark embedding system and then the detection approach. The results of a large set of subjective tests are also presented to demonstrate the quality and robustness of the proposed approach.
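
The core spread-spectrum embed/detect cycle can be sketched as follows, without the two psychoacoustic (HAS) shaping filters the paper applies. Frame length, strength and threshold are illustrative values, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
frame_len = 4096
pn = rng.choice([-1.0, 1.0], size=frame_len)       # pseudo-noise sequence

def embed(frame, strength=0.02):
    # Add a scaled copy of the PN sequence to the audio frame
    return frame + strength * pn

def detect(frame, threshold=0.01):
    # Embedding shifts the normalized correlation with the known PN
    # sequence by about `strength`; unmarked audio stays near zero.
    return float(np.dot(frame, pn)) / frame_len > threshold

audio = rng.normal(scale=0.1, size=frame_len)      # stand-in audio frame
marked = embed(audio)
```

In the actual scheme the PN sequence is first shaped by the HAS filters so the added energy hides under the masking threshold of the host audio, which is what keeps the watermark imperceptible.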

  14. 46 CFR 112.01-10 - Automatic emergency lighting and power system.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Automatic emergency lighting and power system. 112.01-10... EMERGENCY LIGHTING AND POWER SYSTEMS Definitions of Emergency Lighting and Power Systems § 112.01-10 Automatic emergency lighting and power system. An automatic emergency lighting and power system is one...

  15. Automatic detection and segmentation of lymph nodes from CT data.

    Science.gov (United States)

    Barbu, Adrian; Suehling, Michael; Xu, Xun; Liu, David; Zhou, S Kevin; Comaniciu, Dorin

    2012-02-01

Lymph nodes are assessed routinely in clinical practice and their size is followed throughout radiation or chemotherapy to monitor the effectiveness of cancer treatment. This paper presents a robust learning-based method for automatic detection and segmentation of solid lymph nodes from CT data, with the following contributions. First, it presents a learning-based approach to solid lymph node detection that relies on marginal space learning to achieve a great speedup with virtually no loss in accuracy. Second, it presents a computationally efficient segmentation method for solid lymph nodes (LN). Third, it introduces two new sets of features that are effective for LN detection: one that self-aligns to high gradients and another obtained from the segmentation result. The method is evaluated for axillary LN detection on 131 volumes containing 371 LN, yielding an 83.0% detection rate with 1.0 false positive per volume. It is further evaluated for pelvic and abdominal LN detection on 54 volumes containing 569 LN, yielding an 80.0% detection rate with 3.2 false positives per volume. The running time is 5-20 s per volume for axillary areas and 15-40 s for pelvic and abdominal areas. An added benefit of the method is the capability to detect and segment conglomerated lymph nodes.

  16. Robust parameter design for automatically controlled systems and nanostructure synthesis

    Science.gov (United States)

    Dasgupta, Tirthankar

    2007-12-01

This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, so an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration, under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte Carlo simulations.
The second part of the research deals with development of an experimental design methodology, tailor

  17. SeaBuoySoft – an On-line Automated Windows based Ocean Wave height Data Acquisition and Analysis System for Coastal Field’s Data Collection

    Directory of Open Access Journals (Sweden)

    P.H.Tarudkar

    2014-12-01

Full Text Available Measurement of hydraulic parameters such as wave heights, for both research and practical purposes in coastal fields, is a critical and challenging but equally important task in ocean engineering for the design and development of hydraulic structures such as sea walls, breakwaters, oil jetties, fisheries harbours and embankments, and for ship manoeuvring and berthing at jetties. This paper describes the development of the "SeaBuoySoft" online software system for wave height data collection in coastal fields. The system is installed at the shore along with the associated hardware: a digital Waverider receiver unit and a Waverider buoy. Ocean wave height data transmitted by the Waverider buoy, installed in the shallow/offshore waters of the sea, are received by the digital Waverider receiver unit and passed to the SeaBuoySoft software. The design and development of the software system was carried out in-house at the Central Water and Power Research Station, Pune, India. The software is a Windows-based standalone application, unique of its kind for the reception of real-time ocean wave height data; it stores the wave height data locally for further analysis as and when required. The system acquires real-time ocean wave height data round the clock, requiring no operator intervention during the data acquisition process on site.

  18. 34 CFR 303.176 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.176 Section 303.176 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... Data collection. Each application must include procedures that meet the requirements in §...

  19. The Transformed Civil Rights Data Collection (CRDC)

    Science.gov (United States)

    Office for Civil Rights, US Department of Education, 2012

    2012-01-01

    Since 1968, the Civil Rights Data Collection (CRDC) has collected data on key education and civil rights issues in our nation's public schools for use by the Department of Education's Office for Civil Rights (OCR), other Department offices, other federal agencies, and by policymakers and researchers outside of the Department. The CRDC has…

  20. An Automatic Number Plate Recognition System under Image Processing

    OpenAIRE

    Sarbjit Kaur

    2016-01-01

Automatic Number Plate Recognition (ANPR) is an application of computer vision and image processing technology that takes a photograph of a vehicle as an input image, extracts the number plate from the whole vehicle image, and displays the number plate information as text. The ANPR system mainly consists of four phases: acquisition of the vehicle image and pre-processing, extraction of the number plate area, character segmentation, and character recognition. The overall accuracy and efficiency of whol...
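
The character segmentation phase listed above is often done with a vertical projection profile, sketched below on a synthetic binary "plate": characters are split wherever the per-column ink count drops to zero. A real pipeline would first binarize and deskew the plate crop; the array here is invented for illustration.

```python
import numpy as np

def segment_columns(binary):
    """Return (start, end) column ranges that contain ink."""
    profile = binary.sum(axis=0)            # ink per column
    segments, start = [], None
    for i, v in enumerate(profile):
        if v > 0 and start is None:
            start = i                       # a character begins
        elif v == 0 and start is not None:
            segments.append((start, i))     # a character ends
            start = None
    if start is not None:
        segments.append((start, len(profile)))
    return segments

plate = np.zeros((5, 12), dtype=int)
plate[:, 1:3] = 1       # first "character"
plate[:, 5:8] = 1       # second "character"
plate[:, 10:11] = 1     # third "character"
segments = segment_columns(plate)           # [(1, 3), (5, 8), (10, 11)]
```

Each returned column range is then cropped and passed to the character recognizer in the final phase.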

  1. Semi-automatic Story Creation System in Ubiquitous Sensor Environment

    Science.gov (United States)

    Yoshioka, Shohei; Hirano, Yasushi; Kajita, Shoji; Mase, Kenji; Maekawa, Takuya

    This paper proposes an agent system that semi-automatically creates stories about daily events detected by ubiquitous sensors and posts them to a weblog. The story flow is generated from query-answering interaction between sensor room inhabitants and a symbiotic agent. The agent questions the causal relationships among daily events to create the flow of the story. Preliminary experimental results show that the stories created by our system help users understand daily events.

  2. Methods for Using Ground-Water Model Predictions to Guide Hydrogeologic Data Collection, with Applications to the Death Valley Regional Ground-Water Flow System

    Energy Technology Data Exchange (ETDEWEB)

Claire R. Tiedeman; M.C. Hill; F.A. D'Agnese; C.C. Faunt

    2001-07-31

Calibrated models of ground-water systems can provide substantial information for guiding data collection. This work considers using such models to guide hydrogeologic data collection for improving model predictions, by identifying the model parameters that are most important to the predictions. Identification of these important parameters can help guide collection of field data about parameter values and associated flow-system features, leading to improved predictions. Methods for identifying parameters important to predictions include prediction scaled sensitivities (PSS), which account for the uncertainty of individual parameters as well as prediction sensitivity to parameters, and a new "value of improved information" (VOII) method, which includes the effects of parameter correlation in addition to individual parameter uncertainty and prediction sensitivity. The PSS and VOII methods are demonstrated using a model of the Death Valley regional ground-water flow system. The predictions of interest are advective-transport paths originating at sites of past underground nuclear testing. Results show that for the two paths evaluated, the most important parameters include a subset of five or six of the 23 defined model parameters. Some of the parameters identified as most important are associated with flow-system attributes that do not lie in the immediate vicinity of the paths. Results also indicate that the PSS and VOII methods can identify different important parameters. Because the methods emphasize somewhat different criteria for parameter importance, it is suggested that parameters identified by both methods be carefully considered in subsequent data collection efforts aimed at improving model predictions.
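
The PSS idea can be illustrated with a toy calculation: each entry of the prediction Jacobian is scaled by the corresponding parameter's standard deviation, so parameters can be ranked by their importance to the predictions. The Jacobian and uncertainties below are invented numbers, not Death Valley model values.

```python
import numpy as np

def pss(jacobian, param_sd):
    """jacobian[i, j] = d(prediction_i)/d(parameter_j); param_sd[j] = sd of parameter j."""
    return jacobian * param_sd            # sd broadcasts across prediction rows

J = np.array([[2.0, 0.1, 0.6],            # sensitivities of prediction 1
              [0.3, 4.0, 0.2]])           # sensitivities of prediction 2
sd = np.array([0.5, 0.05, 2.0])           # parameter standard deviations
scores = np.abs(pss(J, sd))
# Rank parameters by their largest scaled influence on any prediction
ranking = np.argsort(scores.max(axis=0))[::-1]    # most important first
```

Note how the third parameter, despite a modest raw sensitivity, ranks first because of its large uncertainty; the VOII method refines this further by accounting for parameter correlation, which a per-parameter scaling like this cannot capture.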

  3. A New Approach for Realistic 3D Reconstruction of Planar Surfaces from Laser Scanning Data and Imagery Collected Onboard Modern Low-Cost Aerial Mapping Systems

    Directory of Open Access Journals (Sweden)

    Zahra Lari

    2017-02-01

Full Text Available Over the past few years, accurate 3D surface reconstruction using remotely-sensed data has been recognized as a prerequisite for different mapping, modelling, and monitoring applications. To fulfill the needs of these applications, the necessary data are generally collected using various digital imaging systems. Among them, laser scanners have been acknowledged as a fast, accurate, and flexible technology for the acquisition of high-density 3D spatial data. Despite being quick to acquire, the 3D data collected by these systems do not provide semantic information about the nature of the scanned surfaces. Hence, reliable processing techniques are employed to extract the information required for 3D surface reconstruction. Moreover, the information extracted from laser scanning data cannot be effectively utilized due to the lack of descriptive details. In order to provide a more realistic and accurate perception of scenes scanned with laser scanning systems, a new approach for the 3D reconstruction of planar surfaces is introduced in this paper. This approach aims to improve the interpretability of the planar surfaces extracted from laser scanning data using spectral information from overlapping imagery collected onboard modern low-cost aerial mapping systems, which are widely adopted nowadays. In this approach, the planar surfaces scanned by laser scanning systems are initially extracted through a novel segmentation procedure, and then textured using the acquired overlapping imagery. The implemented texturing technique, which aims to overcome the computational inefficiency of previously developed 3D reconstruction techniques, is performed in three steps. In the first step, the visibility of the extracted planar surfaces within the collected images is investigated and a list of appropriate images for texturing each surface is established. Subsequently, an occlusion detection procedure is carried out to identify the

  4. Automatically Discovering Relaxed Lyapunov Functions for Polynomial Dynamical Systems

    CERN Document Server

    Liu, Jiang; Zhao, Hengjun

    2011-01-01

The notion of a Lyapunov function plays a key role in the design and verification of dynamical systems, as well as hybrid and cyber-physical systems. In this paper, to analyze the asymptotic stability of a dynamical system, we generalize standard Lyapunov functions to relaxed Lyapunov functions (RLFs) by considering higher-order Lie derivatives of certain functions along the system's vector field. Furthermore, we present a complete method for automatically discovering polynomial RLFs for polynomial dynamical systems (PDSs). Our method is complete in the sense that it is able to discover all polynomial RLFs by enumerating all polynomial templates for any PDS.
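
As a concrete anchor for the standard (first-order) condition that RLFs generalize, here is a numerical spot-check on an invented polynomial system; grid sampling only suggests stability, whereas the paper's symbolic method is complete over polynomial templates.

```python
import itertools

# For the polynomial system
#     dx/dt = -x**3 - y,    dy/dt = x - y**3,
# the candidate V = x**2 + y**2 has first Lie derivative
#     V_dot = 2*x*(-x**3 - y) + 2*y*(x - y**3) = -2*(x**4 + y**4) <= 0,
# which we verify on a grid of sample points.

def v(x, y):
    return x * x + y * y

def v_dot(x, y):
    fx, fy = -x**3 - y, x - y**3        # the system's vector field
    return 2 * x * fx + 2 * y * fy      # grad(V) . f  (first Lie derivative)

grid = [i / 10.0 for i in range(-10, 11)]
nonpositive_everywhere = all(
    v_dot(x, y) <= 1e-12 for x, y in itertools.product(grid, grid)
)
```

The relaxation in the paper admits candidates whose first Lie derivative is not sign-definite, compensating with higher-order Lie derivatives; a symbolic engine rather than grid evaluation is what makes that check sound.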

  5. Waste collection systems for recyclables

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Merrild, Hanna Kristina; Møller, Jacob

    2010-01-01

    Recycling of paper and glass from household waste is an integrated part of waste management in Denmark; however, increased recycling is a legislative target. The questions are: how much more can the recycling rate be increased through improvements of collection schemes when organisational and technical limitations are respected, and what will the environmental and economic consequences be? This was investigated in a case study of a municipal waste management system. Five scenarios with alternative collection systems for recyclables (paper, glass, metal and plastic packaging) were assessed by means of a life cycle assessment and an assessment of the municipality's costs. Kerbside collection would provide the highest recycling rate, 31% compared to 25% in the baseline scenario, but bring schemes with drop-off containers would also be a reasonable solution. Collection of recyclables...

  6. The use of the Global Positioning System for real-time data collecting during ecological aerial surveys in the Kruger National Park, South Africa

    Directory of Open Access Journals (Sweden)

    P.C. Viljoen

    1994-09-01

    Full Text Available The use of the Global Positioning System (GPS) for real-time data collection during ecological aerial surveys (EAS) in the Kruger National Park (KNP) was investigated as an alternative to post-survey manual data capture. Results obtained during an aerial census of large herbivores and surface-water distribution in the northern part of the KNP, using an onboard GPS connected to a palmtop computer, are discussed. This relatively inexpensive system proved to be highly efficient for real-time data capture, while additional information such as ground velocity and time can be recorded for every data point. Measured distances between a ground marker and fix points recorded during a flight (mean = 60.0 m) are considered to be well within the requirements of the EAS.

  7. A centralised remote data collection system using automated traps for managing and controlling the population of the Mediterranean (Ceratitis capitata) and olive (Dacus oleae) fruit flies

    Science.gov (United States)

    Philimis, Panayiotis; Psimolophitis, Elias; Hadjiyiannis, Stavros; Giusti, Alessandro; Perelló, Josep; Serrat, Albert; Avila, Pedro

    2013-08-01

    The present paper describes the development of a novel monitoring system (the e-FlyWatch system) for managing and controlling the population of two of the world's most destructive fruit pests, namely the olive fruit fly (Bactrocera oleae, Rossi, formerly Dacus oleae) and the Mediterranean fruit fly (Ceratitis capitata, also called medfly). The monitoring system consists of a) novel automated traps with optical and motion-detection modules for capturing the flies, b) local stations including a GSM/GPRS module, sensors, flash memory, battery, antenna, etc., and c) a central station that collects, stores and publishes the results (i.e. insect population in each field, sensor data, possible error/alarm data) via web-based management software. The centralised data-collection system also provides analysis and prediction models, end-user warning modules and historical analysis of infested areas. The e-FlyWatch system enables SMEs and producers in the fruit, vegetable and olive sectors to improve their production and to reduce the amount of insecticides/pesticides used and, consequently, the labour costs for spraying activities and trap inspection.

  8. ACRF Data Collection and Processing Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, M; Egan, D

    2004-12-01

    We present a description of the data flow from measurement to long-term archive. We also discuss data communications infrastructure. The data handling processes presented include collection, transfer, ingest, quality control, creation of Value-Added Products (VAP), and data archiving.

  9. 4D measurement system for automatic location of anatomical structures

    Science.gov (United States)

    Witkowski, Marcin; Sitnik, Robert; Kujawińska, Małgorzata; Rapp, Walter; Kowalski, Marcin; Haex, Bart; Mooshake, Sven

    2006-04-01

    Orthopedics and neurosciences are fields of medicine where the analysis of objective movement parameters is extremely important for clinical diagnosis. Moreover, as there are significant differences between static and dynamic parameters, there is a strong need to analyze the anatomical structures under functional conditions. In clinical gait analysis the benefits of kinematical methods are undoubted. In this paper we present a 4D (3D + time) measurement system capable of automatically locating selected anatomical structures by tracing the structures' position and orientation in time. The presented system is designed to help a general practitioner in diagnosing selected lower-limb dysfunctions (e.g. knee injuries) and also to determine whether a patient should be directed for further examination (e.g. x-ray or MRI). The measurement system consists of hardware and software. For the hardware part we adapt the laser triangulation method. In this way we can evaluate functional and dynamic movements in a contact-free, non-invasive way, without the use of potentially harmful radiation. Furthermore, in contrast to marker-based video-tracking systems, no preparation time is required. The software part consists of a data acquisition module, an image-processing and point-cloud calculation module (a point cloud is a set of points described by coordinates (x, y, z)), a preliminary processing module, a feature-searching module and an external biomechanical module. The paper briefly presents the modules mentioned above with a focus on the feature-searching module. We also present some measurement and analysis results, including parameter maps, landmark trajectories in time sequence and an animation of a simplified model of the lower limbs.

  10. Cloud-Based Smart Health Monitoring System for Automatic Cardiovascular and Fall Risk Assessment in Hypertensive Patients.

    Science.gov (United States)

    Melillo, P; Orrico, A; Scala, P; Crispino, F; Pecchia, L

    2015-10-01

    The aim of this paper is to describe the design and the preliminary validation of a platform developed to collect and automatically analyze biomedical signals for risk assessment of vascular events and falls in hypertensive patients. This m-health platform, based on cloud computing, was designed to be flexible, extensible and transparent, and to provide proactive remote monitoring via data-mining functionalities. A retrospective study was conducted to train and test the platform. The developed system was able to predict a future vascular event within the next 12 months with an accuracy rate of 84% and to identify fallers with an accuracy rate of 72%. In an ongoing prospective trial, almost all the recruited patients accepted the system favorably, with a limited rate of non-adherence causing data losses (<20%). The developed platform supported clinical decisions by processing tele-monitored data and providing quick and accurate risk assessment of vascular events and falls.

  11. National Ignition Facility sub-system design requirements automatic alignment system SSDR 1.5.5

    Energy Technology Data Exchange (ETDEWEB)

    VanArsdall, P.; Bliss, E.

    1996-09-01

    This System Design Requirement document establishes the performance, design, development, and test requirements for the Automatic Alignment System, which is part of the NIF Integrated Computer Control System (ICCS).

  12. Automatic Scheduling and Planning (ASAP) in future ground control systems

    Science.gov (United States)

    Matlin, Sam

    1988-01-01

    This report describes two complementary approaches to the problem of space mission planning and scheduling. The first is an Expert System or Knowledge-Based System for automatically resolving most of the activity conflicts in a candidate plan. The second is an Interactive Graphics Decision Aid to assist the operator in manually resolving the residual conflicts which are beyond the scope of the Expert System. The two system designs are consistent with future ground control station activity requirements, support activity timing constraints, resource limits and activity priority guidelines.

  13. Aircraft automatic flight control system with model inversion

    Science.gov (United States)

    Smith, G. A.; Meyer, George

    1990-01-01

    A simulator study was conducted to verify the advantages of a Newton-Raphson model-inversion technique as a design basis for an automatic trajectory control system in an aircraft with highly nonlinear characteristics. The simulation employed a detailed mathematical model of the aerodynamic and propulsion system performance characteristics of a vertical-attitude takeoff and landing tactical aircraft. The results obtained confirm satisfactory control system performance over a large portion of the flight envelope. System response to wind gusts was satisfactory for various plausible combinations of wind magnitude and direction.

  14. FULLY AUTOMATIC IMAGE-BASED REGISTRATION OF UNORGANIZED TLS DATA

    Directory of Open Access Journals (Sweden)

    M. Weinmann

    2012-09-01

    Full Text Available The estimation of the transformation parameters between different point clouds is still a crucial task as it is usually followed by scene reconstruction, object detection or object recognition. Therefore, the estimates should be as accurate as possible. Recent developments show that it is feasible to utilize both the measured range information and the reflectance information sampled as image, as 2D imagery provides additional information. In this paper, an image-based registration approach for TLS data is presented which consists of two major steps. In the first step, the order of the scans is calculated by checking the similarity of the respective reflectance images via the total number of SIFT correspondences between them. Subsequently, in the second step, for each SIFT correspondence the respective SIFT features are filtered with respect to their reliability concerning the range information and projected to 3D space. Combining the 3D points with 2D observations on a virtual plane yields 3D-to-2D correspondences from which the coarse transformation parameters can be estimated via a RANSAC-based registration scheme including the EPnP algorithm. After this coarse registration, the 3D points are again checked for consistency by using constraints based on the 3D distance, and, finally, the remaining 3D points are used for an ICP-based fine registration. Thus, the proposed methodology provides a fast, reliable, accurate and fully automatic image-based approach for the registration of unorganized point clouds without the need of a priori information about the order of the scans, the presence of regular surfaces or human interaction.
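The RANSAC-based registration scheme mentioned above follows the usual hypothesize-and-verify loop. Below is a toy 2D analogue: the paper estimates full 3D pose with the EPnP algorithm, whereas this sketch only illustrates the sample-score-keep-best structure on a 2D rigid transform, with all data invented for illustration.

```python
import math
import random

def rigid_from_pair(p, q, P, Q):
    """2D rotation + translation mapping point pair (p, q) onto (P, Q)."""
    ang = (math.atan2(Q[1] - P[1], Q[0] - P[0])
           - math.atan2(q[1] - p[1], q[0] - p[0]))
    ang = math.atan2(math.sin(ang), math.cos(ang))  # wrap angle to (-pi, pi]
    c, s = math.cos(ang), math.sin(ang)
    return ang, P[0] - (c * p[0] - s * p[1]), P[1] - (s * p[0] + c * p[1])

def apply_t(t, p):
    ang, tx, ty = t
    c, s = math.cos(ang), math.sin(ang)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def ransac_rigid(src, dst, iters=200, tol=0.05, seed=1):
    """Hypothesize a transform from a random minimal sample (2 pairs),
    score it by inlier count, and keep the best hypothesis."""
    rng = random.Random(seed)
    best, best_inl = None, -1
    for _ in range(iters):
        i, j = rng.sample(range(len(src)), 2)
        t = rigid_from_pair(src[i], src[j], dst[i], dst[j])
        inl = sum(math.dist(apply_t(t, s), d) < tol for s, d in zip(src, dst))
        if inl > best_inl:
            best, best_inl = t, inl
    return best, best_inl

# 8 true correspondences under a known transform, plus 2 gross outliers
true_t = (0.3, 1.0, 2.0)
src = [(0, 0), (1, 0), (0, 1), (2, 1), (1, 2), (3, 0), (0, 3), (2, 2),
       (4, 4), (5, 1)]
dst = [apply_t(true_t, p) for p in src[:8]] + [(10.0, 10.0), (-5.0, 3.0)]
t, inliers = ransac_rigid(src, dst)
print(inliers)  # the 8 true correspondences fit the recovered transform
```

As in the paper's pipeline, the coarse estimate found this way would then be polished by a fine-registration step (ICP there; omitted here).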

  15. Intelligent E-Learning Systems: Automatic Construction of Ontologies

    Science.gov (United States)

    Peso, Jesús del; de Arriaga, Fernando

    2008-05-01

    In recent years a new generation of Intelligent E-Learning Systems (ILS) has emerged with enhanced functionality due, mainly, to influences from Distributed Artificial Intelligence, the use of cognitive modelling, the extensive use of the Internet, and new educational ideas such as student-centered education and Knowledge Management. The automatic construction of ontologies provides a means of automatically updating the knowledge bases of the respective ILS and of increasing their interoperability and communication by sharing the same ontology. The paper presents a new approach able to produce ontologies from a small number of documents, such as those obtained from the Internet, without the assistance of large corpora, by using simple syntactic rules and some semantic information. The method is independent of the natural language used. The use of a multi-agent system increases the flexibility and capability of the method. Although the method can still be improved, the results obtained so far are promising.

  16. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    Science.gov (United States)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting analyzed results: virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze the differences in temporality and quality of the geospatial resources, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gap between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  17. Automatic identification of artifacts in electrodermal activity data.

    Science.gov (United States)

    Taylor, Sara; Jaques, Natasha; Chen, Weixuan; Fedor, Szymon; Sano, Akane; Picard, Rosalind

    2015-01-01

    Recently, wearable devices have allowed for long term, ambulatory measurement of electrodermal activity (EDA). Despite the fact that ambulatory recording can be noisy, and recording artifacts can easily be mistaken for a physiological response during analysis, to date there is no automatic method for detecting artifacts. This paper describes the development of a machine learning algorithm for automatically detecting EDA artifacts, and provides an empirical evaluation of classification performance. We have encoded our results into a freely available web-based tool for artifact and peak detection.
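The paper trains a machine-learning classifier on labeled EDA segments. As the simplest possible stand-in, the sketch below flags fixed-length windows containing abrupt amplitude jumps, which is how motion artifacts often show up in ambulatory recordings; the window length and threshold are invented illustrative values, not the paper's features.

```python
import math

def flag_artifacts(signal, win=20, jump_thresh=0.5):
    """Flag windows containing abrupt sample-to-sample jumps (a crude
    stand-in for the paper's trained classifier; thresholds are assumed)."""
    flags = []
    for start in range(0, len(signal) - win + 1, win):
        w = signal[start:start + win]
        max_jump = max(abs(b - a) for a, b in zip(w, w[1:]))
        flags.append(max_jump > jump_thresh)
    return flags

# Smooth synthetic EDA-like signal with one injected motion artifact
sig = [2.0 + 0.3 * math.sin(i / 15.0) for i in range(100)]
sig[45] += 3.0  # sudden spike, e.g. electrode movement
print(flag_artifacts(sig))  # only the window containing the spike is flagged
```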

  18. Learning Semantic Concepts from Noisy Media Collection for Automatic Image Annotation

    Institute of Scientific and Technical Information of China (English)

    TIAN Feng; SHEN Xukun

    2015-01-01

    Along with the explosive growth of images, automatic image annotation has attracted great interest from various research communities. However, despite the great progress achieved in the past two decades, automatic annotation is still an important open problem in computer vision and can hardly achieve satisfactory performance in real-world environments. In this paper, we address the problem of annotation when noise is interfering with the dataset. A semantic neighborhood learning model on noisy media collections is proposed. Missing labels are replenished, and a semantically balanced neighborhood is constructed. The model allows the integration of multi-label metric learning and local nonnegative sparse coding. We construct a semantically consistent neighborhood for each sample, so that corresponding neighbors have higher global similarity, partial correlation and conceptual similarity along with semantic balance. Meanwhile, an iterative denoising method is also proposed. The proposed method shows a marked improvement over the current state of the art.
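A tiny illustration of neighborhood-based label denoising is given below, on invented toy data; the paper's model couples multi-label metric learning with local sparse coding, which is far richer than this majority-vote sketch.

```python
from collections import Counter

def denoise_labels(points, labels, k=2):
    """Replace each sample's label by the majority label of its k nearest
    neighbours (Euclidean); ties keep the original label. A toy version of
    neighborhood-based label denoising."""
    n = len(points)
    out = []
    for i in range(n):
        order = sorted(range(n),
                       key=lambda j: sum((a - b) ** 2
                                         for a, b in zip(points[i], points[j])))
        near = [labels[j] for j in order[1:k + 1]]  # skip the point itself
        top, votes = Counter(near).most_common(1)[0]
        out.append(top if votes > k / 2 else labels[i])
    return out

# Two tight clusters; the third point carries a noisy label that gets fixed
points = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
labels = [0, 0, 1, 1, 1, 1]
print(denoise_labels(points, labels))  # [0, 0, 0, 1, 1, 1] expected
```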

  19. Automatic Weighing System for Kitchen Waste%餐厨垃圾自动称重系统的研究

    Institute of Scientific and Technical Information of China (English)

    席本强; 武洪岩; 谢军波

    2012-01-01

    In current food-waste collection and transportation work, costs cannot be calculated automatically from each polluter's amount of food waste. An automatic weighing system, installed on the collection vehicles and built around a PIC microcontroller, is therefore designed, combining an RFID module with an automatic weighing module. Data are stored on an IC card via the I2C bus; when the job is finished the IC card is submitted, and a computer completes the statistics of each polluter's waste amount. Test results show that the system can measure kitchen waste accurately and achieve the goal of automatic weighing.

  20. Automatic Thermal Control System with Temperature Difference or Derivation Feedback

    Directory of Open Access Journals (Sweden)

    Darina Matiskova

    2016-02-01

    Full Text Available Automatic thermal control systems are non-linear systems with thermal inertia and time delay. The controller is also non-linear because its information and power signals are limited. The application of methods available for non-linear systems, together with computer simulation and mathematical modelling, makes it possible to acquire important information about the system under study. This paper provides a new look at the heated-system model and also designs the structure of a thermal system with temperature-derivative feedback. The designed system was simulated using special software in Turbo Pascal. Time responses of this system are compared to the responses of a conventional thermal system. The thermal system with temperature-derivative feedback provides better transients, better quality of regulation and better dynamical properties.

  1. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.

  2. Automatic optical inspection system design for golf ball

    Science.gov (United States)

    Wu, Hsien-Huang; Su, Jyun-Wei; Chen, Chih-Lin

    2016-09-01

    With the growing popularity of golf all over the world, the quantities of relevant products are increasing year by year. To create innovation and improve quality while reducing production cost, automation of manufacturing has become a necessary and important issue. This paper reflects this trend toward production automation. It uses AOI (Automated Optical Inspection) technology to develop a system which can automatically detect defects on the golf ball. The current manual quality inspection is not only error-prone but also very manpower-demanding. Taking into consideration the competition in this industry in the near future, the development of related AOI equipment must be conducted as soon as possible. Due to the strongly reflective property of the ball surface, as well as its surface dimples and subtle flaws, it is very difficult to take good-quality images for automatic inspection. Based on the surface properties and shape of the ball, lighting has been properly designed for the image-taking environment and structure. Area-scan cameras have been used to acquire images with good contrast between defects and background to assure the achievement of the goal of automatic defect detection on the golf ball. The result obtained is that more than 97% of the NG (no-good) balls are detected, and the system maintains a false-alarm rate of less than 10%. Balls which are determined by the system to be NG are inspected by human eye again. Therefore, the manpower spent on inspection has been reduced by 90%.
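As a schematic of the defect-detection idea only: real AOI pipelines use calibrated lighting and far more robust image processing, whereas the global-mean threshold below is a deliberately naive assumption.

```python
def detect_defects(img, thresh=0.2):
    """Flag pixels deviating from the global mean intensity by more than
    `thresh` (a toy stand-in for a lighting-tuned AOI pipeline)."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [(r, c) for r, row in enumerate(img)
            for c, p in enumerate(row) if abs(p - mean) > thresh]

# Uniformly bright "ball surface" with one dark flaw pixel
img = [[0.9] * 5 for _ in range(5)]
img[2][3] = 0.2
print(detect_defects(img))  # [(2, 3)] expected
```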

  3. Modeling and Prototyping of Automatic Clutch System for Light Vehicles

    Science.gov (United States)

    Murali, S.; Jothi Prakash, V. M.; Vishal, S.

    2017-03-01

    Nowadays, recycling or regenerating waste into something useful is appreciated all around the globe, as it reduces greenhouse-gas emissions that contribute to global climate change. This study deals with the provision of an automatic clutch mechanism in vehicles to facilitate the smooth changing of gears. It proposes to use the exhaust gases from the turbocharger, which are normally expelled as waste, to actuate the clutch mechanism. At present, clutches in four-wheelers are operated automatically by using an air compressor. In this study, a conceptual design is proposed in which the clutch is operated by the exhaust gas from the turbocharger, removing the air compressor used in the existing system. With this system, the use of an air compressor is eliminated and drivers need not operate the clutch manually. This work involves the development, analysis and validation of the conceptual design through simulation software. The developed conceptual design of an automatic pneumatic clutch system is then tested with a prototype.

  4. 14 CFR 25.904 - Automatic takeoff thrust control system (ATTCS).

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic takeoff thrust control system... Automatic takeoff thrust control system (ATTCS). Each applicant seeking approval for installation of an engine power control system that automatically resets the power or thrust on the operating engine(s)...

  5. Automated data collection for macromolecular crystallography.

    Science.gov (United States)

    Winter, Graeme; McAuley, Katherine E

    2011-09-01

    An overview, together with some practical advice, is presented of the current status of the automation of macromolecular crystallography (MX) data collection, with a focus on MX beamlines at Diamond Light Source, UK.

  6. Observer Manual and Current Data Collection Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Observer Program web page that lists the observer field manual and all current data collection forms that observers are required to take out to sea.

  7. Automatic control systems satisfying certain general criterions on transient behavior

    Science.gov (United States)

    Boksenbom, Aaron S; Hood, Richard

    1952-01-01

    An analytic method for the design of automatic controls is developed that starts from certain arbitrary criterions on the behavior of the controlled system and gives those physically realizable equations that the control system can follow in order to realize this behavior. The criterions used are developed in the form of certain time integrals. General results are shown for systems of second order and of any number of degrees of freedom. Detailed examples for several cases in the control of a turbojet engine are presented.

  8. Automatic control and tracking of periodic orbits in chaotic systems.

    Science.gov (United States)

    Ando, Hiroyasu; Boccaletti, S; Aihara, Kazuyuki

    2007-06-01

    Based on an automatic feedback adjustment of an additional parameter of a dynamical system, we propose a strategy for controlling periodic orbits of desired periods in chaotic dynamics and tracking them toward the set of unstable periodic orbits embedded within the original chaotic attractor. The method does not require information on the system to be controlled, nor on any reference states for the targets, and it overcomes some of the difficulties encountered by other techniques. Assessments of the method's effectiveness and robustness are given by means of the application of the technique to the stabilization of unstable periodic orbits in both discrete- and continuous-time systems.
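For flavor, a classical delayed-feedback stabilization of the unstable period-1 orbit of the logistic map is sketched below. Note this is not the paper's technique, which adjusts an additional parameter automatically and needs no reference state; here the gain k = 0.6 was chosen by hand so that the linearization about the fixed point is stable for r = 3.8, and the initial condition is taken inside the basin of the controlled orbit.

```python
def logistic(x, r=3.8):
    return r * x * (1.0 - x)

def delayed_feedback_run(x0, k=0.6, r=3.8, steps=500):
    """Iterate x_{n+1} = f(x_n) + k*(x_n - x_{n-1}); near the fixed point
    the difference term damps the chaotic instability (for r = 3.8 the
    closed-loop eigenvalues have modulus sqrt(0.6) < 1 with k = 0.6)."""
    x_prev, x = x0, x0
    for _ in range(steps):
        x_prev, x = x, logistic(x, r) + k * (x - x_prev)
    return x

x_star = 1.0 - 1.0 / 3.8          # unstable period-1 orbit of the bare map
x_final = delayed_feedback_run(0.7)
print(round(x_final, 6))
```

At convergence the feedback term k*(x_n - x_{n-1}) vanishes, so the stabilized orbit is an orbit of the original, uncontrolled map, the same non-invasiveness property the paper's tracking strategy relies on.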

  9. Laplace domain automatic data assimilation of contaminant transport using a Wireless Sensor Network

    Science.gov (United States)

    Barnhart, K.; Illangasekare, T. H.

    2011-12-01

    Emerging in situ sensors and distributed network technologies have the potential to monitor dynamic hydrological and environmental processes more effectively than traditional monitoring and data acquisition techniques by sampling at greater spatial and temporal resolutions. In particular, Wireless Sensor Networks, the combination of low-power telemetry and energy-harvesting with miniaturized sensors, could play a large role in monitoring the environment on nature's time scale. Since sensor networks supply data with little or no delay, applications exist where automatic or real-time assimilation of this data would be useful, for example during smart remediation procedures where tracking of the plume response will reinforce real-time decisions. As a foray into this new data context, we consider the estimation of hydraulic conductivity when incorporating subsurface plume concentration data. Current practice optimizes the model in the time domain, which is often slow and overly sensitive to data anomalies. Instead, we perform model inversion in Laplace space and are able to do so because data gathered using new technologies can be sampled densely in time. An intermediate-scale synthetic aquifer is used to illustrate the developed technique. Data collection and model (re-)optimization are automatic. Electric conductivity values of passing sodium bromide plumes are sent through a wireless sensor network, stored in a database, scrubbed and passed to a modeling server which transforms the data and assimilates it into a Laplace domain model. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000
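The key enabler noted above is that densely sampled sensor data make a direct numerical Laplace transform of the record feasible. A minimal sketch of that transform follows, checked against a known analytic pair; the actual assimilation and model inversion in the study are far beyond this fragment.

```python
import math

def laplace_numeric(samples, dt, s):
    """Approximate F(s) = integral of f(t)*exp(-s*t) dt by the trapezoidal
    rule over a densely sampled record f(0), f(dt), f(2*dt), ..."""
    total = 0.0
    last = len(samples) - 1
    for i, f in enumerate(samples):
        w = 0.5 if i in (0, last) else 1.0
        total += w * f * math.exp(-s * i * dt)
    return total * dt

# Check against the analytic pair f(t) = exp(-a*t)  <->  F(s) = 1/(s + a)
a, dt = 2.0, 0.001
samples = [math.exp(-a * i * dt) for i in range(20000)]  # t in [0, 20)
for s in (1.0, 3.0, 5.0):
    print(abs(laplace_numeric(samples, dt, s) - 1.0 / (s + a)) < 1e-4)  # True
```

The decaying exponential weight is also why dense, gap-free sampling matters: coarse or intermittent records would dominate the quadrature error exactly where the kernel is largest.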

  10. Collective dynamics of multicellular systems

    Indian Academy of Sciences (India)

    R Maithreye; C Suguna; Somdatta Sinha

    2011-11-01

    We have studied the collective behaviour of a one-dimensional ring of cells for conditions when the individual uncoupled cells show stable, bistable and oscillatory dynamics. We show that the global dynamics of this model multicellular system depends on the system size, coupling strength and the intrinsic dynamics of the cells. The intrinsic variability in dynamics of the constituent cells are suppressed to stable dynamics, or modified to intermittency under different conditions. This simple model study reveals that cell–cell communication, system size and intrinsic cellular dynamics can lead to evolution of collective dynamics in structured multicellular biological systems that is significantly different from its constituent single-cell behaviour.
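The interplay of system size, coupling strength and intrinsic dynamics can be toyed with in a few lines using diffusively coupled logistic maps on a ring; this is an illustrative stand-in for the paper's cell model, and the parameter values are assumptions.

```python
def ring_step(state, r=3.5, eps=0.3):
    """One synchronous update of diffusively coupled logistic cells on a ring:
    x_i <- (1-eps)*f(x_i) + (eps/2)*(f(x_left) + f(x_right))."""
    n = len(state)
    f = [r * x * (1.0 - x) for x in state]
    return [(1.0 - eps) * f[i] + 0.5 * eps * (f[(i - 1) % n] + f[(i + 1) % n])
            for i in range(n)]

# A ring of 6 cells started near-uniformly; at r = 3.5 each cell is
# intrinsically periodic, and the coupling pulls the ring into synchrony
state = [0.30 + 0.01 * i for i in range(6)]
for _ in range(500):
    state = ring_step(state)
spread = max(state) - min(state)
print(spread)  # collapses toward 0 as the ring synchronizes
```

Varying r (stable vs. chaotic cells), eps and the ring size in this sketch reproduces, in miniature, the kind of size- and coupling-dependent collective behaviour the paper analyzes.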

  11. Design of Management Information System for Metro Automatic Fare Collection%自动售检票系统中车站信息管理系统的研究与设计

    Institute of Scientific and Technical Information of China (English)

    徐炜炜; 徐骏善; 叶飞

    2012-01-01

    This paper introduces the structure and key features of the AFC (automatic fare collection) station management information system (AS-MIS), which is designed in a three-tier structure of presentation, business-logic and data-transmission layers, and provides functions such as ticketing management, ticket/cash box inventory management, operation management, system maintenance and communication services. In the design, CORBA (Common Object Request Broker Architecture) middleware technology is adopted to ensure good performance of the system across software and hardware platforms and to facilitate the developers' division of labour; database technology is adopted, and a general entity-relationship (E-R) diagram of the system is given.

  12. Automatic procedure for quasi-real time seismic data processing at Campi Flegrei (Italy)

    Science.gov (United States)

    Capuano, Paolo; Ciaramella, Angelo; De Lauro, Enza; De Martino, Salvatore; Falanga, Mariarosaria; Petrosino, Simona

    2014-05-01

    The accuracy of automatic procedures for detecting seismic events and locating their sources is influenced by several factors, such as errors in picking seismic phases often buried in high-level ambient noise, network geometry and modelling errors. A fundamental objective is the improvement of these procedures by developing accurate algorithms for quasi-real-time seismic data processing that are easily managed in observatory practice. Recently, a robust automatic procedure was implemented for detecting, onset picking and identifying signal phases in continuous seismic signals, with an application to the seismicity recorded at the Campi Flegrei Caldera (Italy) during the 2006 ground uplift (Ciaramella et al. 2011). An Independent Component Analysis based approach for the Blind Source Separation of convolutive mixtures (CICA) was adopted to obtain a clear separation of low-energy Long Period events (LPs) from the high-level ambient noise, allowing a complete seismic catalogue to be compiled and the seismic energy release to be better quantified. In this work, we apply CICA to the seismic signal recorded continuously during the entire year 2006 at Campi Flegrei. First, we performed tests on synthetic data in order to improve the reliability and accuracy of the procedure. The performance test using very noisy synthetic data shows that the method works even in the case of very poor quality data characterized by a very low signal-to-noise ratio (SNR). Second, we improved the CICA automatic procedure by recovering information on the amplitudes of the extracted independent components. This is crucial for further analysis, starting from a prompt estimate of the magnitude/energy of the highlighted events. Data used for the present analysis were collected by four broadband three-component seismic stations (ASB2, AMS2, TAGG, BGNG) belonging to the Campi Flegrei seismic monitoring network, managed by the 'Istituto Nazionale di Geofisica e Vulcanologia-Osservatorio Vesuviano (INGV-OV)' (see for

  13. Application of MintDrive Automatic Precision Positioning System

    Institute of Scientific and Technical Information of China (English)

    Wu Fengming; Yang Yonggang; Zhao Xiaolong; Zhang Zhiyuan

    2004-01-01

    It is very important to locate batteries accurately and quickly during automatic battery production. Unstable or inaccurate location will negatively influence battery consistency, quality and the finished-product rate. A traditional way is to use a sensor to detect and locate batteries directly, but because of the detecting tolerance, setting them on a fixed point exactly is almost impossible. This problem can be completely solved by applying the MintDrive automatic precision servo locating system. First, the operating software WorkBench was used to configure the servo locating driver for optimized control. Then, based on the requirements of the actual location, the locating action was programmed and tested with programming software, and finally all the locating information was uploaded to a MicroLogix 1200 PLC, which controls the running of each station: when to locate, where the location is and how to eliminate bad parts. Because this intelligent servo locating system has the advantages of powerful functionality, simple operation, high control and locating accuracy and easy maintenance, it is very suitable for adoption in automatic battery production lines. It is currently regarded as a very advanced control method for reducing waste material caused by inaccurate location and difficult adjustment.

  14. Pilot control through the TAFCOS automatic flight control system

    Science.gov (United States)

    Wehrend, W. R., Jr.

    1979-01-01

    The set of flight control logic used in a recently completed flight test program to evaluate the total automatic flight control system (TAFCOS), with the controller operating in a fully automatic mode, was used to perform an unmanned simulation on an IBM 360 computer in which the TAFCOS concept was extended to provide a multilevel pilot interface. A pilot-TAFCOS interface for direct pilot control through a velocity-control-wheel-steering mode was defined, as well as a means for calling up conventional autopilot modes. It is concluded that the TAFCOS structure is easily adaptable to the addition of pilot control through a stick-wheel-throttle arrangement similar to conventional airplane controls. Conventional autopilot modes, such as airspeed-hold, altitude-hold, heading-hold, and flight-path-angle-hold, can also be included.

  15. Semi-automatic volumetrics system to parcellate ROI on neocortex

    Science.gov (United States)

    Tan, Ou; Ichimiya, Tetsuya; Yasuno, Fumihiko; Suhara, Tetsuya

    2002-05-01

    A template-based, semi-automatic volumetrics system, BrainVol, is built to divide any given patient brain into neocortical and subcortical regions. The standard regions are given as standard ROIs drawn on a standard brain volume. After normalization between the standard MR image and the patient MR image, the subcortical ROI boundaries are refined based on gray matter. The neocortical ROIs are refined using sulcus information that is semi-automatically marked on the patient brain. The segmentation is then applied to the 4D PET image of the same patient to calculate the TAC (Time Activity Curve), using co-registration between MR and PET.

  16. Real-time data collection of scour at bridges

    Science.gov (United States)

    Mueller, David S.; Landers, Mark N.

    1994-01-01

    The record flood on the Mississippi River during the summer of 1993 provided a rare opportunity to collect data on scour of the streambed at bridges and to test data collection equipment under extreme hydraulic conditions. Detailed bathymetric and hydraulic information were collected at two bridges crossing the Mississippi River during the rising limb, near the peak, and during the recession of the flood. Bathymetric data were collected using a digital echo sounder. Three-dimensional velocities were collected using Broadband Acoustic Doppler Current Profilers (BB-ADCP) operating at 300 kilohertz (kHz), 600 kHz, and 1,200 kHz. Positioning of the data collected was measured using a range-azimuth tracking system and two global positioning systems (GPS). Although differential GPS was able to provide accurate positions and tracking information during approach- and exit-reach data collection, it was unable to maintain lock on a sufficient number of satellites when the survey vessel was under the bridge or near the piers. The range-azimuth tracking system was used to collect position and tracking information for detailed data collection near the bridge piers. These detailed data indicated local scour ranging from 3 to 8 meters and will permit a field-based evaluation of the ability of various numerical models to compute the hydraulics, depth, geometry, and time-dependent development of local scour.

  17. MCTSSA Software Reliability Handbook, Volume II: Data Collection Demonstration and Software Reliability Modeling for a Multi-Function Distributed System

    OpenAIRE

    Schneidewind, Norman F.

    1997-01-01

    The purpose of this handbook is threefold. Specifically, it: Serves as a reference guide for implementing standard software reliability practices at Marine Corps Tactical Systems Support Activity and aids in applying the software reliability model; Serves as a tool for managing the software reliability program; and Serves as a training aid. U.S. Marine Corps Tactical Systems Support Activity, Camp Pendleton, CA. RLACH

  18. Proportional directional valve based automatic steering system for tractors

    Institute of Scientific and Technical Information of China (English)

    Jin-yi LIU; Jing-quan TAN; En-rong MAO; Zheng-he SONG; Zhong-xiang ZHU‡

    2016-01-01

    Most automatic steering systems for large tractors are designed with hydraulic systems that run on either constant flow or constant pressure. Such designs are limited in adaptability and applicability. Moreover, their control valves can unload in the neutral position and eventually lead to serious hydraulic leakage over long operation periods. In response to these problems, a multifunctional automatic hydraulic steering circuit is presented. The design is composed of a 5-way-3-position proportional directional valve, two pilot-controlled check valves, a pressure-compensated directional valve, a pressure-compensated flow regulator valve, a load shuttle valve, and a check valve, among other components. It is adaptable to most open-center systems with constant flow supply and closed-center systems with load feedback. The design maintains the lowest pressure under load feedback and stays in the neutral position during unloading, thus meeting the requirements for steering. The steering controller is based on a proportional-integral-derivative (PID) algorithm running on a 51-series microcontroller master control chip. An experimental platform is developed to establish the basic characteristics of the system subject to stepwise inputs and sinusoidal tracking. Test results show that the design demonstrates excellent control accuracy, fast response, and negligible leakage during long operation periods.
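    The abstract names a PID steering controller without giving its form. As a generic illustration (not the paper's tuned controller), a minimal discrete PID loop driving a hypothetical first-order plant might look like:

```python
# Minimal discrete PID loop on a hypothetical first-order steering plant.
# Gains, plant time constant, and sample time are invented for illustration.
dt = 0.01                       # control period [s]
kp, ki, kd = 8.0, 4.0, 0.2      # PID gains (hypothetical)
tau = 0.5                       # plant time constant [s]
setpoint = 1.0                  # target steering angle [rad]

angle = 0.0                     # plant state
integral = 0.0
prev_err = setpoint - angle
history = []

for _ in range(2000):           # 20 s of simulated time
    err = setpoint - angle
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv
    prev_err = err
    # first-order plant model: d(angle)/dt = (u - angle) / tau
    angle += dt * (u - angle) / tau
    history.append(angle)
```

    With these (overdamped) gains the simulated angle settles at the setpoint; a real valve loop would be tuned against measured system dynamics and include actuator saturation.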

  19. Complete automatic target cuer/recognition system for tactical forward-looking infrared images

    Science.gov (United States)

    Ernisse, Brian E.; Rogers, Steven K.; DeSimio, Martin P.; Raines, Richard A.

    1997-09-01

    A complete forward-looking IR (FLIR) automatic target cuer/recognizer (ATC/R) is presented. The data used for development and testing of this ATC/R are first-generation FLIR images collected using an F-15E. The database contains thousands of images with various mission profiles and target arrangements. The specific target of interest is a mobile missile launcher, the primary target. The goal is to locate all vehicles (secondary targets) within a scene and identify the primary targets. The system developed and tested includes an image segmenter, region cluster algorithm, feature extractor, and classifier. Conventional image processing algorithms are used in conjunction with neural network techniques to form a complete ATC/R system. The conventional techniques include hit/miss filtering, difference-of-Gaussian filtering, and region clustering. A neural network (multilayer perceptron) is used for classification. These algorithms were developed, tested, and then combined into a functional ATC/R system. The overall primary target detection rate (cuer) is 84%, with a 69% primary target identification (recognizer) rate at ranges relevant to munitions release. Furthermore, the false alarm rate (a nontarget cued as a target) is only 2.3 per scene. The research is being completed with a 10-flight test profile using third-generation FLIR images.

  20. Design of automatic thruster assisted mooring systems for ships

    Directory of Open Access Journals (Sweden)

    Jan P. Strand

    1998-04-01

    This paper addresses the mathematical modelling and controller design of an automatic thruster-assisted position mooring system. Such control systems are applied to anchored floating production, offloading and storage vessels and semi-submersibles. The controller is designed using model-based control, with an LQG feedback controller in conjunction with a Kalman filter. In addition to the environmental loads, the controller design accounts for the mooring forces acting on the vessel. This is reflected in the model structure and in the inclusion of new functionality.

  1. Automatic System for Serving and Deploying Products into Advertising Space

    OpenAIRE

    Lepen, Nejc

    2014-01-01

    The purpose of the thesis is to present the problems of deploying and serving products into advertising space, encountered daily by online marketers, planners and leaseholders of advertising spaces. The aim of the thesis is to solve the problem in question with the help of a novel web application. Therefore, we have designed an automatic system, which consists of three key components: an online store, a surveillance system and websites accommodating advertising space. In the course of this thesis, we h...

  2. A low-power and miniaturized electrocardiograph data collection system with smart textile electrodes for monitoring of cardiac function.

    Science.gov (United States)

    Dai, Ming; Xiao, Xueliang; Chen, Xin; Lin, Haoming; Wu, Wanqing; Chen, Siping

    2016-12-01

    With the increasing aging population as well as growing health concerns, chronic heart disease has become a focus of public attention. A comfortable, low-power, wearable electrocardiogram (ECG) system for continuously monitoring elderly patients' ECG signals over several hours is important for preventing cardiovascular disease. Traditional ECG monitoring apparatus is often inconvenient to carry, requires many electrodes attached to the chest, and has high power consumption. It is a challenge to design an electrocardiograph that satisfies requirements such as comfort, confinement, and compactness. Based on these considerations, this study presents a biosensor acquisition system for wearable, ubiquitous healthcare applications using three textile electrodes and a recording circuit specialized for ECG monitoring. In addition, several methods were adopted to reduce the power consumption of the device. The proposed system is composed of three parts: (1) an ECG analog front end (AFE), (2) digital signal processing and micro-control circuits, and (3) system software. Digital filter methods were used to eliminate baseline wander, skin contact noise, and other interfering signals. A comparative study was conducted using this system to observe its performance against two commercial Holter monitors. The experimental results demonstrated that the total power consumption of the proposed system in a full round of ECG acquisition was only 29.74 mW. In addition, this low-power system performed well and stably measured the heart rate with an accuracy of 98.55%. It also provides a real-time dynamic display on an organic light-emitting diode (OLED) screen and wirelessly transmits information via a Bluetooth 4.0 module.
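    The baseline-wander removal mentioned above can be sketched generically. The moving-average high-pass below, with invented signal parameters, is one simple approach and not necessarily the paper's filter: the slow drift is estimated by a ~0.8 s moving average and subtracted, leaving the faster ECG-band content largely intact.

```python
import numpy as np

fs = 250                                   # sampling rate [Hz] (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
clean = 0.1 * np.sin(2 * np.pi * 8.0 * t)  # stand-in for ECG-band content
drift = 0.5 * np.sin(2 * np.pi * 0.2 * t)  # slow baseline wander
x = clean + drift                          # "recorded" signal

# High-pass by subtracting a moving-average estimate of the baseline.
win = int(0.8 * fs)                        # ~0.8 s averaging window
baseline = np.convolve(x, np.ones(win) / win, mode="same")
y = x - baseline                           # baseline-corrected signal

mid = slice(win, len(x) - win)             # ignore edge effects of "same" mode
rms_before = np.sqrt(np.mean((x[mid] - clean[mid]) ** 2))
rms_after = np.sqrt(np.mean((y[mid] - clean[mid]) ** 2))
```

    The moving average passes the 0.2 Hz drift almost unchanged while strongly attenuating the 8 Hz content, so subtracting it removes most of the wander.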

  3. Development of a digital signal processor-based new 12-lead synchronization electrocardiogram automatic analysis system.

    Science.gov (United States)

    Yang, Yuxing; Yin, Dongyuan; Freyer, Richard

    2002-07-01

    This paper presents a new digital signal processor (DSP)-based multichannel electrocardiogram (ECG) system for automatic analysis of 12-lead synchronized ECG in real time, with a high sampling rate of 1000 Hz at 12-bit precision. The double-CPU hardware structure, based on the 89C55 microprocessor (MPU) and the TMS320F206 DSP, combines the MPU's powerful control capability with the DSP's fast computation. By fully utilizing both CPUs, the system allocates CPU time appropriately among the real-time tasks of multichannel synchronized ECG sampling, digital filtering, data storage, automatic waveform analysis, and printing at a high sampling rate. The digital ECG system has the advantages of simple structure, high-speed and high-precision sampling, powerful real-time processing ability, and good quality. The paper discusses the system's principle and hardware design, and also presents the ECG processing based on a fast, simple integer-coefficient filter method, together with automatic calculation algorithms for ECG parameters such as heart rate, P-R interval, Q-T interval, and the deflection angle of the ECG axis. The system has been successfully tested and used in an ECG automatic analysis instrument.
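    As a hedged illustration of the heart-rate calculation the abstract mentions, the sketch below detects R peaks in an idealized spike train with a simple threshold-plus-refractory rule and derives heart rate from the R-R intervals; the real system's integer-coefficient filters and detection logic are more elaborate, and all parameters here are invented.

```python
import numpy as np

fs = 360                                     # sampling rate [Hz] (assumed)
duration = 30.0
t = np.arange(0.0, duration, 1.0 / fs)
ecg = np.zeros_like(t)
beat_times = np.arange(0.5, duration, 0.8)   # one beat every 0.8 s -> 75 bpm
ecg[(beat_times * fs).astype(int)] = 1.0     # idealized R-wave spikes

# Threshold detector with a 0.3 s refractory period.
thresh, refractory = 0.5, int(0.3 * fs)
peaks, i = [], 0
while i < len(ecg):
    if ecg[i] > thresh:
        peaks.append(i)
        i += refractory                      # skip ahead; no double-counting
    else:
        i += 1

rr = np.diff(peaks) / fs                     # R-R intervals [s]
heart_rate = 60.0 / rr.mean()                # beats per minute
```

    On real ECG the spikes would first be emphasized by band-pass filtering and the threshold adapted to the signal level.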

  4. 30 CFR 75.1103-3 - Automatic fire sensor and warning device systems; minimum requirements; general.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensor and warning device...-UNDERGROUND COAL MINES Fire Protection § 75.1103-3 Automatic fire sensor and warning device systems; minimum requirements; general. Automatic fire sensor and warning device systems installed in belt haulageways...

  5. 30 CFR 75.1103-6 - Automatic fire sensors; actuation of fire suppression systems.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensors; actuation of fire... Protection § 75.1103-6 Automatic fire sensors; actuation of fire suppression systems. Point-type heat sensors or automatic fire sensor and warning device systems may be used to actuate deluge-type water...

  6. Research on HJ-1A/B satellite data automatic geometric precision correction design

    Institute of Scientific and Technical Information of China (English)

    Xiong Wencheng; Shen Wenming; Wang Qiao; Shi Yuanli; Xiao Rulin; Fu Zhuo

    2014-01-01

    Developed independently by China, the HJ-1A/B satellites have operated well on orbit for five years and acquired a large amount of high-quality observation data. Geometric precision correction of these observation data is of great significance for macroscopic and dynamic ecological environment monitoring. The paper analyzed the parameter characteristics of the HJ-1 satellites and the geometric features of HJ-1 level 2 data (systematically geo-corrected data). On this basis, the overall HJ-1 multi-sensor geometric correction flow and a charge-coupled device (CCD) automatic geometric precision correction method were designed. Actual operating data showed that the method achieves good results for automatic geometric precision correction of HJ-1 satellite data: automatic HJ-1 CCD image geometric precision correction accuracy is within two pixels, and automatic matching accuracy between images from the same satellite is better than one pixel.
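    Geometric precision correction of this kind ultimately fits a mapping from image to ground coordinates using control points. As a simplified, hypothetical sketch (a plain affine model rather than the paper's full sensor model; all coordinates invented), ground control points can be fitted by least squares:

```python
import numpy as np

# Hypothetical ground control points: pixel (col, row) -> map (E, N) in metres.
px = np.array([[10.0, 12.0], [500.0, 30.0], [40.0, 480.0],
               [510.0, 500.0], [260.0, 250.0]])
true_A = np.array([[30.0, 0.5],
                   [-0.4, -30.0]])        # invented pixel->metre affine part
true_t = np.array([500000.0, 4100000.0])  # invented map-frame offset
gcp = px @ true_A + true_t                # "surveyed" map coordinates

# Fit map = [px | 1] @ P by least squares (P stacks the affine part and offset).
X = np.hstack([px, np.ones((len(px), 1))])
P, *_ = np.linalg.lstsq(X, gcp, rcond=None)

# Correct an arbitrary pixel with the fitted transform.
pix = np.array([123.0, 321.0])
mapped = np.append(pix, 1.0) @ P
expected = pix @ true_A + true_t
```

    With noisy tie points the same least-squares fit averages out matching errors; higher-order polynomial or rigorous sensor models replace the affine term in production pipelines.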

  7. Automatic Emboli Detection System for the Artificial Heart

    Science.gov (United States)

    Steifer, T.; Lewandowski, M.; Karwat, P.; Gawlikowski, M.

    In spite of the progress in material engineering and ventricular assist device construction, thromboembolism remains the most crucial problem in mechanical heart supporting systems. Therefore, the ability to monitor the patient's blood for clot formation should be considered an important factor in the development of heart supporting systems. The well-known methods for automatic embolus detection are based on the monitoring of the ultrasound Doppler signal. A working system utilizing ultrasound Doppler is being developed for the purpose of flow estimation and emboli detection in the clinical artificial heart ReligaHeart EXT. The system will be based on the existing dual-channel multi-gate Doppler device with RF digital processing. A specially developed clamp-on cannula probe, equipped with 2 - 4 MHz piezoceramic transducers, enables easy system setup. We present the issues related to the development of automatic emboli detection via Doppler measurements. We consider several algorithms for the flow estimation and emboli detection. We discuss their efficiency and confront them with the requirements of our experimental setup. Theoretical considerations are then met with preliminary experimental findings from a) flow studies with blood-mimicking fluid and b) in-vitro flow studies with animal blood. Finally, we discuss some more methodological issues - we consider several possible approaches to the problem of verification of the accuracy of the detection system.

  8. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    events. Due to great variation in events, this method often fails to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were...

  9. Automatic and controlled processing in the corticocerebellar system.

    Science.gov (United States)

    Ramnani, Narender

    2014-01-01

    During learning, performance changes often involve a transition from controlled processing in which performance is flexible and responsive to ongoing error feedback, but effortful and slow, to a state in which processing becomes swift and automatic. In this state, performance is unencumbered by the requirement to process feedback, but its insensitivity to feedback reduces its flexibility. Many properties of automatic processing are similar to those that one would expect of forward models, and many have suggested that these may be instantiated in cerebellar circuitry. Since hierarchically organized frontal lobe areas can both send and receive commands, I discuss the possibility that they can act both as controllers and controlled objects and that their behaviors can be independently modeled by forward models in cerebellar circuits. Since areas of the prefrontal cortex contribute to this hierarchically organized system and send outputs to the cerebellar cortex, I suggest that the cerebellum is likely to contribute to the automation of cognitive skills, and to the formation of habitual behavior which is resistant to error feedback. An important prerequisite to these ideas is that cerebellar circuitry should have access to higher order error feedback that signals the success or failure of cognitive processing. I have discussed the pathways through which such feedback could arrive via the inferior olive and the dopamine system. Cerebellar outputs inhibit both the inferior olive and the dopamine system. It is possible that learned representations in the cerebellum use this as a mechanism to suppress the processing of feedback in other parts of the nervous system. Thus, cerebellar processes that control automatic performance may be completed without triggering the engagement of controlled processes by prefrontal mechanisms.

  10. Automatic Feature Detection, Description and Matching from Mobile Laser Scanning Data and Aerial Imagery

    Science.gov (United States)

    Hussnain, Zille; Oude Elberink, Sander; Vosselman, George

    2016-06-01

    In mobile laser scanning systems, the platform's position is measured by GNSS and IMU, which is often not reliable in urban areas. Consequently, the derived Mobile Laser Scanning Point Cloud (MLSPC) lacks the expected positioning reliability and accuracy. Many current solutions are either semi-automatic or unable to achieve pixel-level accuracy. We propose an automatic feature extraction method which utilizes corresponding aerial images as a reference data set. The proposed method comprises three steps: image feature detection, description, and matching between corresponding patches of nadir aerial and MLSPC ortho images. In the data pre-processing step, the MLSPC is patch-wise cropped and converted to ortho images, and each aerial image patch covering the area of the corresponding MLSPC patch is cropped from the aerial image. For feature detection, we implemented an adaptive variant of the Harris operator to automatically detect corner feature points on the vertices of road markings. In the feature description phase, we used the LATCH binary descriptor, which is robust to data from different sensors. For descriptor matching, we developed an outlier filtering technique which exploits the arrangements of relative Euclidean distances and angles between corresponding sets of feature points. We found that the positioning accuracy of the computed correspondences achieves pixel-level accuracy, where the image resolution is 12 cm. Furthermore, the developed approach is reliable when enough road markings are available in the data sets. We conclude that, in urban areas, the developed approach can reliably extract the features necessary to improve the MLSPC accuracy to pixel level.
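    The Harris operator used above for corner detection can be sketched compactly. The NumPy demo below (synthetic image, crude box smoothing, and invented constants rather than the paper's adaptive variant) computes the structure tensor and the Harris response, whose maxima land on corners:

```python
import numpy as np

# Synthetic image: bright square on dark background -> corners at its vertices.
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0

Iy, Ix = np.gradient(img)                 # image gradients (central differences)

def box(a, r=2):
    """Crude (2r+1)x(2r+1) box smoothing via shifted sums (wrap-around edges)."""
    out = np.zeros_like(a)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out

# Smoothed structure tensor and Harris response R = det(M) - k * trace(M)^2.
Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
k = 0.04
R = Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

py, qx = np.unravel_index(np.argmax(R), R.shape)
corners = [(20, 20), (20, 43), (43, 20), (43, 43)]
dist = min(abs(py - cy) + abs(qx - cx) for cy, cx in corners)
```

    Along edges only one eigenvalue of the structure tensor is large and R goes negative; at corners both are large and R peaks, which is why the strongest response lands next to a vertex of the square.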

  11. Data Collection System for Laser Power Based on LabVIEW

    Institute of Scientific and Technical Information of China (English)

    薛竣文; 裴雪丹; 苏秉华; 孙鲁; 赵慧元; 苏禹

    2012-01-01

    A data collection system for laser power, based on the LabVIEW graphical design language, is designed to address the long-term power stability testing of diode-pumped all-solid-state lasers. A laser power meter built around a silicon photocell converts the optical power into an electrical quantity. The filter, attenuation glass, and ground glass must be selected carefully so that the photocell works in its unsaturated linear region. An STC89C52RC microcontroller processes the acquired signal, and the computer communicates with the microcontroller over a serial port. The data collection interface is programmed in LabVIEW. The system for monitoring long-term laser power stability was finally designed, debugged, and built.

  12. Automatic concrete cracks detection and mapping of terrestrial laser scan data

    Directory of Open Access Journals (Sweden)

    Mostafa Rabah

    2013-12-01

    The paper presents a method for automatic concrete crack detection and mapping from data obtained during a laser scanning survey. Crack detection and mapping are achieved in three steps: shading correction of the original image, crack detection, and crack mapping and processing. The detected crack is defined in a pixel coordinate system. To remap the crack into the reference coordinate system, a reverse engineering approach is used: a hybrid concept of terrestrial laser scanner point clouds and the corresponding camera image, i.e. a conversion from the pixel coordinate system to the terrestrial laser scanner or global coordinate system. The results of the experiment show that the mean differences between the terrestrial laser scan and the total station are about 30.5, 16.4 and 14.3 mm in the x, y and z directions, respectively.

  13. Physical and biological data collected along the Texas, Mississippi, and Florida Gulf coasts in the Gulf of Mexico as part of the Harmful Algal BloomS Observing System from 19 Aug 1953 to 11 July 2014 (NODC Accession 0120767)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — HABSOS (Harmful Algal BloomS Observing System) is a data collection and distribution system for harmful algal bloom (HAB) information in the Gulf of Mexico. The goal...

  14. Semi-automatic Annotation System for OWL-based Semantic Search

    Directory of Open Access Journals (Sweden)

    C.-H. Liu

    2009-11-01

    Current keyword search by Google, Yahoo, and similar engines returns an enormous number of unsuitable results. One solution is to annotate textual web data with semantics to enable semantic search rather than keyword search. However, purely manual annotation is very time-consuming. Further, searching high-level concepts such as metaphor cannot be done if the annotation is performed at a low abstraction level. We therefore present a semi-automatic annotation system comprising an automatic annotator and a manual annotator. Against the Web Ontology Language (OWL) terms defined in Protégé, the former annotates textual web data using the Knuth-Morris-Pratt (KMP) algorithm, while the latter allows a user to use the terms to annotate metaphors at a high abstraction level. The resulting semantically enhanced textual web document can be semantically processed by other web services, such as the information retrieval system and the recommendation system shown in our example.
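    The KMP algorithm that the automatic annotator relies on can be shown directly. A compact, self-contained implementation (the example strings are invented) is:

```python
def kmp_search(text, pattern):
    """Return all start indices of `pattern` in `text` (Knuth-Morris-Pratt)."""
    if not pattern:
        return []
    # Failure function: length of the longest proper prefix that is also a suffix.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, never re-reading a text character.
    hits, k = [], 0
    for i, c in enumerate(text):
        while k and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

matches = kmp_search("the ship of state sails the sea of state", "state")
```

    The failure function lets the scan run in O(len(text) + len(pattern)), which is what makes KMP attractive for matching many ontology terms against large documents.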

  15. Design of automatic labeling system on the end surfaces of bundles of round steels

    Directory of Open Access Journals (Sweden)

    Fuxiang ZHANG

    2016-12-01

    To achieve automatic labeling on the end surfaces of bundles of round steels in steel plants, an automatic end-surface labeling system is designed on the basis of an analysis of the round steel production process. The system includes a robot visual location unit, a label supply unit, a pressure supply unit, an automatic labeling unit, a laser ranging unit, and a host computer communication control unit. The robot visual location unit provides the round steel center locations, and the automatic labeling unit performs the labeling. The system is tested under laboratory conditions, which shows that it can effectively solve manual labeling problems such as misplaced and missing labels, and realize efficient and stable automatic labeling. The system can be used in steel plants for automatic labeling on the end surfaces of bundles of round steels.

  16. Collection and Management of Shop-floor ControllerData

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    This paper describes shop-floor data collection and management. An architecture is presented for a shop-floor data collection and management system based on an analysis of the features of these data. Two critical aspects of the system are discussed in detail: the various communication protocols between computers and machines, and the real-time demands of the shop-floor controller.

  17. Goal-Oriented Data Collection Framework in Configuration Projects

    DEFF Research Database (Denmark)

    Shafiee, Sara; Hvam, Lars; Kristjansdottir, Katrin

    2015-01-01

    This article proposes a systematic framework for data collection when executing Product Configuration System (PCS) projects. Since data collection is one of the most time-consuming tasks in PCS projects, a systematic framework is needed to handle and manage the large amount of complex data in the early stages of the PCS project. The framework was developed based on the current literature in the field and revised during testing at a case company. The framework has proven to provide a structured approach to data collection, which saved the company both time and money in the initial phases of the PCS project.

  18. Automatic interpretation of magnetic data using Euler deconvolution with nonlinear background

    Digital Repository Service at National Institute of Oceanography (India)

    Dewangan, P.; Ramprasad, T.; Ramana, M.V.; Desa, M.; Shailaja, B.

    The voluminous gravity and magnetic data sets demand automatic interpretation techniques like Naudy, Euler and Werner deconvolution. Of these techniques, the Euler deconvolution has become a popular choice because the method assumes no particular...
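    Euler deconvolution can be illustrated on a synthetic profile. The sketch below (invented source parameters, and analytic derivatives in place of the gradients that would normally be computed from field data) solves the rearranged Euler homogeneity equation by least squares to recover source position and depth:

```python
import numpy as np

# Synthetic anomaly from a pole-like source at x0 = 40 m, depth z0 = 15 m.
# f is homogeneous of degree -N, so Euler's equation holds:
#   (x - x0) df/dx + (z - z0) df/dz = -N f      (observations at z = 0)
x0, z0, N, C = 40.0, 15.0, 2.0, 1.0e4          # invented source parameters
x = np.linspace(0.0, 100.0, 201)               # profile coordinates [m]
r2 = (x - x0) ** 2 + z0 ** 2
f = C * r2 ** (-N / 2)                          # observed field
fx = -N * C * (x - x0) * r2 ** (-N / 2 - 1)    # horizontal derivative
fz = N * C * z0 * r2 ** (-N / 2 - 1)           # vertical derivative at z = 0

# Rearranged Euler equation:  x0 * fx + z0 * fz = x * fx + N * f
A = np.column_stack([fx, fz])
b = x * fx + N * f
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
est_x0, est_z0 = sol
```

    In practice the structural index N is assumed or scanned, the derivatives come from gridded data, and the equations are solved in sliding windows; here the noise-free analytic case recovers the source exactly.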

  19. AROMA: Automatic Generation of Radio Maps for Localization Systems

    CERN Document Server

    Eleryan, Ahmed; Youssef, Moustafa

    2010-01-01

    WLAN localization has become an active research field recently. Due to the wide WLAN deployment, WLAN localization provides ubiquitous coverage and adds to the value of the wireless network by providing the location of its users without using any additional hardware. However, WLAN localization systems usually require constructing a radio map, which is a major barrier of WLAN localization systems' deployment. The radio map stores information about the signal strength from different signal strength streams at selected locations in the site of interest. Typical construction of a radio map involves measurements and calibrations making it a tedious and time-consuming operation. In this paper, we present the AROMA system that automatically constructs accurate active and passive radio maps for both device-based and device-free WLAN localization systems. AROMA has three main goals: high accuracy, low computational requirements, and minimum user overhead. To achieve high accuracy, AROMA uses 3D ray tracing enhanced wi...

  20. Automatic spreader-container alignment system using infrared structured lights.

    Science.gov (United States)

    Liu, Yu; Wang, Yibo; Lv, Jimin; Zhang, Maojun

    2012-06-01

    This paper presents a computer-vision system to assist reach stackers to automatically align the spreader with the target container. By analyzing infrared lines on the top of the container, the proposed system is able to calculate the relative position between the spreader and the container. The invisible structured lights are equipped in this system to enable all-weather operation, which can avoid environmental factors such as shadows and differences in climate. Additionally, the lateral inclination of the spreader is taken into consideration to offer a more accurate alignment than other competing systems. Estimation errors are reduced through approaches including power series and linear regression. The accuracy can be controlled within 2 cm or 2 deg, which meets the requirements of reach stackers' operation.

  1. Automatic diagnosis and control of distributed solid state lighting systems.

    Science.gov (United States)

    Dong, Jianfei; van Driel, Willem; Zhang, Guoqi

    2011-03-28

    This paper describes a new design concept for automatically diagnosing and compensating LED degradations in distributed solid state lighting (SSL) systems. A failed LED may significantly reduce the overall illumination level and destroy the uniform illumination distribution achieved by a nominal system. To our knowledge, an automatic scheme to compensate LED degradations, which requires a diagnostic step followed by control reconfiguration, has not yet been reported in the literature. The main challenge in diagnosing LED degradations lies in the usually poor observability in a distributed SSL system, because the LED light output is usually not individually measured. In this work, we tackle this difficulty by using pulse width modulated (PWM) drive currents with a unique fundamental frequency assigned to each LED. Signal processing methods are applied to estimate the individual illumination flux of each LED. Statistical tests are developed to diagnose the degradation of LEDs. The duty cycle of the drive current signal to each LED is re-optimized once a fault is detected, in order to compensate for the disruption of the uniform illumination pattern caused by the failed LED.
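    The frequency-tagging idea in this abstract can be sketched numerically: if each LED's PWM drive has a unique fundamental frequency, a single mixed photodetector signal can be demodulated at each fundamental to estimate per-LED flux. The demo below (invented frequencies, flux levels, and sampling rate; not the paper's estimator) flags the degraded LED:

```python
import numpy as np

fs = 10000.0                               # photodiode sampling rate [Hz]
t = np.arange(0.0, 1.0, 1.0 / fs)

# Unique PWM fundamental per LED; one photodiode sees the summed light.
freqs = [170.0, 230.0, 310.0]              # Hz (invented assignments)
fluxes = [1.0, 1.0, 0.35]                  # LED 3 degraded to 35% (invented)
signal = sum(a * (np.sin(2 * np.pi * f0 * t) > 0).astype(float)
             for a, f0 in zip(fluxes, freqs))

# A 50%-duty square wave of height A has fundamental amplitude 2A/pi; demodulate
# at each LED's fundamental and invert that relation to estimate its flux.
est = []
for f0 in freqs:
    comp = np.exp(-2j * np.pi * f0 * t)
    amp = 2.0 * abs(np.dot(signal, comp)) / len(t)
    est.append(amp * np.pi / 2.0)

degraded = int(np.argmin(est))             # index of the weakest LED
```

    Choosing fundamentals whose low-order odd harmonics do not collide keeps the cross-talk between channels small, which is the same consideration a real frequency-assignment scheme would face.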

  2. Automatic Intruder Combat System: A way to Smart Border Surveillance

    Directory of Open Access Journals (Sweden)

    Dushyant Kumar Singh

    2016-12-01

    Security and safeguarding of international borders have always been dominant issues for every nation, and a large part of a nation's budget is devoted to its defense systems. Besides wars, illegal intrusion in the form of terrorism is a critical matter that causes severe harm to a nation's property. From India's perspective, border patrolling by the Border Security Force (BSF) has long been practiced for surveillance. The patrolling parties are equipped with high-end surveillance equipment, yet no alternative to deploying huge manpower in harsh environmental conditions has existed. An automatic mechanism for smart surveillance and combat is proposed in this paper as a solution to these problems. Smart surveillance requires automatic intrusion detection in the surveillance video, which is achieved by using optical flow information as motion features for intruders/humans in the scene. The use of optical flow makes the proposed smart surveillance robust and more accurate, and the use of a simple horizontal feature for fence detection keeps the system simple and fast enough to work in real time. The system is also designed to respond to the activities of intruders, one such response being automatic combat.

  3. A Study of Applications of Multiagent System Specifications and the Key Techniques in Automatic Abstracts System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this thesis, multiagent system specifications, multiagent system architectures, agent communication languages and agent communication protocols, and automatic abstracting based on multiagent technologies are studied. Some problems concerning the design and realization of automatic abstracting systems based on multiagent technologies are studied, too. Chapter 1 shows the significance and objectives of the thesis, summarizes its main contents, and presents its innovations. Some basic concepts of agents and multiagent systems are studied in Chapter 2. The definitions of agents and multiagent systems are given, and the theory, technologies and applications of multiagent systems are summarized. Furthermore, some important research trends of multiagent systems are set forward. Multiagent system specifications are studied in Chapter 3. MAS/KIB, a multiagent system specification, is built using mental states such as K (Know), B (Belief), and I (Intention); its grammar and semantics are discussed, axioms and inference rules are given, and some properties are researched. We also compare MAS/KIB with other existing specifications. MAS/KIB has the following characteristics: (1) each agent has its own world outlook; (2) there is no global data in the system; (3) processes of state changes are used as indexes to systems; (4) it has the characteristics of not only temporal logic but also dynamic logic; and (5) interactive actions are included. The architectures of multiagent systems are studied in Chapter 4. First, we review some typical multiagent system architectures: agent network architecture, agent federated architecture, agent blackboard architecture, and the Foundation for Intelligent Physical Agents (FIPA) architecture. For the first time, we set forward and study the layering and partitioning models of the architectures of multiagent systems, organizing architecture models, and the interoperability architecture model of multiagent

  4. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    Science.gov (United States)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    Ultrasonic flaw detection equipment with a remote control interface is researched and an automatic verification system is developed. By using Extensible Markup Language (XML) to build the protocol instruction set and the data analysis method database, the system software achieves a controllable design and copes with the diversity of unreleased device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. The operating results of the automatic verification system confirm the feasibility of the system's hardware and software architecture and the correctness of the analysis method, while eliminating the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.

  5. Automatic pterygium detection on cornea images to enhance computer-aided cortical cataract grading system.

    Science.gov (United States)

    Gao, Xinting; Wong, Damon Wing Kee; Aryaputera, Aloysius Wishnu; Sun, Ying; Cheng, Ching-Yu; Cheung, Carol; Wong, Tien Yin

    2012-01-01

    In this paper, we present a new method to detect pterygiums using cornea images. Due to the similarity in appearance and spatial location between pterygiums and cortical cataracts, pterygiums are often falsely detected as cortical cataracts on retroillumination images by computer-aided grading systems. The proposed method can be used to filter out pterygiums, which improves the accuracy of the cortical cataract grading system. This work has three major contributions. First, we propose a new pupil segmentation method for visible wavelength images. Second, an automatic pterygium detection method is proposed. Third, we develop an enhanced computer-aided cortical cataract grading system that excludes pterygiums. The proposed method is tested using clinical data, and the experimental results demonstrate that it can improve the existing automatic cortical cataract grading system.

  6. Automatic Data Filter Customization Using a Genetic Algorithm

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can be simply filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire dataset and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the yield of the tossed-out data (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values to save needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
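
    The left/right threshold evolution can be sketched as a small genetic algorithm over per-dimension [lo, hi] acceptance windows on normalized features, with a fitness that rewards rejecting improper runs and penalizes rejecting proper ones. This is a generic sketch under stated assumptions: the population size, mutation scale, and the toy data in the test are invented, and the JPL cluster parallelism is omitted.

```python
import random

def evolve_filters(data, labels, dims, pop=30, gens=40, seed=1):
    """Evolve per-dimension [lo, hi] acceptance windows; a sounding falling
    outside any window is filtered out. Fitness: +1 per rejected improper
    run (label 0), -1 per rejected proper run (label 1)."""
    rng = random.Random(seed)

    def accepted(genome, x):
        return all(lo <= v <= hi for (lo, hi), v in zip(genome, x))

    def fitness(genome):
        score = 0
        for x, y in zip(data, labels):
            if not accepted(genome, x):
                score += 1 if y == 0 else -1
        return score

    def mutate(genome):
        i = rng.randrange(dims)
        lo, hi = genome[i]
        lo = min(max(0.0, lo + rng.gauss(0, 0.05)), 1.0)
        hi = min(max(0.0, hi + rng.gauss(0, 0.05)), 1.0)
        genome[i] = (min(lo, hi), max(lo, hi))

    popl = [[(rng.uniform(0.0, 0.5), rng.uniform(0.5, 1.0)) for _ in range(dims)]
            for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=fitness, reverse=True)
        elite = popl[: pop // 2]               # keep the best half
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = [a[i] if rng.random() < 0.5 else b[i] for i in range(dims)]
            mutate(child)
            children.append(child)
        popl = elite + children
    best = max(popl, key=fitness)
    return best, fitness(best)
```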

  7. 30 CFR 75.1103-4 - Automatic fire sensor and warning device systems; installation; minimum requirements.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic fire sensor and warning device...-UNDERGROUND COAL MINES Fire Protection § 75.1103-4 Automatic fire sensor and warning device systems; installation; minimum requirements. (a) Effective December 31, 2009, automatic fire sensor and warning...

  8. Automatic Conveyor System with In-Process Sorting Mechanism using PLC and HMI System

    Directory of Open Access Journals (Sweden)

    Y V Aruna

    2015-11-01

    Full Text Available Programmable logic controllers are widely used in many manufacturing processes such as machinery packaging, material handling, and automatic assembly. They are a special type of microprocessor-based controller used for any application that needs electrical control, including lighting and HVAC control systems. An automatic conveyor system is a computerized method of controlling and managing the sorting mechanism while maintaining the efficiency of the industry and the quality of the products. The HMI for the automatic conveyor system is considered the primary way of controlling each operation; text displays are available as well as graphical touch screens, used in touch panels and for local monitoring of machines. This paper deals with the efficient use of a PLC in an automatic conveyor system and with building accuracy into it.

  9. Collection of VLE data for acid gas - alkanolamine systems using Fourier transform infrared spectroscopy. Final report, September 29, 1990--September 30, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Bullin, J.A.; Rogers, W.J.

    1996-11-01

    This report describes research from September 29, 1990 through September 30, 1996, involving the development of a novel Fourier transform infrared (FTIR) spectroscopic apparatus and method for measuring vapor-liquid equilibrium (VLE) in systems of carbon dioxide and hydrogen sulfide with aqueous alkanolamine solutions. The original apparatus was developed and modified as it was used to collect VLE data on acid gas systems. Vapor and liquid calibrations were performed for spectral measurements of hydrogen sulfide and carbon dioxide in the vapor and in solution with aqueous diethanolamine (DEA) and methyldiethanolamine (MDEA). VLE measurements were made of systems of hydrogen sulfide and carbon dioxide in 20 wt% DEA at 50°C and 40°C; in 50 wt% and 23 wt% MDEA at 40°C and in 23 wt% MDEA at 50°C; and in 35 wt% MDEA + 5 wt% DEA and 35 wt% MDEA + 10 wt% DEA at 40°C and 50°C. Measurements were made of residual amounts of carbon dioxide in each VLE system. The new FTIR spectrometer is now a consistently working and performing apparatus.
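
    Quantifying species concentrations from FTIR spectra rests on calibrating absorbance against known concentrations via the Beer-Lambert law (A = epsilon*l*c). The calibration numbers below are invented for illustration; they are not data from the report.

```python
import numpy as np

# Hypothetical calibration points: peak absorbance measured at known
# acid gas loadings (all values invented for illustration).
loadings = np.array([0.0, 0.1, 0.2, 0.3, 0.4])          # mol acid gas / mol amine
absorbances = np.array([0.01, 0.52, 1.03, 1.55, 2.04])  # peak absorbance

# Beer-Lambert: A = epsilon * l * c + baseline, i.e. a straight line in c
slope, intercept = np.polyfit(loadings, absorbances, 1)

def loading_from_absorbance(a):
    """Invert the calibration line to estimate loading from absorbance."""
    return (a - intercept) / slope
```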

  10. Data collection on risk factors in pregnancy

    NARCIS (Netherlands)

    Zetstra-van der Woude, Alethea Priscilla

    2016-01-01

    This thesis aims to investigate the different methods of data collection of risk factors in pregnancy. Several observational epidemiologic study designs were used to assess associations between risk factors and negative birth outcomes. We especially looked at the use of folic acid around pregnancy a

  11. Training course 'Fisheries data collection and analysis'

    NARCIS (Netherlands)

    Heijden, van der P.G.M.

    2007-01-01

    Course description of the course “Fisheries data collection and analysis”, held from October 1st till October 19th 2007 and organised by the Programme for Capacity development & Institutional Change of Wageningen International in cooperation with Wageningen University – Aquaculture and Fisheries

  12. Real-time directional wave data collection

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.; Pednekar, P.S.

    The wave measurements carried out using directional waverider buoys at 13 locations along the east and west coasts of India are reported in this paper. The total number of buoy days is 4501, of which data were collected on 4218 days...

  13. An Introduction to BYOE Mobile Data Collection

    Science.gov (United States)

    Rocchio, Rose A.

    2014-01-01

    Smartphone ownership among college-aged Americans is high and growing, and many students own more than one mobile device. Such devices are increasingly incorporated into the academic lives of students, and the era of "bring your own everything" presents new opportunities and challenges for higher education. Mobile data collection is the…

  14. A closed-loop automatic control system for high-intensity acoustic test systems.

    Science.gov (United States)

    Slusser, R. A.

    1973-01-01

    Sound at sound pressure levels in the range from 130 to 160 dB is used in the investigation. Random noise is passed through a series of parallel filters, generally 1/3-octave wide. A basic automatic system is investigated because of preadjustment inaccuracies and high costs found in a study of a typical manually controlled acoustic testing system. The unit described has been successfully used in automatic acoustic tests in connection with the spacecraft tests for the Mariner 1971 program.
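
    The parallel 1/3-octave filters mentioned above follow the standard base-2 band spacing: center frequencies at fc = 1000 * 2^(n/3) Hz and band edges at fc * 2^(±1/6). A small generic helper (not part of the original system) computes these:

```python
def third_octave_band(n):
    """Center and (lower, upper) edge frequencies in Hz of the nth
    1/3-octave band, base-2 convention with 1 kHz as band 0."""
    fc = 1000.0 * 2.0 ** (n / 3.0)
    return fc / 2.0 ** (1.0 / 6.0), fc, fc * 2.0 ** (1.0 / 6.0)
```

    Three adjacent bands tile exactly one octave, so band 3 is centered at 2 kHz.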

  15. Modeling of a Multiple Digital Automatic Gain Control System

    Institute of Scientific and Technical Information of China (English)

    WANG Jingdian; LU Xiuhong; ZHANG Li

    2008-01-01

    Automatic gain control (AGC) has been used in many applications. The key features of AGC, including a steady state output and static/dynamic timing response, depend mainly on key parameters such as the reference and the filter coefficients. A simple model developed to describe AGC systems based on several simple assumptions shows that AGC always converges to the reference and that the timing constant depends on the filter coefficients. Measures are given to prevent oscillations and limit cycle effects. The simple AGC system is adapted to a multiple AGC system for a TV tuner in a much more efficient model. Simulations using the C language are 16 times faster than those with MATLAB, and 10 times faster than those with a mixed register transfer level (RTL)-simulation program with integrated circuit emphasis (SPICE) model.
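
    The convergence-to-reference behavior described above can be illustrated with a first-order AGC loop in a few lines. This is a generic sketch, not the paper's model: the loop-filter coefficient alpha and the constant input are invented for illustration.

```python
def agc(samples, reference=1.0, alpha=0.05, gain=1.0):
    """First-order AGC loop: the gain is driven by the error between the
    measured output level and the reference; alpha acts as the loop-filter
    coefficient that sets the time constant."""
    out = []
    for s in samples:
        y = gain * s
        gain += alpha * (reference - abs(y))   # leaky-integrator loop filter
        out.append(y)
    return out, gain
```

    For a constant input of 0.5 and a reference of 1.0, the gain converges geometrically to 2.0, mirroring the paper's observation that AGC always converges to the reference with a time constant set by the filter coefficient.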

  16. preAssemble: a tool for automatic sequencer trace data processing

    Directory of Open Access Journals (Sweden)

    Laerdahl Jon K

    2006-01-01

    Full Text Available Abstract Background Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands of files. Sequence quality assessment on various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. Results The preAssemble pre-assembly sequence processing pipeline has been developed for small to large scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and LINUX operating systems. It is available for downloading and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. Conclusion preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently preAssemble can be used as efficiently for just several trace files as for large scale sequence

  17. Automatic fracture density update using smart well data and artificial neural networks

    Science.gov (United States)

    Al-Anazi, A.; Babadagli, T.

    2010-03-01

    This paper presents a new methodology to continuously update and improve fracture network models. We begin with a hypothetical model whose fracture network parameters and geological information are known. After generating the "exact" fracture network with known characteristics, the data were exported to a reservoir simulator and simulations were run over a period of time. Intelligent wells equipped with downhole multiple pressure and flow sensors were placed throughout the reservoir and put into production. These producers were completed in different fracture zones to create a representative pressure and production response. We then considered a number of wells whose static (cores and well logs) and dynamic (production) data were used to model well fracture density. As new wells were opened, historical static and dynamic data from previous wells and static data from the new wells were used to update the fracture density using Artificial Neural Networks (ANN). The accuracy of the prediction model depends significantly on how representative the available data are of the existing fracture network. The predictive value of conventional data (surface production data) and smart well data was also investigated. Highly sensitive input data were selected through a forward selection scheme to train the ANN. Well geometric locations were included as a new link in the ANN regression process. Once the relationship between fracture network parameters and well performance data was established, the ANN model was used to predict fracture density at newly drilled locations. Finally, an error analysis based on the correlation coefficient and the percentage absolute relative error was performed to examine the accuracy of the proposed inverse modeling methodology. It was shown that fracture dominated production performance data collected from both conventional and smart wells allow for automatically updating the fracture network model. The proposed technique helps
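
    The ANN regression step can be illustrated with a minimal one-hidden-layer network trained by full-batch gradient descent. This is a generic stand-in, not the authors' network (whose architecture and inputs are not given here); the synthetic target in the test is invented.

```python
import numpy as np

def train_mlp(X, y, hidden=8, lr=0.1, epochs=3000, seed=0):
    """One-hidden-layer tanh network fitted by gradient descent on MSE;
    returns a prediction function. A minimal stand-in for the ANN that
    maps well data to fracture density."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # forward pass
        pred = h @ W2 + b2
        err = pred - y                    # gradient of MSE w.r.t. pred (up to 2/N)
        gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
        gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xn: np.tanh(Xn @ W1 + b1) @ W2 + b2
```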

  18. Data mining spacecraft telemetry: towards generic solutions to automatic health monitoring and status characterisation

    Science.gov (United States)

    Royer, P.; De Ridder, J.; Vandenbussche, B.; Regibo, S.; Huygen, R.; De Meester, W.; Evans, D. J.; Martinez, J.; Korte-Stapff, M.

    2016-07-01

    We present the first results of a study aimed at finding new and efficient ways to automatically process spacecraft telemetry for automatic health monitoring. The goal is to reduce the load on the flight control team while extending the "checkability" to the entire telemetry database, and provide efficient, robust and more accurate detection of anomalies in near real time. We present a set of effective methods to (a) detect outliers in the telemetry or in its statistical properties, (b) uncover and visualise special properties of the telemetry and (c) detect new behavior. Our results are structured around two main families of solutions. For parameters visiting a restricted set of signal values, i.e. all status parameters and about one third of all the others, we focus on a transition analysis, exploiting properties of Poincare plots. For parameters with an arbitrarily high number of possible signal values, we describe the statistical properties of the signal via its Kernel Density Estimate. We demonstrate that this allows for a generic and dynamic approach of the soft-limit definition. Thanks to a much more accurate description of the signal and of its time evolution, we are more sensitive and more responsive to outliers than the traditional checks against hard limits. Our methods were validated on two years of Venus Express telemetry. They are generic for assisting in health monitoring of any complex system with large amounts of diagnostic sensor data. Not only spacecraft systems but also present-day astronomical observatories can benefit from them.
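
    The kernel-density "soft limit" idea can be sketched in a few lines: estimate the density of a telemetry parameter from training data and flag new samples whose density falls below that of the rarest training samples. The bandwidth and quantile below are invented illustrative defaults, not the mission's settings.

```python
import numpy as np

def kde_outliers(train, test, bandwidth=0.5, quantile=0.01):
    """Flag test samples whose Gaussian kernel density estimate falls below
    the density of the lowest `quantile` fraction of the training data,
    i.e. a data-driven soft limit instead of a fixed hard limit."""
    train = np.asarray(train, dtype=float)

    def density(points):
        d = np.asarray(points, dtype=float)[:, None] - train[None, :]
        k = np.exp(-0.5 * (d / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
        return k.mean(axis=1)

    threshold = np.quantile(density(train), quantile)
    return density(test) < threshold
```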

  19. An Automatic Testing System of Scheduling Strategies in Real-Time UNIX

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper proposes a formal model of an automatic testing system for scheduling strategies in real-time UNIX and describes the algorithm of the key part of the system. The model is an important technology for the automation of software development. According to the model presented in the paper, many different kinds of automatic testing systems can be designed and developed easily. At the end of the paper, a prototype proves the feasibility of the model and design.

  20. Enhancement of the automatic ultrasonic signal processing system using digital technology

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system which can be used in inspection equipment to assess the integrity of the reactor vessel, by enhancing the performance of the ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment, the development of the signal processing algorithm and the hardware of the data processing system, and the development of the specification for application programs and system software for the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state-of-the-art technology at home and abroad; 2) the development of the hardware and software of the data processing system based on these results. In particular, the hardware of the data processing system, which has the advantages of both digital and analog control through real-time digital signal processing, was developed using a DSP that can process digital signals in real time; not only the firmware of the data processing system for the peripherals but also the specimen test algorithm for calibration was developed. The application programs and the system software of the analysis/evaluation computer were also developed. The developed equipment was verified by a performance test. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for nuclear industries is expected through further studies establishing the hardware in real applications and developing the software specification of the analysis computer. (author)

  1. Automatic Vehicle License Recognition Based on Video Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; CHEN Yang; HE Yinghua; WU Jun

    2006-01-01

    Traditional methods of license character extraction cannot meet the requirements of recognition accuracy and speed imposed by video vehicular detection systems. Therefore, a license plate localization method based on multi-scale edge detection and a character segmentation algorithm based on a Markov random field model are presented. Experimental results demonstrate that the method yields more accurate license character extraction than traditional localization based on difference-operator edge detection and threshold-based character segmentation. The accuracy increases from 90% to 94% under favorable illumination, and by more than 5% under poor conditions. When the two improved algorithms are used, the accuracy and speed of automatic license recognition meet the system's requirements even under noisy circumstances or uneven illumination.

  2. Automatic Voltage Control (AVC) System under Uncertainty from Wind Power

    DEFF Research Database (Denmark)

    Qin, Nan; Abildgaard, Hans; Flynn, Damian

    2016-01-01

    An automatic voltage control (AVC) system maintains the voltage profile of a power system in an acceptable range and minimizes the operational cost by coordinating the regulation of controllable components. Typically, all of the parameters in the optimization problem are assumed to be certain and constant in the decision making process. However, for high shares of wind power, uncertainty in the decision process due to wind power variability may result in an infeasible AVC solution. This paper proposes a voltage control approach which considers the voltage uncertainty from wind power productions. The proposed method improves the performance and the robustness of a scenario based approach by estimating the potential voltage variations due to fluctuating wind power production, and introduces a voltage margin to protect the decision against uncertainty for each scenario. The effectiveness of the proposed

  3. Automatic Meter Reading and Theft Control System by Using GSM

    Directory of Open Access Journals (Sweden)

    P. Rakesh Malhotra

    2013-04-01

    Full Text Available This paper deals with an automatic meter reading and theft control system for energy meters. A current transformer is used to measure the total power consumption of a house or industrial site, and the recorded reading is transmitted to the electricity board once every 60 days using a GSM module. To avoid theft, an infrared sensor is placed in the screw portion of the energy meter seal; if the screw is removed from the meter, a message is sent to the electricity board. The measurement of the energy meter and the monitoring of the IR sensor are done with a PIC microcontroller. The system will help the electricity board monitor the entire supply and bill correctly without any mishap. This model reduces manual manipulation work and controls theft.

  4. Entrance C - New Automatic Number Plate Recognition System

    CERN Multimedia

    2013-01-01

    Entrance C (Satigny) is now equipped with a latest-generation Automatic Number Plate Recognition (ANPR) system and a fast-action road gate. During the month of August, Entrance C will be continuously open from 7.00 a.m. to 7.00 p.m. (working days only). The security guards will open the gate as usual from 7.00 a.m. to 9.00 a.m. and from 5.00 p.m. to 7.00 p.m.; for the rest of the working day (9.00 a.m. to 5.00 p.m.) the gate will operate automatically. Please observe the following points: stop at the STOP sign on the ground; position yourself next to the card reader for optimal recognition; motorcyclists must use their CERN card; cyclists may not activate the gate and should use the bicycle turnstile; keep a safe distance from the vehicle in front of you. If access is denied, please check that your vehicle regist...

  5. Feature Extraction and Automatic Material Classification of Underground Objects from Ground Penetrating Radar Data

    Directory of Open Access Journals (Sweden)

    Qingqing Lu

    2014-01-01

    Full Text Available Ground penetrating radar (GPR) is a powerful tool for detecting objects buried underground. However, the interpretation of the acquired signals remains a challenging task, since an experienced user is required to manage the entire operation. Particularly difficult is the classification of the material type of underground objects in noisy environments. This paper proposes a new feature extraction method. First, the discrete wavelet transform (DWT) transforms the A-scan data and approximation coefficients are extracted. Then, the fractional Fourier transform (FRFT) is used to transform the approximation coefficients into the fractional domain, from which features are extracted. The features are supplied to support vector machine (SVM) classifiers to automatically identify the material of underground objects. Experimental results show that the proposed feature-based SVM system performs well in classification accuracy compared to statistical and frequency domain feature-based SVM systems in noisy environments, and that the classification accuracy of the proposed features has little dependence on the SVM model.
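
    The first stage of the feature pipeline, extracting approximation coefficients with a single-level DWT, can be sketched with the Haar wavelet (the simplest case; the paper does not specify its wavelet, and the fractional-domain and SVM stages are omitted here):

```python
import numpy as np

def haar_dwt(signal):
    """Single-level Haar DWT of a 1-D A-scan: returns (approximation,
    detail) coefficients, the first stage of the feature pipeline."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:
        x = np.append(x, x[-1])          # pad odd-length scans
    pairs = x.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
    return approx, detail
```

    The approximation coefficients are scaled pairwise averages (a smoothed, half-length copy of the scan), which is why they carry most of the information used for the later feature steps.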

  6. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance to a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
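
    The core of any such auto-scaler is a rule mapping the observed workload to a target cluster size. The deliberately simplified proportional rule below is not the paper's algorithm; the per-node capacity and node bounds are invented for illustration.

```python
def scale_decision(queued_tasks, tasks_per_node=10, min_nodes=2, max_nodes=20):
    """Target node count for the next monitoring interval: grow to cover the
    queued workload, shrink toward the floor when capacity sits idle."""
    needed = -(-queued_tasks // tasks_per_node)   # ceiling division
    return max(min_nodes, min(max_nodes, needed))
```

    A spike of queued tasks drives the cluster toward its ceiling, while an empty queue lets it fall back to the cost-saving floor, which is the behavior the experiments above measure.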

  7. Clock Signal Synchronization Technology for Data Collection System

    Institute of Scientific and Technical Information of China (English)

    范业明; 刘增武

    2011-01-01

    A design for system clock signal synchronization in a data collection system is presented. In order to improve system clock synchronization and reliability, a field-programmable gate array (FPGA) is used as the control core instead of a conventional processor. The design uses a phase-locked loop (PLL) and the Verilog hardware description language to realize a clock-synchronized reset function. Practice proves that the design runs stably and reliably and is suitable for operation under a high-speed clock.

  8. Automatic correction system of the laminating machine based on image processing

    Institute of Scientific and Technical Information of China (English)

    赵茹; 陶晓杰; 王鹏飞

    2013-01-01

    The automatic correction system of the laminating machine is mainly used to compute the deviation of the acquired images in the X, Y and R directions and to output the deviation data. In this paper, the automatic correction system is first designed and the collected images are preprocessed. The Hough transform is then used for angle detection of the image, and an image registration algorithm based on interpolation and phase correlation is used to compute the displacement deviation. Finally, the system is calibrated. By transmitting the offset data to the control computer, precise positioning of the laminated items is achieved.
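
    The phase-correlation registration step can be sketched with numpy's FFT: the normalized cross-power spectrum of two images peaks at their relative translation. This minimal version recovers integer pixel shifts only; the interpolation the paper uses for sub-pixel accuracy is omitted.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) translation taking image a to image b
    from the peak of the normalized cross-power spectrum."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    R = np.conj(A) * B
    R /= np.abs(R) + 1e-12           # keep phase only
    corr = np.abs(np.fft.ifft2(R))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:         # wrap to signed shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)
```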

  9. Practical approach to data collection in a European dialysis network

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Collection of Dialysis Treatment Data and Patient-Related Information in a Single Dialysis Unit. State-of-the-art dialysis machines are capable of automatically collecting huge volumes of dialysis treatment data (blood and dialysate flow rates, venous and arterial pressures, ultrafiltration rate, ultrafiltration volume, transmembrane pressure, conductivity, etc.) and displaying the information on the monitor screen. Commercially available software allows this information to be exported to a clinical database on a server through a local network in the clinic, enabling healthcare professionals to further process, display, and evaluate the data. Moreover, on-line sensors can be used to monitor patient-related information, e.g. effective ionic dialysance as a marker of urea clearance[1-3], urea removal and additional parameters (e.g. whole-body Kt/V) based on dialysate urea measurements[4,5], and blood volume changes[6]. Although many of these sensors are available in dialysis machines, their clinical value in routine dialysis is still subject to debate[7].
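
    The Kt/V mentioned above is, in routine practice, commonly estimated from pre- and post-dialysis urea with the second-generation Daugirdas equation; the helper below is illustrative (the example values in the test are invented) and is not the on-line sensor method of the cited references.

```python
import math

def ktv_daugirdas(pre_urea, post_urea, hours, uf_liters, weight_kg):
    """Single-pool Kt/V from pre-/post-dialysis urea concentrations:
    Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W, with R = post/pre."""
    r = post_urea / pre_urea
    return -math.log(r - 0.008 * hours) + (4.0 - 3.5 * r) * uf_liters / weight_kg
```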

  10. ISS Habitability Data Collection and Preliminary Findings

    Science.gov (United States)

    Thaxton, Sherry (Principal Investigator); Greene, Maya; Schuh, Susan; Williams, Thomas; Archer, Ronald; Vasser, Katie

    2017-01-01

    Habitability is the relationship between an individual and their surroundings (i.e. the interplay of the person, machines, environment, and mission). The purpose of this study is to assess habitability and human factors on the ISS to better prepare for future long-duration space flights. Scheduled data collection sessions primarily require the use of iSHORT (iPad app) to capture near real-time habitability feedback and analyze vehicle layout and space utilization.

  11. 14 CFR 25.672 - Stability augmentation and automatic and power-operated systems.

    Science.gov (United States)

    2010-01-01

    ... power-operated systems. 25.672 Section 25.672 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Construction Control Systems § 25.672 Stability augmentation and automatic and power-operated systems. If the functioning of stability augmentation or other automatic or power-operated systems is necessary to...

  12. 14 CFR 29.672 - Stability augmentation, automatic, and power-operated systems.

    Science.gov (United States)

    2010-01-01

    ... power-operated systems. 29.672 Section 29.672 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Construction Control Systems § 29.672 Stability augmentation, automatic, and power-operated systems. If the functioning of stability augmentation or other automatic or power-operated system is necessary to...

  13. 14 CFR 27.672 - Stability augmentation, automatic, and power-operated systems.

    Science.gov (United States)

    2010-01-01

    ... power-operated systems. 27.672 Section 27.672 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... Construction Control Systems § 27.672 Stability augmentation, automatic, and power-operated systems. If the functioning of stability augmentation or other automatic or power-operated systems is necessary to...

  14. Environmental Monitoring Wireless Data Acquisition and Transmission System

    Institute of Scientific and Technical Information of China (English)

    许红宁

    2011-01-01

This paper presents a high-performance wireless data acquisition and transmission device designed for on-line monitoring of various pollution sources. It provides real-time monitoring of the primary instruments and transmits the monitoring data to the monitoring center over the transmission network in a timely manner; at the same time it can receive and execute commands issued by the monitoring center, enabling remote monitoring, real-time measurement, and over-limit alarms. Through analog and digital signal interfaces, the environmental monitoring wireless data acquisition and transmission system connects to instruments such as flow meters, COD analyzers, pH meters, ammonia-nitrogen analyzers, residual-chlorine analyzers, and flue-gas analyzers, making instrument monitoring more convenient and meeting the requirements of state-, province-, and city-level on-line monitoring of pollution sources in the environmental protection field.
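The over-limit alarm decision such a collection unit makes can be sketched as below; the sensor names and limit values are illustrative, not taken from the device described:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str     # e.g. "COD", "pH", "flow"
    value: float
    timestamp: int  # epoch seconds

# Hypothetical regulatory limits per sensor (illustrative values only).
# A scalar is an upper limit; a tuple is an allowed (low, high) range.
LIMITS = {"COD": 50.0, "pH": (6.0, 9.0)}

def check_alarm(reading):
    """Return True if the reading exceeds its configured limit."""
    limit = LIMITS.get(reading.sensor)
    if limit is None:
        return False                       # no limit configured for this sensor
    if isinstance(limit, tuple):
        low, high = limit
        return not (low <= reading.value <= high)
    return reading.value > limit

print(check_alarm(Reading("COD", 72.5, 1700000000)))  # -> True
print(check_alarm(Reading("pH", 7.2, 1700000000)))    # -> False
```

In the described system, a True result would trigger an alarm message back to the monitoring center alongside the regular telemetry.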

  15. EpiCollect: linking smartphones to web applications for epidemiology, ecology and community data collection.

    Directory of Open Access Journals (Sweden)

    David M Aanensen

Full Text Available BACKGROUND: Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications, which in conjunction with web applications, allow two-way communication between field workers and their project databases. METHODOLOGY: Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth. Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by the individual field workers or, for example, those data within certain values of a measured variable or a time period. CONCLUSIONS: Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone that they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phone.
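The filtering options described (by individual field worker, by values of a measured variable, or by time period) can be sketched server-side as below; the record fields are hypothetical, not EpiCollect's actual schema:

```python
# Hypothetical GPS-tagged records as field workers' phones might submit them.
records = [
    {"worker": "fw1", "lat": 51.50, "lon": -0.12, "timestamp": 100, "count": 4},
    {"worker": "fw2", "lat": 48.85, "lon": 2.35,  "timestamp": 150, "count": 12},
    {"worker": "fw1", "lat": 51.51, "lon": -0.10, "timestamp": 220, "count": 9},
]

def filter_records(records, worker=None, t_range=None, value_range=None):
    """Apply the optional filters: by submitter, by time period,
    and by values of a measured variable (here the `count` field)."""
    out = []
    for r in records:
        if worker is not None and r["worker"] != worker:
            continue
        if t_range is not None and not (t_range[0] <= r["timestamp"] <= t_range[1]):
            continue
        if value_range is not None and not (value_range[0] <= r["count"] <= value_range[1]):
            continue
        out.append(r)
    return out

print(len(filter_records(records, worker="fw1")))                            # -> 2
print(len(filter_records(records, t_range=(0, 200), value_range=(5, 20))))   # -> 1
```

The surviving records, carrying lat/lon, could then be plotted directly as Google Maps markers.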

  16. Data Collection for Mobile Group Consumption: An Asynchronous Distributed Approach.

    Science.gov (United States)

    Zhu, Weiping; Chen, Weiran; Hu, Zhejie; Li, Zuoyou; Liang, Yue; Chen, Jiaojiao

    2016-04-06

Mobile group consumption refers to consumption by a group of people, such as a couple, a family, colleagues or friends, based on mobile communications. It differs from consumption involving only individuals because of the complex relations among group members. Existing data collection systems for mobile group consumption are centralized, which has the disadvantages of being a performance bottleneck, having a single point of failure, and increasing business and security risks. Moreover, these data collection systems are based on a synchronized clock, which is often unrealistic because of hardware constraints, privacy concerns or synchronization cost. In this paper, we propose the first asynchronous distributed approach to collecting data generated by mobile group consumption. We formally build a system model based on asynchronous distributed communication and design a simulation system for the model, for which we propose a three-layer solution framework. We then describe how to detect the causality relation of two or three gathering events that happened in the system based on the collected data. Various definitions of causality relations based on asynchronous distributed communication are supported. Extensive simulation results show that the proposed approach is effective for data collection relating to mobile group consumption.
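Causality between events in an asynchronous distributed system, of the kind the abstract detects between gathering events, is commonly tested with vector clocks. The sketch below is a generic illustration of that happened-before test, not the paper's own definitions:

```python
def happened_before(vc_a, vc_b):
    """Vector-clock test: event a causally precedes event b iff every
    component of a's clock is <= b's and at least one is strictly less."""
    return (all(x <= y for x, y in zip(vc_a, vc_b))
            and any(x < y for x, y in zip(vc_a, vc_b)))

def concurrent(vc_a, vc_b):
    """Events are concurrent when neither causally precedes the other."""
    return not happened_before(vc_a, vc_b) and not happened_before(vc_b, vc_a)

# Three processes; clocks recorded at two gathering events each.
print(happened_before([1, 0, 0], [2, 1, 0]))  # -> True  (causally ordered)
print(concurrent([1, 2, 0], [0, 1, 3]))       # -> True  (no causal order)
```

Because vector clocks need no synchronized physical clock, they fit the asynchronous setting the paper argues for.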

  17. Automatic Alignment System for the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

Wilhelmsen, K C; Awwal, A S; Ferguson, S W; Horowitz, B; Miller Kamm, V J; Reynolds, C A

    2007-10-04

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at target chamber in less than 30 minutes. The system commands 9,000 stepping motors to adjust mirrors and other optics. Twenty-two control loops per beamline request image processing services running on a LINUX cluster to analyze high-resolution images of the beam and references. Process-leveling assures the computational load is evenly spread on the cluster. Algorithms also estimate measurement accuracy and reject off-normal images. One challenge to achieving rapid alignment of beams in parallel is the efficient coordination of shared laser devices, such as sensors that are configurable to monitor multiple beams. Contention for shared resources is managed by the Component Mediation System, which precludes deadlocks and optimizes device motions using a hierarchical component structure. A reservation service provided by the software framework prevents interference from competing instances of automated controls or from the actions of system operators. The design, architecture and performance of the system will be discussed.

  18. Automatic Translation of Arabic Sign to Arabic Text (ATASAT System)

    Directory of Open Access Journals (Sweden)

    Abdelmoty M.Ahmed

    2016-04-01

Full Text Available Sign language continues to be the preferred tool of communication for the deaf and the hearing-impaired. It is a well-structured code of hand gestures, where every gesture has a specific meaning. The goal of this paper is to develop a system for automatic translation of Arabic Sign Language to Arabic text (the ATASAT system). The system acts as a translator between deaf and mute people and hearing people to enhance their communication. The proposed system consists of five main stages: video and image capture, video and image processing, hand-sign construction, classification, and finally text transformation and interpretation. The system builds two image-feature datasets for Arabic Sign Language alphabet gestures from two resources: an Arabic Sign Language dictionary and gestures from different human signers. It also uses gesture-recognition techniques, which allow the user to interact with the outside world. A novel hand-detection technique is proposed that detects and extracts Arabic Sign hand gestures from images or video. In the hand-sign construction and classification steps, a set of appropriate features is used with different classification algorithms such as KNN, MLP, C4.5, VFI and SMO, and the results are compared to select the better classifier.
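Of the classifiers compared (KNN, MLP, C4.5, VFI, SMO), KNN is the simplest to illustrate. Below is a minimal NumPy sketch on toy 2-D features; the class names and feature values are made up and do not come from the ATASAT datasets:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify feature vector x by majority vote of its k nearest neighbours."""
    dists = np.linalg.norm(X_train - x, axis=1)        # Euclidean distances
    nearest = np.argsort(dists)[:k]                    # indices of k closest samples
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                   # majority label

# Toy 2-D features for two hypothetical gesture classes "alef" and "ba".
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.9, 1.0], [1.0, 0.8]])
y = np.array(["alef", "alef", "ba", "ba"])
print(knn_predict(X, y, np.array([0.05, 0.1])))  # -> alef
```

A real sign-recognition pipeline would feed hand-shape features extracted from the video frames, in place of these toy coordinates.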

  19. Towards Automatic Improvement of Patient Queries in Health Retrieval Systems

    Directory of Open Access Journals (Sweden)

    Nesrine KSENTINI

    2016-07-01

Full Text Available With the adoption of health information technology for clinical health, e-health is becoming usual practice today. Users of this technology find it difficult to seek information relevant to their needs because of the increasing amount of clinical and medical data on the web and their lack of knowledge of medical jargon. In this regard, we describe a method that improves users' queries by automatically adding new related terms that appear in the same context as the original query, in order to improve the final search results. The method is based on the assessment of semantic relationships, defined by a proposed statistical method, between a set of terms or keywords. Experiments were performed on the CLEF-eHealth-2015 database, and the obtained results show the effectiveness of our proposed method.
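As a rough illustration of adding related terms that appear in the same context as the query, the sketch below scores candidate terms by document co-occurrence. This is a generic stand-in, not the paper's proposed statistical measure of semantic relationships, and the corpus is invented:

```python
from collections import Counter
from itertools import combinations

# Tiny illustrative corpus; a real system would use CLEF-eHealth documents.
docs = [
    "myocardial infarction chest pain treatment",
    "heart attack chest pain aspirin",
    "myocardial infarction heart attack risk",
]

# Count how often each term pair co-occurs within the same document.
cooc = Counter()
for doc in docs:
    for a, b in combinations(sorted(set(doc.split())), 2):
        cooc[(a, b)] += 1

def expand_query(query_terms, n=2):
    """Append the n terms that most often co-occur with the query terms."""
    scores = Counter()
    for (a, b), c in cooc.items():
        if a in query_terms and b not in query_terms:
            scores[b] += c
        elif b in query_terms and a not in query_terms:
            scores[a] += c
    return query_terms + [t for t, _ in scores.most_common(n)]

print(expand_query(["myocardial", "infarction"]))
```

The expanded query is then submitted to the retrieval engine in place of the patient's original terms.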

  20. Automatized system for realizing solar irradiation maps of lands

    Energy Technology Data Exchange (ETDEWEB)

    Biasini, A.; Fanucci, O.; Visentin, R.

This work explains in detail the methodological, operational, and graphic procedures for producing ''solar irradiation maps'' of the Italian territory. Starting from a topographic representation of contour lines, the graphic results are produced through acquisition, classification, digitization, and automatic superimposition of data on the orography of the site (classes of slope and slope exposure); the areas delineated in this way are then associated with the corresponding classes of relative solar irradiation. The method was applied and tested successfully in an area of about 400 km², corresponding to the NE quadrant of sheet I.G.M.I. nr. 221 (scale 1:100,000), because it appears to represent almost all possible combinations of slope and exposure. An example of the final result and of an intermediate stage is reported.