WorldWideScience

Sample records for automatic data collection systems

  1. Automatic Weissenberg data collection system for time-resolved protein crystallography

    CERN Document Server

    Sakabe, N; Higashi, T; Igarashi, N; Suzuki, M; Watanabe, N; Sasaki, K

    2001-01-01

    A totally new type of fully automatic Weissenberg data-collection system called 'Galaxy' was developed and installed at the Photon Factory. This automatic data-collection system consists of a rotated-inclined focusing monochromator, a screenless Weissenberg-type camera, an image reader, an eraser, a cassette transportation mechanism, a control console, and a safety and high-speed computer network system linking the control console, data-processing computers and data servers. The special characteristics of this system are a Weissenberg camera with a fully cylindrical cassette which can be rotated to exchange frames, a maximum of 36 images recorded per IP cassette, and a very high-speed IP reader with five reading heads. Since the frame-exchange time is only a few seconds, this system is applicable to time-resolved protein crystallography on a time-scale of seconds or minutes.

  2. ARCPAS - Automatic radiation control point access system an automated data collection terminal for radiation dose and access control

    International Nuclear Information System (INIS)

    Nuclear facilities such as nuclear power plants or fuel processing facilities are required to maintain accurate records of personnel access, exposure and work performed. Most facilities today have some sort of computerized data collection system for radiation dose and access control, yet the great majority rely on handwritten records, i.e., dose cards or sign-in sheets, which in turn are transferred to a computerized records management system manually. The ARCPAS terminal provides a method for automating personnel exposure data collection and processing. The terminal is a user-interactive device which contains a unit for automatically reading and zeroing pocket dosemeters, a security badge reader for personnel identification, a 16-digit keypad for RWP information entry, a high-resolution color CRT for interactive communication and a high-speed tape printer providing an entry chit. The chit provides the individual worker with a record of the transaction, including an individual identifying number, the remaining dose for the quarter or period, and the RWP under which the worker entered the controlled area. The purpose of automating access control is to provide fast, accurate, real-time data to the records management system. A secondary purpose is to relieve trained health physics technicians of control-point duties so that their training and skills can be utilized more effectively in a facility's health physics program.

  3. Automatic Identification And Data Collection Via Barcode Laser Scanning.

    Science.gov (United States)

    Jacobeus, Michel

    1986-07-01

    How to earn over 100 million a year by investing 40 million? No, this is not the latest Wall Street "tip" but the cost savings obtained by the U.S. Department of Defense. Supermarkets claim 2% savings on annual turnover! Automotive companies report millions of dollars saved! These are not daydreams, but tangible results measured by users after implementing Automatic Identification and Data Collection systems based on bar codes. To paraphrase the famous sentence "I think, thus I am", with AI/ADC systems "You know, thus you are". Indeed, in today's world, immediate, accurate and precise information is a vital management need for companies' growth and survival. AI/ADC techniques fulfill these objectives by supplying the right information automatically, without delay or alteration.

  4. AUTORED - the JADE automatic data reduction system

    International Nuclear Information System (INIS)

    The design and implementation of and experience with an automatic data processing system for the reduction of data from the JADE experiment at DESY is described. The central elements are a database and a job submitter which combine powerfully to minimise the need for manual intervention. (author)

  5. Design and implementation of automatic color information collection system

    Science.gov (United States)

    Ci, Wenjie; Xie, Kai; Li, Tong

    2015-12-01

    In liquid crystal display (LCD) colorimetric characterization, RGB, the device-dependent color space, must be converted to a device-independent color space such as CIEXYZ or CIELab; that is, the relationship between RGB and CIE values is established from device color data and the corresponding CIE measurements. To this end, automatic color information collection software was designed. We use OpenGL for the full-screen display function, and a C++ program calls the Eyeone device library functions to perform device calibration, set the sample types, and carry out sampling and storage. The software can automatically drive monitors or projectors to display the set of sample colors and collect the corresponding CIE values. The RGB values of the sample colors and the acquired CIE values are stored in a text document, which is convenient for later extraction and analysis. Taking a cubic polynomial as an example, 17 sets were sampled per channel using this system, along with 100 sets of test data. Using the least-squares method the model is obtained. The average color difference is around 2.4874, much lower than the commonly required CIE2000 level of 6.00. The successful implementation of the system saves sample-color data-acquisition time and improves the efficiency of LCD colorimetric characterization.
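
    As a rough illustration of the least-squares polynomial characterization described in the abstract, the sketch below fits a cubic polynomial mapping from RGB to CIEXYZ. The feature set, function names and use of NumPy are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def poly_features(rgb):
    """Expand normalized RGB triples into cubic polynomial terms."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.column_stack([
        np.ones_like(r), r, g, b,          # constant and linear terms
        r * r, g * g, b * b, r * g, r * b, g * b,  # quadratic terms
        r ** 3, g ** 3, b ** 3, r * g * b,  # cubic terms
    ])

def fit_characterization(rgb_samples, xyz_samples):
    """Least-squares fit of a cubic polynomial model RGB -> CIEXYZ."""
    A = poly_features(rgb_samples)
    coeffs, *_ = np.linalg.lstsq(A, xyz_samples, rcond=None)
    return coeffs  # shape: (number of polynomial terms, 3)

def predict_xyz(coeffs, rgb):
    """Apply the fitted model to new RGB samples."""
    return poly_features(rgb) @ coeffs
```

    The average color difference quoted in the abstract would then be computed between `predict_xyz` output and measured values on the 100 test samples.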

  6. MAC, A System for Automatically IPR Identification, Collection and Distribution

    Science.gov (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected by this is the music industry, which has suffered huge losses during the last few years as a result. Moreover, this is also affecting the way that music-rights collecting and distributing societies operate to assure correct music IPR identification, collection and distribution. In this article a system for automating this IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio-fingerprinting technology. The paper presents the details of the system and a use-case scenario in which it is being used.

  7. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  8. ATCOM: Automatically Tuned Collective Communication System for SMP Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Meng-Shiou Wu

    2005-12-17

    Conventional implementations of collective communications are based on point-to-point communications, and their optimizations have focused on the efficiency of those communication algorithms. However, point-to-point communications are not the optimal choice for modern computing clusters of SMPs due to their two-level communication structure. In recent years, a few research efforts have investigated efficient collective communications for SMP clusters. This dissertation focuses on platform-independent algorithms and implementations in this area. There are two main approaches to implementing efficient collective communications for clusters of SMPs: using shared-memory operations for intra-node communications, and overlapping inter-node and intra-node communications. The former fully utilizes the hardware-based shared memory of an SMP, and the latter takes advantage of the inherent hierarchy of the communications within a cluster of SMPs. Previous studies focused on clusters of SMPs from particular vendors, and the methods they proposed are not portable to other systems. Because performance optimization is very complicated and the development process is very time-consuming, self-tuning, platform-independent implementations are highly desirable. As proven in this dissertation, such an implementation can significantly outperform other point-to-point based portable implementations and some platform-specific implementations. The dissertation describes in detail the architecture of the platform-independent implementation. There are four system components: shared-memory-based collective communications, overlapping mechanisms for inter-node and intra-node communications, a prediction-based tuning module and a micro-benchmark-based tuning module. Each component is carefully designed with the goal of automatic tuning in mind.
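
    The two-level structure described above (a shared-memory intra-node phase followed by an inter-node phase) can be sketched in miniature. This toy reduction is only an illustration of the idea, not the dissertation's MPI-level implementation; the data layout and function name are assumptions:

```python
def hierarchical_reduce(values_by_rank, node_of_rank, op=lambda a, b: a + b):
    """Two-level reduction mimicking an SMP-cluster collective:
    first combine values within each node (as shared-memory operations
    would), then combine the one partial result per node (the
    inter-node phase)."""
    # Intra-node phase: one partial result per node.
    partials = {}
    for rank, value in values_by_rank.items():
        node = node_of_rank[rank]
        partials[node] = value if node not in partials else op(partials[node], value)
    # Inter-node phase: combine the per-node partials.
    result = None
    for p in partials.values():
        result = p if result is None else op(result, p)
    return result
```

    In a real implementation the two phases can also be overlapped in time, which is the second optimization approach the dissertation names.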

  9. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    International Nuclear Information System (INIS)

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics

  10. A geological and geophysical data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    Sudhakar, T.; Afzulpurkar, S.

    A geological and geophysical data collection system using a Personal Computer is described below. The system stores data obtained from various survey systems typically installed in a charter vessel and can be used for similar applications on any...

  11. Colorized linear CCD data acquisition system with automatic exposure control

    Science.gov (United States)

    Li, Xiaofan; Sui, Xiubao

    2014-11-01

    Colorized linear cameras deliver superb color fidelity at the fastest line rates in industrial inspection. Their RGB trilinear sensor eliminates image artifacts by placing a separate row of pixels for each color on a single sensor, and the advanced design minimizes the distance between rows to reduce artifacts due to synchronization. In this paper, a high-speed colorized linear CCD data-acquisition system was designed to take advantage of the linear CCD sensor μpd3728. The hardware and software design of the system, based on an FPGA, is introduced and the functional modules are described. The whole system is composed of a CCD driver module, a data-buffering module, a data-processing module and a computer-interface module. The image data are transferred to the computer over a Camera Link interface. The system automatically adjusts the exposure time of the linear CCD using a new method: the integration time of the CCD is controlled by the program, which, under FPGA control, automatically adjusts the integration time for different illumination intensities and responds quickly to brightness changes. The data-acquisition system also offers programmable gains and offsets for each color, and image quality can be improved after calibration in the FPGA. The design has high expansibility and application value and can be used in many application situations.
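
    A minimal sketch of the automatic exposure idea, assuming an approximately linear sensor response to integration time. The function name, target level and limits below are hypothetical, not taken from the paper:

```python
def adjust_integration_time(t, mean_level, target=128.0,
                            t_min=1e-5, t_max=1e-1):
    """One step of a simple automatic exposure loop: rescale the CCD
    integration time so the mean pixel level approaches the target,
    assuming output level is roughly proportional to integration time.
    The result is clamped to the sensor's usable integration range."""
    if mean_level <= 0:
        return t_max  # scene too dark to measure; open up fully
    t_new = t * target / mean_level
    return min(max(t_new, t_min), t_max)
```

    In the paper's system this adjustment runs in the FPGA per line; a multiplicative update like this converges in very few steps when the response really is linear, which is one way to get the fast reaction to brightness changes the abstract claims.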

  12. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industry, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from the availability of the plant. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose-parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data-handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  13. The study of data collection method for the plasma properties collection and evaluation system from web

    Science.gov (United States)

    Park, Jun-Hyoung; Song, Mi-Young; Plasma Fundamental Technology Research Team

    2015-09-01

    Plasma databases are essential for computing plasma parameters, and highly reliable databases are closely tied to improving the accuracy of simulations. A major concern of the plasma-properties collection and evaluation system is therefore to create a sustainable and useful research environment for plasma data. The system is committed to providing not only numerical data but also bibliographic data (including DOI information). Originally, our data-collection methods relied on manual search, and in some cases it took a long time to find data. We will find data more automatically and quickly than with these legacy methods by crawling or by using a search engine such as Lucene.

  14. Automatic fault detection on BIPV systems without solar irradiation data

    CERN Document Server

    Leloux, Jonathan; Luna, Alberto; Desportes, Adrien

    2014-01-01

    BIPV systems are small PV generation units spread out over the territory, and their characteristics are very diverse. This makes cost-effective procedures for monitoring, fault detection, performance analysis, operation and maintenance difficult. As a result, many problems affecting BIPV systems go undetected. In order to carry out effective automatic fault detection, we need a performance indicator that is reliable and that can be applied to many PV systems at very low cost. Existing approaches for analyzing the performance of PV systems are often based on the Performance Ratio (PR), whose accuracy depends on good solar irradiation data, which in turn can be very difficult or cost-prohibitive for the BIPV owner to obtain. We present an alternative fault-detection procedure based on a performance indicator that can be constructed solely from the energy production data measured at the BIPV systems. This procedure does not require the input of operating-conditions data, such as solar ...
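
    One way such an irradiation-free indicator could work, sketched under the assumption that neighbouring BIPV systems share the same weather, is to normalize each system's production by its own baseline and compare against the fleet median. The function and threshold below are hypothetical, not the authors' exact indicator:

```python
def flag_underperformers(daily_energy, baselines, threshold=0.7):
    """Fault indicator built only from measured energy production.
    Each system's daily energy is normalized by its own baseline
    (typical clear-day output), and systems whose normalized yield
    falls well below the fleet median for that day are flagged.
    No irradiation data is needed: nearby systems see the same sky."""
    normalized = {s: e / baselines[s] for s, e in daily_energy.items()}
    med = sorted(normalized.values())[len(normalized) // 2]
    if med <= 0:
        return []  # fleet-wide outage or night: nothing to compare against
    return [s for s, n in normalized.items() if n < threshold * med]
```

    The fleet median stands in for the missing irradiation measurement: a cloudy day lowers every system's yield, so only relative shortfalls are flagged.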

  15. NLO error propagation exercise data collection system

    International Nuclear Information System (INIS)

    A combined automated and manual system for data collection is described. The system is suitable for collecting, storing, and retrieving data related to nuclear material control at a bulk processing facility. The system, which was applied to the NLO operated Feed Materials Production Center, was successfully demonstrated for a selected portion of the facility. The instrumentation consisted of off-the-shelf commercial equipment and provided timeliness, convenience, and efficiency in providing information for generating a material balance and performing error propagation on a sound statistical basis

  16. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Bowler, Matthew W., E-mail: mbowler@embl.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 avenue des Martyrs, F-38042 Grenoble (France); Université Grenoble Alpes-EMBL-CNRS, 71 avenue des Martyrs, F-38042 Grenoble (France); Nurizzo, Didier, E-mail: mbowler@embl.fr; Barrett, Ray; Beteva, Antonia; Bodin, Marjolaine [European Synchrotron Radiation Facility, 71 avenue des Martyrs, F-38043 Grenoble (France)

    2015-10-03

    MASSIF-1 (ID30A-1) is a new beamline dedicated to the completely automatic characterization and data collection from crystals of biological macromolecules. MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined.

  17. MASSIF-1: a beamline dedicated to the fully automatic characterization and data collection from crystals of biological macromolecules

    International Nuclear Information System (INIS)

    MASSIF-1 (ID30A-1) is a new beamline dedicated to the completely automatic characterization and data collection from crystals of biological macromolecules. MASSIF-1 (ID30A-1) is an ESRF undulator beamline operating at a fixed wavelength of 0.969 Å (12.8 keV) that is dedicated to the completely automatic characterization of and data collection from crystals of biological macromolecules. The first of the ESRF Upgrade MASSIF beamlines to be commissioned, it has been open since September 2014, providing a unique automated data collection service to academic and industrial users. Here, the beamline characteristics and details of the new service are outlined

  18. Automatic Boat Identification System for VIIRS Low Light Imaging Data

    Directory of Open Access Journals (Sweden)

    Christopher D. Elvidge

    2015-03-01

    The ability of satellite sensors to detect lit fishing boats has been known since the 1970s. However, use of the observations has been limited by the lack of an automatic algorithm for reporting the location and brightness of offshore lighting features arising from boats. An examination of lit fishing-boat features in Visible Infrared Imaging Radiometer Suite (VIIRS) day/night band (DNB) data indicates that the features are essentially spikes. We have developed a set of algorithms for automatic detection of spikes and characterization of the sharpness of spike features. A spike-detection algorithm generates a list of candidate boat detections. A second algorithm measures the height of the spikes to discard ionospheric energetic-particle detections and to rate boat detections as either strong or weak. A sharpness index is used to label boat detections that appear blurry due to the scattering of light by clouds. The candidate spikes are then filtered to remove features on land and gas flares. A validation study conducted using analyst-selected boat detections found that the automatic algorithm detected 99.3% of the reference pixel set. VIIRS boat-detection data can provide fishery agencies with up-to-date information on fishing-boat activity and changes in this activity in response to new regulations and enforcement regimes. The data can provide indications of illegal fishing activity in restricted areas and of incursions across Exclusive Economic Zone (EEZ) boundaries. VIIRS boat detections occur widely offshore from East and Southeast Asia, South America and several other regions.
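
    The spike-detection step can be illustrated with a toy local-median test: a pixel far brighter than its immediate neighbourhood is a candidate detection. The neighbourhood size and ratio threshold here are assumptions for illustration, not the published VIIRS algorithm:

```python
import numpy as np

def detect_spikes(image, ratio=5.0):
    """Return (row, col) positions of candidate spikes: pixels whose
    value greatly exceeds the median of their 8 neighbours. A later
    pass could measure spike height/sharpness to separate strong and
    weak detections, as the abstract describes."""
    rows, cols = image.shape
    hits = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            # 3x3 window with the centre pixel removed
            neigh = np.delete(image[i - 1:i + 2, j - 1:j + 2].ravel(), 4)
            med = np.median(neigh)
            if image[i, j] > ratio * max(med, 1e-9):
                hits.append((i, j))
    return hits
```

    Subsequent filtering (land masks, gas-flare lists) would then prune this candidate list, as in the paper's pipeline.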

  19. VXIbus data collection system -- A design study

    International Nuclear Information System (INIS)

    The German support program has sponsored the work to investigate the VXIbus as an integration platform for safeguards instrumentation. This paper covers the analysis of the user requirements for a VXIbus-based monitoring system for integrated safeguards, primarily for reliable unattended in-field collection of large amounts of data, with the goal of developing a suitable system architecture. The design of the system makes use of the VXIbus standard as the selected hardware platform. Based upon the requirements analysis and the overriding need for high reliability and robustness, a systematic investigation of different operating-system options, as well as development and integration tools, is presented. For the software implementation cycle, high- and low-level programming tools are required; the identification of the constraints on the programming platform and the tool selection are presented, covering the strategic approach, the rules for analysis and design work, and the executive components that support the implementation and production cycle. All the conditions for reliable, unattended and integrated safeguards monitoring systems are addressed, and the basic and advanced design principles are defined. The paper discusses the results of a study of a system produced to demonstrate a high-data-rate timer/counter application.

  20. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail and energy production is continuously added in the form of monthly values. The energy production and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program, this number is expected to increase to over 100 systems in the coming years. The new owners of PV systems are obliged to report the electricity produced to the authorities at least once a year. In this work we have studied different means of simplifying the collection of data. Four methods are defined: 1. conversion of readings from energy meters taken at arbitrary intervals into monthly values; 2. methods to handle data obtained with the monitoring systems provided by different inverter manufacturers; 3. methods to acquire data from PV systems whose energy meters report to the green-certificate system; 4. commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods rely on equipment that some PV systems are expected to have for other reasons. Method 4 makes a fully automatic collection method possible. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD).
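
    Method 1, converting readings taken at arbitrary dates into monthly values, can be sketched by linearly interpolating the cumulative meter reading at month boundaries. This is one plausible reading of the method; the interpolation scheme and function names are assumptions:

```python
from datetime import date

def month_starts(first, last):
    """First-of-month dates from first's month through the month after last's."""
    y, m = first.year, first.month
    out = []
    while (y, m) <= (last.year, last.month):
        out.append(date(y, m, 1))
        m += 1
        if m == 13:
            y, m = y + 1, 1
    out.append(date(y, m, 1))  # boundary closing the final month
    return out

def monthly_energy(readings):
    """Turn sparse (date, cumulative_kWh) meter readings into monthly
    energy values by linear interpolation of the cumulative reading
    at each month boundary covered by the readings."""
    readings = sorted(readings)

    def interp(d):
        for (d0, v0), (d1, v1) in zip(readings, readings[1:]):
            if d0 <= d <= d1:
                span = (d1 - d0).days or 1
                return v0 + (v1 - v0) * (d - d0).days / span
        raise ValueError("date outside reading range")

    bounds = [d for d in month_starts(readings[0][0], readings[-1][0])
              if readings[0][0] <= d <= readings[-1][0]]
    return {b0.strftime("%Y-%m"): interp(b1) - interp(b0)
            for b0, b1 in zip(bounds, bounds[1:])}
```

    Only whole months fully covered by the readings are emitted; partial months at the edges are dropped rather than extrapolated.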

  1. Nuclear demagnetization refrigerator with automatic control, pick up and data process system

    International Nuclear Information System (INIS)

    A nuclear demagnetization refrigerator for various physical research at ultralow temperatures, with an automatic control, pick-up and data-processing system, has been developed. The design of the main units and the performance of the refrigerator and the automatic system are described. The possibilities of operating the set-up in various regimes are analysed for the case of NMR investigation of helium quantum crystals.

  2. Integrated system to automatize information collecting for the primary health care at home.

    Science.gov (United States)

    Oliveira, Edson N; Cainelli, Jean; Pinto, Maria Eugênia B; Cazella, Silvio C; Dahmer, Alessandra

    2013-01-01

    Data collected in a consistent manner are the basis for any decision making. This article presents a system that automates data collection by community-based health workers during their visits to the residences of users of the Brazilian Health Care System (Sistema Único de Saúde - SUS). The automated process will reduce the possibility of mistakes in the transcription of visit information and make information readily available to the Ministry of Health. Furthermore, the analysis of the information provided via this system can be useful in the implementation of health campaigns and in the control of outbreaks of epidemiological diseases. PMID:23920593

  3. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  4. An Interchangeable Data Protocol for VMeS (Vessel Messaging System) and AIS (Automatic Identification System)

    OpenAIRE

    Farid Andhika; Trika Pitana; Achmad Affandi

    2012-01-01

    VMeS (Vessel Messaging System) is a radio-based communication system for exchanging messages between VMeS terminals on ships at sea and a VMeS gateway on shore. Systems for monitoring ships at sea generally use AIS (Automatic Identification System), which is deployed in every port to monitor the condition of vessels and to prevent collisions between ships. In this research, a data format suitable for VMeS will be designed so that it can be made interchangeable with AIS so ...

  5. 78 FR 68816 - Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS...

    Science.gov (United States)

    2013-11-15

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; NOAA Space- Based Data Collection System (DCS) Agreements AGENCY: National Oceanic and Atmospheric... National Ocean and Atmospheric Administration (NOAA) operates two space-based data collection systems...

  6. 75 FR 59686 - Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS...

    Science.gov (United States)

    2010-09-28

    ... National Oceanic and Atmospheric Administration Proposed Information Collection; Comment Request; NOAA Space- Based Data Collection System (DCS) Agreements AGENCY: National Oceanic and Atmospheric... space-based data collection systems (DCS), the Geostationary Operational Environmental Satellite...

  7. Dialog system for automatic data input/output and processing with two BESM-6 computers

    International Nuclear Information System (INIS)

    This paper presents a system for conducting experiments with fully automatic processing of data from multichannel recorders in the dialog mode. The system acquires data at a rate of 2.5 × 10³ readings/s, processes them in real time, and outputs digital and graphical material in a multitasking environment.

  8. Automatic testing system design and data analysis of permafrost temperature in Qinghai-Tibet Railway

    Institute of Scientific and Technical Information of China (English)

    尚迎春; 齐红元

    2008-01-01

    Addressing the influence of permafrost temperature on the safety of the Qinghai-Tibet Railway and its on-line testing, and comparing domestic achievements in permafrost study with those worldwide, an automatic permafrost-temperature testing system containing a master computer and several slave computers was designed. High-precision thermistors were chosen as temperature sensors, and the depth and interval of the testing sections were designed and positioned. The slave computers test, store and send permafrost temperature data at scheduled times, while the master computer receives, processes and analyses the collected data, so that changes in the permafrost temperature can be described and analysed, providing information for permafrost railway engineering design. Moreover, taking permafrost-temperature testing in a certain section of the Qinghai-Tibet Railway as an instance, the collected data were analysed and the behaviour of the permafrost under the railway was depicted, and a BP (back-propagation) neural-network model was set up to predict permafrost characteristics. This testing system will provide timely information about changes in the permafrost to support safe operation of the Qinghai-Tibet Railway.

  9. Automatic data collection by a low-power microprocessor on the Italian buoy

    International Nuclear Information System (INIS)

    A low-power data acquisition system for unattended stations, marine platforms or buoys is presented here. The prototype has been installed on the oceanographic buoy ODAS Italia 1. The data acquisition system utilizes an IM6100 (INTERSIL) microprocessor with 2 k-words of random access memory (RAM) and 2 k-words of erasable programmable read-only memory (EPROM) (1 word = 12 bits). The program for the acquisition of the data, the control of the device, the transmission to the earth receiving station and the reception of commands from the earth station is stored in the read-only memories. It becomes operative when the power is turned on. (author)

  10. Sensor Systems Collect Critical Aerodynamics Data

    Science.gov (United States)

    2010-01-01

    With the support of Small Business Innovation Research (SBIR) contracts with Dryden Flight Research Center, Tao of Systems Integration Inc. developed sensors and other components that will ultimately form a first-of-its-kind, closed-loop system for detecting, measuring, and controlling aerodynamic forces and moments in flight. The Hampton, Virginia-based company commercialized three of the four planned components, which provide sensing solutions for customers such as Boeing, General Electric, and BMW and are used for applications such as improving wind turbine operation and optimizing air flow from air conditioning systems. The completed system may one day enable flexible-wing aircraft with flight capabilities like those of birds.

  11. EXPERIMENTAL INVESTIGATION OF TIME DELAYS IN DATA TRANSMISSION IN AUTOMATIC CONTROL SYSTEMS

    OpenAIRE

    RYABENKIY Vladimir Mikhailovich; USHKARENKO Alexander Olegovich

    2015-01-01

    A method for the statistical analysis of the time delays of information and control packets transmitted over the network in automatic control systems is considered in this article. The measured time delays, together with the analytic dependence of their probability density, make it possible to determine the time delays for data transmission theoretically.

  12. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    International Nuclear Information System (INIS)

    This report describes both the hardware and software components of an automatic calibration and signal switching system (Autocal) for the data acquisition system of the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware; its operation and maintenance are also described.

  13. Interchangeable Data Protocol between VMeS (Vessel Messaging System) and AIS (Automatic Identification System)

    Directory of Open Access Journals (Sweden)

    Farid Andhika

    2012-09-01

    Full Text Available VMeS (Vessel Messaging System) is radio-based communication for sending messages between VMeS terminals on ships at sea and a VMeS gateway on shore. Ship monitoring systems at sea generally use AIS (Automatic Identification System), which is deployed in ports worldwide to monitor vessel status and prevent collisions between ships. In this research, a data format suitable for VMeS was designed so that it can be made interchangeable with AIS and thus read by AIS receivers, targeting vessels below 30 GT (gross tonnage). The VMeS data format was designed in three types: position data, vessel information data and short message data, which are made interchangeable with AIS message types 1, 4 and 8. Performance tests of the interchange system showed that as the message transmission period increases, total delay increases but packet loss decreases. When sending messages every 5 seconds at speeds of 0-40 km/h, 96.67% of the data were received correctly. Packets are lost when the received power level falls below -112 dBm. The longest range reached by the modem while moving was from the ITS informatics building, 530 meters from Laboratory B406, at a received power level of -110 dBm.
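
    The position-data interchange can be illustrated as below: a VMeS position report re-expressed in the integer units used by AIS position messages (latitude/longitude in 1/10000 of a minute, speed in 0.1-knot steps). This is a simplified sketch under our own assumptions, not a bit-level AIS encoder, and the field names are illustrative.

    ```python
    def vmes_to_ais_position(mmsi, lat_deg, lon_deg, speed_knots):
        """Map a hypothetical VMeS position report onto AIS message-type-1 fields."""
        if not (-90.0 <= lat_deg <= 90.0 and -180.0 <= lon_deg <= 180.0):
            raise ValueError("coordinates out of range")
        return {
            "msg_type": 1,                              # AIS position report
            "mmsi": mmsi,
            "lat": round(lat_deg * 600000),             # 1/10000-minute units
            "lon": round(lon_deg * 600000),
            "sog": min(round(speed_knots * 10), 1022),  # 0.1-knot units, capped
        }
    ```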

  14. [The design of smart data collection system for EHR].

    Science.gov (United States)

    Li, Xue-yi; Yan, Zhuang-zhi; Lv, Hai-jiao

    2009-11-01

    This paper presents the design of a smart data collection system that addresses drawbacks of EHRs such as time-consuming second-hand data entry and poor usability. The system comprises security certification, information management and data transmission modules. EHR data are transmitted between smart terminals and the server over WLAN. The system also provides a real-time communication platform for terminal users and medical care personnel. PMID:20352914

  15. Volunteer-based distributed traffic data collection system

    DEFF Research Database (Denmark)

    Balachandran, Katheepan; Broberg, Jacob Honoré; Revsbech, Kasper;

    2010-01-01

    An architecture for a traffic data collection system is proposed, which can collect data without having access to a backbone network. Contrary to other monitoring systems it relies on volunteers to install a program on their own computers, which will capture incoming and outgoing packets, group them into flows and send the flow data to a central server. Data can be used for studying and characterising internet traffic and for testing traffic models by regenerating real traffic. The architecture is designed to have efficient and light usage of resources on both client and server sides. Worst...
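
    The client-side flow grouping described above can be sketched minimally: captured packets are keyed by the usual 5-tuple and per-flow packet/byte counters are accumulated before being shipped to the central server. The field names are illustrative, not from the paper.

    ```python
    from collections import defaultdict

    def group_into_flows(packets):
        """Group captured packets into flows keyed by the 5-tuple."""
        flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
        for pkt in packets:
            key = (pkt["src_ip"], pkt["dst_ip"],
                   pkt["src_port"], pkt["dst_port"], pkt["proto"])
            flows[key]["packets"] += 1
            flows[key]["bytes"] += pkt["length"]
        return dict(flows)
    ```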

  16. Data Collection via Synthetic Aperture Radiometry towards Global System

    Directory of Open Access Journals (Sweden)

    Ali A. J. Al-Sabbagh

    2015-10-01

    Full Text Available Remote sensing is now widely accepted as an efficient approach to managing large volumes of data. In this paper, we present a future view of big data collection by synthetic aperture radiometry, a passive microwave remote sensing technique, as a step towards building a global monitoring system. Since raw collected data may have little value in themselves, they must be analyzed to extract valuable and beneficial information. Data collected by synthetic aperture radiometry constitute high-resolution earth observations and pose data-intensive problems; synthetic aperture radar, by comparison, is able to work in several bands: X, C, S, L and P-band. An important role of synthetic aperture radiometry is collecting data from areas with inadequate network infrastructure, where ground network facilities have been destroyed. The future concern is to establish a new global data management system, supported by groups of international teams working to develop the technology under international regulations. There is no doubt that existing techniques are too limited to solve big data problems completely, and much work remains towards improving 2-D and 3-D SAR to obtain better resolution.

  17. AUTOMATIC RECOGNITION OF PIPING SYSTEM FROM LARGE-SCALE TERRESTRIAL LASER SCAN DATA

    Directory of Open Access Journals (Sweden)

    K. Kawashima

    2012-09-01

    Full Text Available Recently, changes in plant equipment have become more frequent because of the short lifetime of products, and constructing 3D shape models of existing plants (as-built models) from large-scale laser-scanned data is expected to make rebuilding processes more efficient. However, laser-scanned data of an existing plant contain massive numbers of points, capture tangled objects and include a large amount of noise, so manual reconstruction of a 3D model is very time-consuming and costly. Piping systems in particular account for the greatest proportion of plant equipment. The purpose of this research was therefore to propose an algorithm that can automatically recognize a piping system from terrestrial laser scan data of plant equipment. The algorithm recognizes the straight portions of pipes, the connecting parts and the connection relationships of the piping system. Eigenvalue analysis of the point clouds and of their normal vectors enables the recognition. Because it uses only point clouds, the recognition algorithm can be applied to registered point clouds and performed in a fully automatic way. Preliminary recognition results for large-scale scanned data from an oil rig plant have shown the effectiveness of the algorithm.
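
    The eigenvalue test mentioned above can be illustrated generically: for a local point neighbourhood, the eigenvalues of the covariance matrix indicate whether the points form a line (one dominant eigenvalue), a plane (two), or a volume (three), and straight pipe runs show up as strongly linear neighbourhoods. This is a standard point-cloud measure, not the paper's exact algorithm, and the threshold is an assumption.

    ```python
    import numpy as np

    def linearity(points):
        """Return (l1 - l2) / l1 for an (N, 3) array, with l1 >= l2 >= l3
        the eigenvalues of the neighbourhood covariance matrix."""
        centered = points - points.mean(axis=0)
        cov = centered.T @ centered / len(points)
        l3, l2, l1 = np.linalg.eigvalsh(cov)   # ascending order
        return (l1 - l2) / l1

    def looks_like_pipe_axis(points, threshold=0.9):
        """Heuristic: a strongly linear neighbourhood may lie on a pipe axis."""
        return linearity(points) > threshold
    ```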

  18. System of automatic control over data Acquisition and Transmission to IGR NNC RK Data Center

    International Nuclear Information System (INIS)

    An automated system for seismic and acoustic data acquisition and transmission in real time was established in the Data Center of IGR NNC RK, where it functions very successfully. The system monitors the quality and volume of acquired information and also monitors the status of the system and communication channels. Statistical data on system operation are accumulated in a database created for this purpose. Information on system status is shown on the Center's Web page. (author)

  19. A speech recognition system for data collection in precision agriculture

    Science.gov (United States)

    Dux, David Lee

    Agricultural producers have shown interest in collecting detailed, accurate, and meaningful field data through field scouting, but scouting is labor intensive. They use yield monitor attachments to collect weed and other field data while driving equipment. However, distractions from using a keyboard or buttons while driving can lead to driving errors or missed data points. At Purdue University, researchers have developed an automatic speech recognition (ASR) system to allow equipment operators to collect georeferenced data while keeping hands and eyes on the machine during harvesting, and to ease georeferencing of data collected during scouting. A notebook computer retrieved locations from a GPS unit and displayed and stored data in Excel. A headset microphone with a single earphone collected spoken input while allowing the operator to hear outside sounds. One-, two-, or three-word commands activated appropriate VBA macros. Four speech recognition products were chosen based on hardware requirements and the ability to add new terms. After training, speech recognition accuracy was 100% for Kurzweil VoicePlus and Verbex Listen for the 132 vocabulary words tested, during tests walking outdoors or driving an ATV. Scouting tests were performed by carrying the system in a backpack while walking in soybean fields. The system recorded a point or a series of points with each utterance. Boundaries of points showed problem areas in the field, and single points marked rocks and field corners. Data were displayed as an Excel chart to show a real-time map as data were collected. The information was later displayed in a GIS over remotely sensed field images. Field corners and areas of poor stand matched, with voice data explaining anomalies in the image. The system was tested during soybean harvest by using voice to locate weed patches. A harvester operator with little computer experience marked points by voice when the harvester entered and exited weed patches or areas with poor crop stand. The operator found the
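
    The command-to-macro dispatch described above can be sketched as follows. The Purdue system used VBA macros in Excel; this Python sketch is a stand-in under our own assumptions, with illustrative command names and a pluggable GPS-fix callable.

    ```python
    class ScoutLogger:
        """Logs georeferenced observations; gps_fix returns (lat, lon)."""
        def __init__(self, gps_fix):
            self.gps_fix = gps_fix
            self.points = []

        def mark(self, label):
            lat, lon = self.gps_fix()
            self.points.append((lat, lon, label))

    def dispatch(command, logger):
        """Map a short recognized utterance to a logging action."""
        handlers = {
            "rock": lambda: logger.mark("rock"),
            "weed patch": lambda: logger.mark("weed patch"),
            "corner": lambda: logger.mark("field corner"),
        }
        action = handlers.get(command.lower())
        if action:
            action()
            return True
        return False
    ```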

  20. To the problem of topological optimization of data processing and transmission networks in development of the automatic control system ''Atom''

    International Nuclear Information System (INIS)

    Some optimization problems arising in the development of the automatic control system (ACS) of a commercial amalgamation (ACS-ATOM) are considered, in particular the assessment of the economically optimal structure for locating computation centres and data transmission facilities.

  1. Hardware enhancements to the laboratory data collection system

    International Nuclear Information System (INIS)

    The core of the laboratory data collection system consists of 12 commercial Tracor Northern 200 MHz ADC's interfaced to a DEC PDP 11/60 computer. The operation of this system has been described in detail in earlier annual reports and is summarized here. Enhancement of this data collection hardware consisted of redesigning and constructing a new chassis for the ADC computer interface. This was done to: (1) increase the reliability of the system, (2) incorporate all the changes that have occurred over the past four years, (3) provide a system that would allow in-rack diagnosis and board-swap repairs, and (4) provide extra room, connector capacity and mode-switch capacity for future expansion

  2. Dynamic Data Driven Applications Systems (DDDAS) modeling for automatic target recognition

    Science.gov (United States)

    Blasch, Erik; Seetharaman, Guna; Darema, Frederica

    2013-05-01

    The Dynamic Data Driven Applications System (DDDAS) concept uses applications modeling, mathematical algorithms, and measurement systems to work with dynamic systems. A dynamic system such as Automatic Target Recognition (ATR) is subject to sensor, target, and environment variations over space and time. We use the DDDAS concept to develop an ATR methodology for multiscale-multimodal analysis that seeks to integrate sensing, processing, and exploitation. In the analysis, we use computer vision techniques to explore the capabilities and analogies that DDDAS has with information fusion. The key attribute of coordination is the use of sensor management as a data-driven technique to improve performance. In addition, DDDAS supports the need for modeling, from which uncertainty and variations are used within the dynamic models for advanced performance. As an example, we use a Wide-Area Motion Imagery (WAMI) application to draw parallels and contrasts between ATR and DDDAS systems that warrant an integrated perspective. This elementary work is aimed at triggering a sequence of deeper insightful research towards exploiting sparsely sampled piecewise dense WAMI measurements - an application where the challenges of big data with regard to mathematical fusion relationships and high-performance computation remain significant and will persist. Dynamic data-driven adaptive computations are required to effectively handle the challenges of exponentially increasing data volume for advanced information fusion system solutions such as simultaneous target tracking and ATR.

  3. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  4. The data collection system for failure/maintenance at the Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    A data collection system for obtaining information that can be used to help determine the reliability and availability of future fusion power plants has been installed at the Los Alamos National Laboratory's Tritium Systems Test Assembly (TSTA). Failure and maintenance data on components of TSTA's tritium systems have been collected since 1984. The focus of the data collection has been TSTA's Tritium Waste Treatment System (TWT), which has maintained high availability since it became operational in 1982. Data collection is still in progress; a total of 291 failure reports are in the data collection system at this time, 47 of which are from the TWT. 6 refs., 2 figs., 2 tabs

  5. Development of teacher schedule automatic collection system based on Visual Basic

    Institute of Scientific and Technical Information of China (English)

    刘信香

    2012-01-01

    In this paper, according to practical needs, a teacher schedule automatic collection system was developed based on Visual Basic. The system can read schedule data automatically and, after collecting the data, automatically generate the combined schedule file. It can replace tedious, repetitive manual work and is efficient and convenient.

  6. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from underlying physics (and adapted to image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performances, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. ((orig.))

  8. Influence of vinasse on water movement in soil, using an automatic data acquisition and handling system

    International Nuclear Information System (INIS)

    Vinasse, a by-product of the ethyl alcohol industry from yeast fermentation of sugar cane juice or molasses, has been incorporated into soil as fertilizer because of its high content of organic matter (2-6%), potassium and sulphate (0.1-0.5%) and other nutrients. The influence of vinasse on water movement in the soil was studied by employing the monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi). For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode, coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author)

  9. Automatic Classification of the Vestibulo-Ocular Reflex Nystagmus: Integration of Data Clustering and System Identification.

    Science.gov (United States)

    Ranjbaran, Mina; Smith, Heather L H; Galiana, Henrietta L

    2016-04-01

    The vestibulo-ocular reflex (VOR) plays an important role in our daily activities by enabling us to fixate on objects during head movements. Modeling and identification of the VOR improves our insight into the system's behavior and improves diagnosis of various disorders. However, the switching nature of eye movements (nystagmus), including the VOR, makes dynamic analysis challenging. The first step in such analysis is to segment the data into its subsystem responses (here, slow and fast segment intervals). Misclassification of segments results in biased analysis of the system of interest. Here, we develop a novel three-step algorithm to classify VOR data into slow and fast intervals automatically. The proposed algorithm is initialized using a K-means clustering method. The initial classification is then refined using system identification approaches and prediction error statistics. The performance of the algorithm is evaluated on simulated and experimental data. It is shown that the new algorithm's performance is much improved over previous methods, in terms of higher specificity. PMID:26357393
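
    The K-means initialization step can be illustrated with a generic two-cluster K-means on absolute eye velocity, which yields a first slow/fast labelling to be refined later by system identification. This is a plain 1-D K-means sketch, not the authors' implementation.

    ```python
    def kmeans_two(values, iters=50):
        """Two-cluster 1-D K-means; returns (labels, centroids)."""
        lo, hi = min(values), max(values)
        c = [lo, hi]                       # initial centroids at the extremes
        labels = [0] * len(values)
        for _ in range(iters):
            labels = [0 if abs(v - c[0]) <= abs(v - c[1]) else 1
                      for v in values]
            for k in (0, 1):
                members = [v for v, l in zip(values, labels) if l == k]
                if members:
                    c[k] = sum(members) / len(members)
        return labels, c

    def label_segments(eye_velocity):
        """Label each sample 0 (slow phase) or 1 (fast phase) by |velocity|."""
        speeds = [abs(v) for v in eye_velocity]
        labels, _ = kmeans_two(speeds)
        return labels
    ```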

  10. The Processing of Image Data Collected by Light UAV Systems for GIS Data Capture and Updating

    OpenAIRE

    N. Yastikli; I. Bagci; C. Beser

    2013-01-01

    The collection and updating of 3D data is one of the important steps for GIS applications, which require fast and efficient data collection methods. Photogrammetry has been used for many years as a data collection method for GIS applications in larger areas. Unmanned Aerial Vehicle (UAV) systems have gained increasing attention in the geosciences in recent years for cost-effective data capture and updating at high spatial and temporal resolution. These autonomously flying UA...

  11. Parallel Plate System for Collecting Data Used to Determine Viscosity

    Science.gov (United States)

    Kaukler, William (Inventor); Ethridge, Edwin C. (Inventor)

    2013-01-01

    A parallel-plate system collects data used to determine viscosity. A first plate is coupled to a translator so that the first plate can be moved along a first direction. A second plate has a pendulum device coupled thereto such that the second plate is suspended above and parallel to the first plate. The pendulum device constrains movement of the second plate to a second direction that is aligned with the first direction and is substantially parallel thereto. A force measuring device is coupled to the second plate for measuring force along the second direction caused by movement of the second plate.

  12. An Automatic Data-Logging System for Meteorological Studies in Reactor Environments

    International Nuclear Information System (INIS)

    An automatic data-logging system has been designed for meteorological studies at the Tarapur power reactor. The system is designed to log data from 256 sensors divided into two groups of 128 each. The outputs from the sensors, in analog form varying from 0 to 100 mV, can be scanned sequentially. The scanning unit, used for time multiplexing, consists of a bank of 256 pairs of reed relays. It sequentially connects the outputs from the two groups of sensors to two chopper-modulated d.c. amplifiers. The output from each chopper-modulated d.c. amplifier varies from -4 to -10 V. A linear and highly stable A-D converter, connected alternately to the chopper-modulated d.c. amplifiers, digitizes the amplified outputs. The digitized data are stored in a ferrite core memory with a capacity of 256 5-digit words, handled in binary-coded decimal form. Each memory location corresponds to a particular input sensor. When sensor Mi is selected by the scanning unit, its digitized output is added to the data previously stored in the Mi-th memory location and the result is stored back in the same location. The sensor Mi+1 is selected next. The scanning unit scans all the sensors every second. At the end of 10 min the memory locations contain the averages of the outputs of all the sensors. These data are punched on paper tape in the next 2 min, and the sensors are scanned again after clearing the memory. The logical operations are controlled by a 100-kc/s crystal-controlled time clock. The data are fed to a digital computer for analysis. (author)
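
    The accumulate-then-average cycle described above can be sketched as follows: each one-second scan adds every sensor's digitized output to its memory slot, and at the end of the cycle each slot is divided by the scan count before the memory is cleared. The float arithmetic here is a deliberate simplification of the BCD ferrite-core hardware, and the class is illustrative.

    ```python
    class DataLogger:
        """Accumulates per-sensor readings and emits per-cycle averages."""
        def __init__(self, n_sensors=256):
            self.memory = [0.0] * n_sensors
            self.scans = 0

        def scan(self, readings):
            """Add one reading per sensor into its memory location."""
            for i, value in enumerate(readings):
                self.memory[i] += value
            self.scans += 1

        def dump_and_clear(self):
            """Return per-sensor averages and reset for the next cycle."""
            averages = [total / self.scans for total in self.memory]
            self.memory = [0.0] * len(self.memory)
            self.scans = 0
            return averages
    ```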

  13. CDC data systems collecting behavioral data on HIV counseling and testing.

    OpenAIRE

    Anderson, J E

    1996-01-01

    This paper describes two systems, the HIV Counseling and Testing Data System and the National Health Interview Survey, AIDS Knowledge and Attitudes Supplement, that collect behavioral information on HIV counseling and testing in the United States. Together these data sources provide valuable information for planning and evaluating counseling and testing programs. While these two systems are not designed primarily for behavioral research, they both collect behavioral data, including the behavi...

  14. Cyber security and data collection approaches for smartphone sensor systems

    Science.gov (United States)

    Turner, Hamilton; White, Jules

    2012-06-01

    In recent years the ubiquity and resources provided by smartphone devices have encouraged scientists to explore using these devices as remote sensing nodes. In addition, the United States Department of Defense has stated a mission of increasing the persistent intelligence, surveillance, and reconnaissance capabilities of U.S. units. This paper presents a method of enabling large-scale, long-term smartphone-powered data collection. Key solutions discussed include the ability to directly allow domain experts to define and refine smartphone applications for data collection, technical advancements that allow rapid dissemination of a smartphone data collection application, and an algorithm for preserving the locational privacy of participating users.

  15. An automatic locating and data logging system for radiological walkover surveys

    International Nuclear Information System (INIS)

    Oak Ridge National Laboratory has developed an Ultrasonic Ranging and Data System (USRADS) to track a radiation surveyor in the field, to log his instrument's readings automatically, and to provide tabular and graphical data display in the field or in the office. Once each second, USRADS computes the position of the radiation surveyor using the time-of-flight of an ultrasonic chirp, emitted by a transducer carried in a backpack, to stationary receivers deployed in the field. When the ultrasonic transducer is pulsed, a microprocessor in the backpack radios the start time and the survey instrument's reading to the master receiver at the base station (a van or truck). A portable computer connected to the master receiver plots the surveyor's position on the display and stores his position and instrument reading. The CHEMRAD Corporation has just completed a survey of the ORNL main plant area using two radiation survey instruments simultaneously: a ratemeter connected to a NaI crystal that is swung in an arc near the ground to look for surface contamination, and a small pressurized ionization chamber (PIC), attached to the backpack frame at a height of 3 ft, to measure the exposure rate. 3 refs., 5 figs
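
    The positioning idea can be illustrated with the standard linearization of the range equations: the chirp's time of flight to each stationary receiver gives a range, and three receivers determine the surveyor's 2-D position. The receiver layout and speed-of-sound value below are assumptions for the example, not USRADS parameters.

    ```python
    SPEED_OF_SOUND = 343.0  # m/s, dry air near 20 degC (assumed)

    def locate(receivers, flight_times):
        """Solve for (x, y) from three receiver positions and chirp flight times.

        Subtracting the first range equation (x - xi)^2 + (y - yi)^2 = ri^2
        from the other two yields a linear 2x2 system, solved by Cramer's rule.
        """
        (x1, y1), (x2, y2), (x3, y3) = receivers
        r1, r2, r3 = (SPEED_OF_SOUND * t for t in flight_times)
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21
        return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
    ```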

  16. GOES data-collection system instrumentation, installation, and maintenance manual

    Science.gov (United States)

    Blee, J.W.; Herlong, H.E.; Kaufmann, C.D., Jr.; Hardee, J.H.; Field, M.L.; Middelburg, R.F.

    1986-01-01

    The purpose of the manual is to describe the installation, operation, and maintenance of Geostationary Operational Environmental Satellite (GOES) data collection platforms (DCPs) and associated equipment. This manual is not a substitute for DCP manufacturers' manuals but provides additional material describing the application of data-collection platforms in the Water Resources Division. Power supplies, encoders, antennas, Mini Monitors, voltage analog devices, and their installation at streamflow-gaging stations are discussed in detail. (USGS)

  17. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    Science.gov (United States)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012, and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area; seven of these could be automatically detected. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.

  18. The Processing of Image Data Collected by Light UAV Systems for GIS Data Capture and Updating

    Science.gov (United States)

    Yastikli, N.; Bagci, I.; Beser, C.

    2013-10-01

    The collection and updating of 3D data is one of the important steps for GIS applications, which require fast and efficient data collection methods. Photogrammetry has been used for many years as a data collection method for GIS applications in larger areas. Unmanned Aerial Vehicle (UAV) systems have gained increasing attention in the geosciences in recent years for cost-effective data capture and updating at high spatial and temporal resolution. These autonomously flying UAV systems are usually equipped with different sensors such as a GPS receiver, microcomputers, gyroscopes and miniaturized sensor systems for navigation, positioning, and mapping purposes. UAV systems can be used for data collection for digital elevation model (DEM) and orthoimage generation in GIS applications over small areas. In this study, data collection and processing by a light UAV system are evaluated for GIS data capture and updating in small areas where traditional photogrammetry is not feasible. The main aim of this study is to design a low-cost light UAV system for GIS data capture and updating. The investigation was based on aerial images recorded during flights performed with the UAV system over the test site at the Davutpasa Campus of Yildiz Technical University, Istanbul. The quality of the DEM and orthoimages generated from the UAV flights is discussed with respect to GIS data capture and updating for small areas.

  19. AUTORAMP - an automatic unit for unattended aerosol collection, gamma-ray analysis and data transmission from remote locations

    International Nuclear Information System (INIS)

    The Environmental Measurements Laboratory has designed, developed and field tested a fully automated and completely unattended multisample surface-air monitoring system. This system, AUTORAMP, collects large-volume aerosol samples on discrete pleated cartridge filters, measures these samples in a near-ideal geometry with a refrigerator-cooled germanium gamma-ray detector, and immediately transmits the resulting spectra through a telephone/modem connection or a satellite link. Using a sample tray loaded with 31 filters, the system allows more than six months of unattended operation with weekly sampling, or one month of daily sampling. Remote control of all operating functions is possible through the communications link. For a 24-h collection at 12,000 m3 d-1 and an 18-h gamma-ray count, this system can detect as little as 2.7 μBq m-3 of the short-lived 140Ba and 5.4 μBq m-3 of 137Cs

  20. A preliminary study into performing routine tube output and automatic exposure control quality assurance using radiology information system data

    International Nuclear Information System (INIS)

    Data are currently being collected from hospital radiology information systems in the North West of the UK for the purposes of both clinical audit and patient dose audit. Could these data also be used to satisfy quality assurance (QA) requirements according to UK guidance? From 2008 to 2009, 731 653 records were submitted from eight hospitals in North West England. For automatic exposure control (AEC) QA, the protocol from the Institute of Physics and Engineering in Medicine (IPEM) report 91 recommends that the tube current-time product (mAs) be monitored for repeatability and reproducibility using a suitable phantom at 70-81 kV. Abdomen AP and chest PA examinations were analysed to find the most common kilovoltage used; these records were then used to plot the average monthly mAs over time. IPEM report 91 also recommends that a range of commonly used clinical settings be used to check output reproducibility and repeatability. For each tube, the dose-area product values were plotted over time for the two most common sets of exposure factors. The results show that it is possible to perform checks of AEC system performance; however, more work is required to be able to monitor tube output performance. Procedurally, the management system requires work, and the benefits to the workflow would need to be demonstrated. (authors)
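
    The monthly trend check can be sketched as follows: RIS exposure records for a fixed examination and kilovoltage are grouped by calendar month and the mean mAs computed, so drift in AEC behaviour shows up as a shift in the monthly average. The record fields are illustrative, not the actual RIS schema.

    ```python
    from collections import defaultdict

    def monthly_mean_mas(records, exam, kv):
        """records: dicts with 'date' (YYYY-MM-DD), 'exam', 'kv', 'mas'.
        Returns {YYYY-MM: mean mAs} for the chosen exam and kV setting."""
        buckets = defaultdict(list)
        for rec in records:
            if rec["exam"] == exam and rec["kv"] == kv:
                buckets[rec["date"][:7]].append(rec["mas"])
        return {month: sum(v) / len(v) for month, v in sorted(buckets.items())}
    ```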

  1. An automatic detection system for buried explosive hazards in FL-LWIR and FL-GPR data

    Science.gov (United States)

    Stone, K.; Keller, J. M.; Anderson, D. T.; Barclay, D. B.

    2012-06-01

    Improvements to an automatic detection system for locating buried explosive hazards in forward-looking long-wave infrared (FL-LWIR) imagery, as well as the system's application to detection in confidence maps and forward-looking ground-penetrating radar (FL-GPR) data, are discussed. The detection system, described in previous work, utilizes an ensemble of trainable size-contrast filters and the mean-shift algorithm in Universal Transverse Mercator (UTM) coordinates. Improvements to the raw detection algorithm include weighted mean-shift within the individual size-contrast filters and a secondary classification step, which extracts cell-structured image-space features, including local binary patterns (LBP), histogram of oriented gradients (HOG), edge histogram descriptor (EHD), and shape information based on maximally stable extremal regions (MSER) segmentation, from one or more looks, and classifies the resulting feature vector with a support vector machine (SVM). FL-LWIR-specific improvements include eliminating the need for multiple models to handle diurnal temperature variation. The improved algorithm is assessed on FL-LWIR and FL-GPR data from recent collections at a US Army test site.
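A minimal illustration of the size-contrast idea underlying the detection ensemble: the filter responds to the difference between an inner window and its surrounding ring. The window radii, centre and toy image are assumptions; the paper's filters are trainable and run as an ensemble.

```python
# Toy size-contrast response: mean of an inner window minus the mean of the
# surrounding ring of pixels.
def size_contrast(image, cy, cx, r_in, r_out):
    inner, ring = [], []
    for y in range(cy - r_out, cy + r_out + 1):
        for x in range(cx - r_out, cx + r_out + 1):
            if 0 <= y < len(image) and 0 <= x < len(image[0]):
                if abs(y - cy) <= r_in and abs(x - cx) <= r_in:
                    inner.append(image[y][x])
                else:
                    ring.append(image[y][x])
    return sum(inner) / len(inner) - sum(ring) / len(ring)

# A bright one-pixel "target" on a cold background gives a strong response.
img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 10.0
print(size_contrast(img, 2, 2, 0, 2))  # → 10.0
```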

  2. Data mining process automatization of air pollution data by the LISp-Miner system

    OpenAIRE

    Ochodnická, Zuzana

    2014-01-01

    This thesis is focused on the area of automated data mining. Its aims are to describe the area of automated data mining, to design a process for automatically creating data-mining tasks that verify given domain knowledge and search for new knowledge, and to implement verification of given domain knowledge of the attribute-dependency influence type, with search-space adjustments. The implementation language is the LMCL language that enables usage of the LISp-Mi...

  3. Collection and evaluation of salt mixing data with the real time data acquisition system

    International Nuclear Information System (INIS)

    A minicomputer-based real time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock-ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady state and transient monitoring and recording of up to 512 individual electrical resistance probes. Extensive real time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system, and includes circuit diagrams of all custom-built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  4. Collection and evaluation of salt mixing data with the real time data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, S.; Chiu, C.; Todreas, N.E.

    1977-09-01

    A minicomputer-based real time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock-ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady state and transient monitoring and recording of up to 512 individual electrical resistance probes. Extensive real time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system, and includes circuit diagrams of all custom-built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  5. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

    OpenAIRE

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-01-01

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to pr...

  6. JAPS: an automatic parallelizing system based on JAVA

    Institute of Scientific and Technical Information of China (English)

    杜建成; 陈道蓄; 谢立

    1999-01-01

    JAPS is an automatic parallelizing system based on JAVA running on NOW. It implements the automatic process from dependence analysis to parallel execution. The current version of JAPS can exploit functional parallelism and the detection of data parallelism will be incorporated in the new version, which is underway. The framework and key techniques of JAPS are presented. Specific topics discussed are task partitioning, summary information collection, data dependence analysis, pre-scheduling and dynamic scheduling, etc.

  7. Modeling and enhanced sampling of molecular systems with smooth and nonlinear data-driven collective variables

    Science.gov (United States)

    Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino

    2013-12-01

    Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.

  8. Towards a knowledge-based system to assist the Brazilian data-collecting system operation

    Science.gov (United States)

    Rodrigues, Valter; Simoni, P. O.; Oliveira, P. P. B.; Oliveira, C. A.; Nogueira, C. A. M.

    1988-01-01

    A study is reported which was carried out to show how a knowledge-based approach would lead to a flexible tool to assist the operation task in a satellite-based environmental data collection system. Some characteristics of a hypothesized system comprised of a satellite and a network of Interrogable Data Collecting Platforms (IDCPs) are pointed out. The Knowledge-Based Planning Assistant System (KBPAS) and some aspects about how knowledge is organized in the IDCP's domain are briefly described.

  9. Data Collection and Cost Modeling for Library Circulation Systems.

    Science.gov (United States)

    Bourne, Charles P.

    The objectives of the study leading to this report were to review, analyze and summarize published library cost data; and to develop a cost model and a methodology for reporting data in a more consistent and useful way. The cost model and reporting procedure were developed and tested on the circulation system of three libraries: a large university…

  10. Automatic Speaker Recognition System

    Directory of Open Access Journals (Sweden)

    Parul, R. B. Dubey

    2012-12-01

    Full Text Available Spoken language is used by humans to convey many types of information. Primarily, speech conveys messages via words. Owing to advances in speech technology, people's interactions with remote machines, such as phone banking, internet browsing, and secured information retrieval by voice, are becoming popular today. Speaker verification and speaker identification are important for authentication and verification for security purposes. Speaker identification methods can be divided into text-independent and text-dependent. Speaker recognition is the process of automatically recognizing a speaker's voice on the basis of individual information included in the input speech waves. It consists of comparing a speech signal from an unknown speaker to a set of stored data of known speakers; the process recognizes who has spoken by matching the input signal with pre-stored samples. This work focuses on improving the performance of speaker verification under noisy conditions.
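In the template-matching spirit described above, a toy identifier can score an unknown utterance's feature vector against each enrolled speaker's stored vector using cosine similarity. The vectors here are invented; a real system would use spectral features such as MFCCs.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(unknown, enrolled):
    # Return the enrolled speaker whose stored template is most similar.
    return max(enrolled, key=lambda name: cosine(unknown, enrolled[name]))

enrolled = {"alice": [1.0, 0.0, 0.0], "bob": [0.0, 1.0, 0.0]}
print(identify([0.9, 0.1, 0.0], enrolled))  # → alice
```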

  11. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two datasets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). In the second dataset, an accidental release of radioactivity into the environment was simulated in the south-western corner of the monitored area. The approach has a tendency to smooth the actual data values and therefore underestimates extreme values, as seen in the second dataset. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of the presented mapping approach and physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.
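To illustrate the smoothing behaviour noted above, here is inverse-distance weighting, a much simpler deterministic interpolator than universal kriging, but one that likewise pulls estimates toward neighbouring values and so underestimates extremes. The station coordinates and dose values are invented.

```python
# Inverse-distance weighting: a deliberately simpler stand-in for the
# universal-kriging predictor, shown only to illustrate weighted averaging.
def idw(stations, query, power=2.0):
    num = den = 0.0
    for (x, y), value in stations:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return value  # exact interpolation at a station location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Two invented dose-rate stations; midway between them we get their mean,
# so an isolated peak would be pulled down, as for the simulated release.
stations = [((0.0, 0.0), 1.0), ((2.0, 0.0), 3.0)]
print(idw(stations, (1.0, 0.0)))  # → 2.0
```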

  12. Automatic derivation of earth observation products from satellite data within the Siberian Earth System Science Cluster (SIB-ESS-C)

    Science.gov (United States)

    Eberle, J.; Schmullius, C. C.

    2011-12-01

    The Siberian Earth System Science Cluster (SIB-ESS-C), established at the University of Jena (Germany), is a spatial data infrastructure implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). It aims to provide researchers focusing on Siberia with the technical means for data discovery, data access, data publication and data analysis when working with earth observation data. At the current development stage, the SIB-ESS-C system comprises a federated metadata catalogue accessible through the SIB-ESS-C Web Portal or from any OGC-CSW-compliant client. The Web Portal also contains a simple map-like visualization component, which is currently being extended into a comprehensive visualization and analysis tool. The visualization component enables users to overlay different datasets found during a catalogue search. All data products are accessible as Web Mapping, Web Feature or Web Coverage Services, allowing users to incorporate the data directly into their applications. New developments aim at automatic registration and processing of raw earth observation data so that earth observation products can be derived continuously. A data registry within an overall processing system, including process chains that implement algorithms, is currently being designed. This will be extended with a system that processes incoming data automatically and continuously, depending on the registered algorithms: algorithms should know which input data they need, and registered data should know which algorithms can be executed on it. This paper describes current developments as well as future ideas for building useful and user-friendly access to satellite data, algorithms and derived products with state-of-the-art web technologies and OGC standards.

  13. Methods and procedures for automatic collection and management of data acquired from on-the-go sensors with application to on-the-go soil sensors

    OpenAIRE

    Peets, Sven; Mouazen, Abdul Mounem; Blackburn, Kim; Kuang, Boyan Y.; Wiebensohn, Jens

    2012-01-01

    Sensors for on-the-go collection of data on soil and crops have become essential for the successful implementation of precision agriculture. This paper analyses the potential of, and develops general procedures for, on-the-go data acquisition with soil sensors. The methods and procedures used to manage the data with respect to a farm management information system (FMIS) are described. The current data communication standard for tractors and machinery in agriculture is ISO 11783, which is r...

  14. Designing a Method for AN Automatic Earthquake Intensities Calculation System Based on Data Mining and On-Line Polls

    Science.gov (United States)

    Liendo Sanchez, A. K.; Rojas, R.

    2013-05-01

    Seismic intensities can be calculated using the Modified Mercalli Intensity (MMI) scale or the European Macroseismic Scale (EMS-98), among others, which are based on a series of qualitative aspects related to a group of subjective factors describing human perception, effects on nature or objects, and structural damage due to the occurrence of an earthquake. On-line polls allow experts to get an overview of the consequences of an earthquake without visiting the affected locations; however, this can be hard work if the polls are not properly automated. Given that the answers to these polls are subjective, and that a number of them have already been classified for past earthquakes, it is possible to use data mining techniques to automate this process and to obtain preliminary results based on the on-line polls. To achieve this goal, a predictive model has been built using a classifier based on a supervised learning technique, the decision-tree algorithm, and a group of polls based on the MMI and EMS-98 scales. The model summarizes the most important questions of the poll and recursively divides the instance space, with each node splitting the space according to the possible answers to one question. It was implemented with Weka, a collection of machine learning algorithms for data mining tasks, using the J48 algorithm, an implementation of the C4.5 decision-tree algorithm. In this way, a preliminary model was obtained that identifies up to 4 different seismic intensities, with 73% of polls correctly classified. The error is rather high; therefore, we will update the on-line poll in order to improve the results, based on just one scale, for instance the MMI. Furthermore, integrating this automatic seismic-intensity methodology, with a low error probability, with a basic georeferencing system will allow preliminary isoseismal maps to be generated.
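The tree traversal described above can be sketched directly: each internal node asks one poll question and each branch corresponds to a possible answer. The questions, answers and intensity labels below are hypothetical, not taken from the actual polls, and the tree is hand-built rather than learned by J48/C4.5.

```python
# A hand-built fragment of such a decision tree, traversed the way a
# J48/C4.5 model classifies a single poll response.
tree = {
    "question": "Did objects fall from shelves?",
    "branches": {
        "yes": "V",
        "no": {
            "question": "Was shaking felt indoors?",
            "branches": {"yes": "III", "no": "II"},
        },
    },
}

def classify(node, poll):
    # Each internal node splits on one question; leaves are MMI intensities.
    while isinstance(node, dict):
        node = node["branches"][poll[node["question"]]]
    return node

print(classify(tree, {"Did objects fall from shelves?": "yes"}))  # → V
```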

  15. AUTOMATIC DESIGNING OF POWER SUPPLY SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. I. Kirspou

    2016-01-01

    Full Text Available The development of an automatic design system for the power supply of industrial enterprises is considered in the paper. Its complete structure and principle of operation are established. A modern graphical interface and data schema have been developed, and the software has been fully implemented. The methodology and software meet the requirements of up-to-date design practice: they describe a general algorithm for the design process and also define the properties of the system's objects. The system is based on a modular principle and uses object-oriented programming. It makes it possible to carry out the design calculations of a power supply system consistently and to select the required equipment, with all calculations subsequently output in the form of an explanatory note. The system can be applied by design organizations in actual design work.

  16. Definition of an automatic information retrieval system independent from the data base used

    International Nuclear Information System (INIS)

    A bibliographic information retrieval system using data stored in the standardized interchange format ISO 2709 or ANSI Z39.2 is specified. A set of commands for interchange-format manipulation is used, which allows access to the data at the logical level and thus achieves data independence. A database description language, a storage structure and database manipulation commands are specified, using retrieval techniques that take the applications' needs into account. (Author)
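For concreteness, here is a minimal sketch of reading one ISO 2709 record (leader, directory, data fields). It ignores indicators, subfield delimiters and repeated tags, and the sample record is constructed by hand for illustration.

```python
FS, RT = "\x1e", "\x1d"  # field separator, record terminator

def parse_iso2709(record):
    # Minimal reader for one ISO 2709 record: a 24-character leader (base
    # address of data in positions 12-16), a directory of 12-character
    # entries (3-char tag, 4-char field length, 5-char start position),
    # then the data fields, each terminated by the field separator.
    base = int(record[12:17])
    directory = record[24:record.index(FS)]
    fields = {}
    for i in range(0, len(directory), 12):
        tag = directory[i:i + 3]
        length = int(directory[i + 3:i + 7])
        start = int(directory[i + 7:i + 12])
        # The field length includes the terminating separator, so drop it.
        fields[tag] = record[base + start:base + start + length - 1]
    return fields

# A hand-built single-field record: total length 44, base address 37.
record = ("00044" + " " * 7 + "00037" + " " * 7
          + "245" + "0006" + "00000" + FS + "Title" + FS + RT)
print(parse_iso2709(record))  # → {'245': 'Title'}
```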

  17. The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS)

    Energy Technology Data Exchange (ETDEWEB)

    Uncapher, W.L.; Arviso, M.

    1995-12-31

    Packages designed to transport radioactive materials are required to survive exposure to the environments defined in the Code of Federal Regulations. Cask designers can investigate package designs through structural and thermal testing of full-scale packages, components, or representative models. The acquisition of reliable response data from instrumentation measurement devices is an essential part of this testing activity. Sandia National Laboratories, under the sponsorship of the US Department of Energy (DOE), has developed the Mobile Instrumentation Data Acquisition System (MIDAS), dedicated to the collection and processing of structural and thermal data from regulatory tests.

  18. A portable wireless data collection system by using optical power supply and photo-communication

    International Nuclear Information System (INIS)

    Aiming at effective use of patrol-inspection data for annual trend management, a portable wireless measurement and data-collection device was developed that automatically measures vibration, temperature and other quantities in a short time while inspectors make their rounds, collecting sensor signals at many locations as electronic field data. The device comprises a sensor head mounted on the monitored apparatus to transmit sensor signals, and a sensor terminal carried by the inspector that receives and stores the signals from the sensor head. Its characteristic feature is fully wireless data collection using optical power supply and optical communication: power delivery to the sensor head and the transmission and reception of data are all performed optically. As a result, several benefits were realized: completely wireless data collection with a reduced maintenance burden, since no power or signal wiring needs to be installed; the ability to collect data quickly from a distance; and, because native waveform signals are obtained rather than conventional numerical data, the possibility of higher-order processing and of developing apparatus diagnostics such as the detection of early signs of abnormality. (G.K.)

  19. Automatic remote communication system

    International Nuclear Information System (INIS)

    The Upgraded RECOVER (Remote Continual Verification) system is a communication system for remote continual verification of the security and safeguards status of nuclear material in principal nuclear facilities. The system is composed of a command center and facility sub-systems. The command center is a mini-computer system that processes C/S (Containment and Surveillance) status data. Facility sub-systems consist of an OSM (On-site Multiplexer), an MU (Monitoring Unit) and C/S sensors. The system uses the public telephone network for communication between the command center and the facility sub-systems, and it encrypts communication data to prevent falsification and wiretapping by unauthorized persons. The system inherits the design principles of the RECOVER system previously tested by the IAEA, with upgraded and expanded capabilities. Development of this system began in 1983 and was completed in 1987. Performance tests have been carried out since 1987, showing fairly good results along with some indications of areas needing further improvement. The Upgraded RECOVER system provides timely information about the status of C/S systems, which could contribute to reducing inspection effort and improving cost performance. (author)

  20. Feasibility study for adding a demand failure data collection system to the Nuclear Plant Reliability Data System. Final report

    International Nuclear Information System (INIS)

    Southwest Research Institute (SwRI) is pleased to submit to Sandia National Laboratories this technical report as fulfillment of Task 5 of the proposal entitled A Feasibility Study for Adding a Duty Cycle Data Collection System to the Nuclear Plant Reliability Data System. The purpose of this report is to summarize the work as delineated in the proposal tasks and to recommend follow-on activities. Technical support for this work was provided by Duke Power Company (Duke), subcontractor to SwRI. The four tasks to be performed in conjunction with the Duty Cycle Data Collection Study (renamed in this report Demand Data Collection) were: define component population and measurable parameters; develop data collection and assessment methodologies; assess the impact on utilities; and assess the impact on NPRDS

  1. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  2. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive manua

  6. Data Analysis of Metering System and Automatic Regulation of Heat Supply to Five- and More Storey Housing Buildings

    Directory of Open Access Journals (Sweden)

    F. R. Latypov

    2014-06-01

    Full Text Available The paper considers the regulation of heat supply to housing buildings using automatic regulators. The integral index of automatic-regulator action on the heat-supply system of a multi-storey housing building is proportional to the Pomerantsev thermal similarity criterion.

  7. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (such as CIM or IEC61850) or otherwise structured databases. More measurements and data do not automatically improve decisions, but there is an opportunity to capitalize on this information for decision support. With suitable reasoning strategies, data can be contextualized and decision-relevant events can be promoted and identified. This paper presents an approach to link available structured power system data directly to a functional representation suitable for diagnostic reasoning. The translation method is applied to test cases that also illustrate decision support.

  8. Real time Aanderaa current meter data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.

    Aanderaa current meters are widely used for recording current speed and four other such parameters by deploying them over extended periods of time. Normally, data are recorded on magnetic tape and, after recovery of the current meters, the data are read...

  9. Progress on Statistical Learning Systems as Data Mining Tools for the Creation of Automatic Databases in Fusion Environments

    International Nuclear Information System (INIS)

    Fusion devices produce tens of thousands of discharges, but only a very limited part of the collected information is analysed. The analysis of physical events requires their identification and temporal location, as well as the generation of specialized databases relating to these time instants. The automatic determination of the precise time instants at which events happen, and the automatic search for potentially relevant time intervals, can be carried out with classification and regression techniques. These techniques have been used for the automatic creation of specialized databases for JET and have allowed the automatic determination of the disruptive/non-disruptive character of discharges. The validation of the recognition method has been carried out with 4400 JET discharges, with a global success rate of 99.02 per cent.

  10. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by Large Helical Device (LHD) experiments are supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist amongst the data; i.e., in many cases, the calculation of one quantity requires other data. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; and because the data are registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and their number can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
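The dependency-resolution step that AutoAna automates can be sketched with the standard-library topological sorter; the diagnostic names and the dependency map below are invented for illustration.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each derived quantity lists the data that
# must be calculated (or registered) before it.
deps = {
    "te_profile": {"thomson_raw"},
    "density": {"interferometer_raw"},
    "pressure": {"te_profile", "density"},
}

# static_order() yields a valid calculation order: every quantity appears
# after all of its dependencies, so nothing is computed before its inputs.
order = list(TopologicalSorter(deps).static_order())
print(order)
```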

  11. Computer Science technology applied to data collection and data managament

    OpenAIRE

    Guzmán, O.; González, M.; Carrasco, J. (Juan); Bernal, C.; Vera, C.; Troncoso, M.

    2009-01-01

    IFOP, as a non-profit marine research institute, has the mission of providing the Under-Secretariat of Fisheries in Chile with the technical information and scientific basis for the regulation of Chilean fisheries. For this purpose it has 150 scientific observers distributed along the Chilean coast. To improve the process of data production, a group of scientists has developed a new computer science system for data collection, data management, and automatic publication of f...

  12. 40 CFR 141.533 - What data must my system collect to calculate a disinfection profile?

    Science.gov (United States)

    2010-07-01

    Environmental Protection Agency (continued), Water Programs (continued), National Primary Drinking Water Regulations, Enhanced... Title 40, Protection of Environment, 2010-07-01. What data must my system collect to calculate a disinfection profile? Your system must monitor the...

  13. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-06-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA, has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight into carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed so that CDIAC can easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent
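The pre-ingest screening described above can be illustrated with a simple range-and-missing-value check; the variable names, plausibility limits and missing-value code below are assumptions for the sketch, not the actual CDIAC quality-control algorithms.

```python
# Hypothetical plausibility limits per variable (air temperature in deg C,
# CO2 flux in umol m-2 s-1) and a common missing-value code.
LIMITS = {"TA": (-50.0, 50.0), "FC": (-100.0, 100.0)}
MISSING = -9999.0

def qc_range(rows):
    # True = value passes the range check; False = flagged for review.
    flags = []
    for var, value in rows:
        lo, hi = LIMITS[var]
        flags.append(value != MISSING and lo <= value <= hi)
    return flags

print(qc_range([("TA", 21.5), ("TA", MISSING), ("FC", 250.0)]))
# → [True, False, False]
```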

  14. Special Data Collection System (SDCS) NTS Event 'Bulkhead', 27 April 1977. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Naylor, G.R.; Dawkins, M.S.

    1978-12-18

    This event report contains seismic data from the Special Data Collection System (SDCS), and other sources for the 'Bulkhead' event. The report also contains epicenter information from seismic observations.

  15. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.
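The model-selection idea behind AutoClass — scoring candidate classifications and letting the data choose the number of classes — can be sketched with a 1-D Gaussian mixture fitted by EM and scored by BIC. BIC is only a rough stand-in for the marginal-likelihood approximations AutoClass actually uses, and the data here are synthetic.

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(xs, k, iters=60):
    """Fit a k-component 1-D Gaussian mixture by EM; return final log-likelihood."""
    n = len(xs)
    srt = sorted(xs)
    mean = sum(xs) / n
    pvar = sum((x - mean) ** 2 for x in xs) / n
    mu = [srt[int((j + 0.5) * n / k)] for j in range(k)]  # quantile init
    var = [pvar] * k
    w = [1.0 / k] * k
    loglik = 0.0
    for _ in range(iters):
        loglik, resp = 0.0, []
        for x in xs:                      # E-step: responsibilities
            dens = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(dens)
            loglik += math.log(s)
            resp.append([d / s for d in dens])
        for j in range(k):                # M-step: weights, means, variances
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            mu[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var[j] = max(1e-6, sum(r[j] * (x - mu[j]) ** 2
                                   for r, x in zip(resp, xs)) / nj)
    return loglik

def best_k(xs, kmax=3):
    """Choose the number of classes by BIC (lower is better), a rough
    stand-in for the Bayesian criterion AutoClass approximates."""
    n = len(xs)
    bic = {}
    for k in range(1, kmax + 1):
        ll = em_gmm_1d(xs, k)
        bic[k] = -2 * ll + (3 * k - 1) * math.log(n)
    return min(bic, key=bic.get)

rng = random.Random(1)
data = ([rng.gauss(0.0, 1.0) for _ in range(150)]
        + [rng.gauss(8.0, 1.0) for _ in range(150)])
print(best_k(data))
```

On two well-separated clusters the BIC penalty rejects both the one-class underfit and the three-class overfit, so the chosen number of classes is 2.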

  16. XML-Based Automatic Test Data Generation

    OpenAIRE

    Halil Ibrahim Bulbul; Turgut Bakir

    2012-01-01

    Software engineering aims at increasing quality and reliability while decreasing the cost of the software. Testing is one of the most time-consuming phases of the software development lifecycle. Improvement in software testing results in decrease in cost and increase in quality of the software. Automation in software testing is one of the most popular ways of software cost reduction and reliability improvement. In our work we propose a system called XML-based automatic test data generation th...

  17. Mobile robot teleoperation system for plant inspection based on collecting and utilizing environment data

    International Nuclear Information System (INIS)

    This paper describes the development of a mobile robot teleoperation system for plant inspection. In our system, the robot is an agent for collecting environment data and is teleoperated by the operator using the accumulated environment data displayed on the operation interface. The robot is equipped with many sensors for detecting the state of the robot and of the environment. Such a redundant sensory system can also be used to collect working-environment data on-site while the robot is patrolling. The proposed system introduces a framework of collecting and utilizing environment data for adaptive plant inspection using the teleoperated robot. A view simulator, primarily intended to facilitate evaluation of visual sensors and algorithms, is extended as the Environment Server, the core technology of the digital maintenance field for plant inspection. In order to construct a detailed, seamless digital maintenance field, mobile robotic technology is used to supply environment data to the server. The sensory system on the robot collects environment data on-site, and the collected data are uploaded to the Environment Server to compile an accurate digital environment database. The robot operator can also use the accumulated environment data by referring to the Environment Server. In this paper, we explain the concept of our teleoperation system based on collecting and utilizing environment data. Using the developed system, inspection patrol experiments were attempted in a plant mock-up. Experimental results are shown using an omnidirectional mobile robot with a sensory system and the Environment Server. (author)

  18. Automatically controlled training systems

    International Nuclear Information System (INIS)

    This paper reports that a computer system for NPP personnel training was developed for training centers in the Soviet Union. The system should be considered as the first step in training, taking into account that further steps are to be devoted to part-task and full-scope simulator training. The training room consists of 8-12 IBM PC/AT personal computers combined into a network. A trainee accesses the system in a dialog manner. Software enables the instructor to determine the trainee's progress in different subjects of the program. The quality of any trainee's preparedness may be evaluated by the Knowledge Control operation. Simplified dynamic models are adopted for separate areas of the program; for example, the system of neutron flux monitoring has a dedicated model. Currently, training, requalification and support of the professional qualifications of nuclear power plant operators is being emphasized. A significant number of emergency situations during work occur due to operator errors. Based on data from September-October 1989, more than half of all unplanned drops in power and stoppages of power plants were due to operator error; as a comparison, problems due to equipment malfunction accounted for no more than a third of the total. The role of personnel, especially of the operators, is significant during normal operations, since energy production costs as well as losses are influenced by the capability of the staff. These facts all point to the importance of quality training of personnel

  19. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to get an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2-meter grid size, and river network extraction combining hydrographic criteria (topographic network) with hydrological criteria (flow-accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, together with the infrastructure to store (up to 40 Tb between results and intermediate files) and process them using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources were also important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production, and its advantages over traditional vector extraction systems.
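The hydrological-criteria step — a flow-accumulation river network derived from the terrain model — can be illustrated with a minimal D8 flow-accumulation sketch on a toy grid. This is the textbook algorithm, not IGN-ES's production code, and the tiny DEM is invented for illustration.

```python
def d8_flow_accumulation(dem):
    """D8 flow accumulation: each cell drains to its steepest-descent
    neighbour; accumulation counts the cell itself plus all upstream cells."""
    rows, cols = len(dem), len(dem[0])
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

    def receiver(r, c):
        best, best_slope = None, 0.0
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                dist = (dr * dr + dc * dc) ** 0.5   # diagonals are sqrt(2) away
                slope = (dem[r][c] - dem[rr][cc]) / dist
                if slope > best_slope:
                    best, best_slope = (rr, cc), slope
        return best  # None for pits and flat cells

    acc = [[1] * cols for _ in range(rows)]  # each cell contributes itself
    # Process cells from highest to lowest so upstream totals are final
    # before being passed downstream.
    order = sorted(((dem[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for _, r, c in order:
        rec = receiver(r, c)
        if rec:
            acc[rec[0]][rec[1]] += acc[r][c]
    return acc

dem = [[5, 4, 3, 2, 1]] * 3  # toy plane sloping east
acc = d8_flow_accumulation(dem)
print(acc[1])  # → [1, 2, 3, 4, 5]
```

Cells with large accumulation values are the candidate channel cells; thresholding them yields the raw river network.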

  20. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón, and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources in terms of having to consult the meter physically, import this information into Infor EAM by hand, and detect and correct the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main target I had when developing my solution was flexibility and scalability so as to make...

  1. Data acquisition system for TRIGA Mark I nuclear reactor and a proposal for its automatic operation

    International Nuclear Information System (INIS)

    The TRIGA IPR-R1 Nuclear Research Reactor, located at the Nuclear Technology Development Center (CDTN/CNEN) in Belo Horizonte, Brazil, has been operated for 44 years. During these years the main operational parameters were monitored by analog recorders and counters located in the reactor control console, and the most important operational parameters and data were registered in the reactor logbook by the reactor operators. This process is quite useful, but it can involve human errors, and it is impossible for the operators to take note of all process variables, especially during fast power-transient operations. A PC-based data acquisition system was developed for the reactor that allows on-line monitoring, through graphic interfaces, and shows the evolution of operational parameters to the operators. Some parameters that had never been measured on-line, like the thermal power and the coolant flow rate in the primary loop, are now monitored on the computer video monitor. The developed system allows measuring all parameters at frequencies up to 1 kHz. These data are also recorded in text files available for consultation and analysis. (author)
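One of the newly monitored quantities, thermal power, follows from a simple heat balance on the primary loop, Q = ṁ·cp·ΔT. The sketch below applies that relation with illustrative numbers, not actual IPR-R1 operating data.

```python
def thermal_power_kw(flow_kg_s, t_in_c, t_out_c, cp_kj_kg_k=4.18):
    """Heat-balance estimate of thermal power removed by the coolant:
    Q [kW] = m_dot [kg/s] * cp [kJ/(kg K)] * (T_out - T_in) [K].
    cp defaults to the specific heat of light water."""
    return flow_kg_s * cp_kj_kg_k * (t_out_c - t_in_c)

# Illustrative numbers only (not actual reactor operating data):
print(round(thermal_power_kw(6.0, 35.0, 39.0), 2))  # → 100.32
```

With the coolant flow rate and loop inlet/outlet temperatures sampled on-line, this single expression is all the software needs to display thermal power continuously.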

  2. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tuber. DH is a poisonous plant; scientific study has shown that its tubers contain a toxic alkaloid constituent, dioscorine, and the tubers can only be consumed after the poison is removed. In this experiment, the tubers need to be blended into powder form before being inserted into the machine basket. The user pushes the START button on the machine controller to switch the water pump ON, thereby creating a turbulent wave of water in the machine tank. The water stops automatically by triggering of the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. At this point, the controller automatically triggers the inlet solenoid valve and new water flows into the machine tank until it reaches the desired level, which is determined by an ultrasonic sensor. This process is repeated for 7 h, after which a positive result is achieved, shown to be significant according to several parameters of biological character: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters also show a positive result near or equal to that of the control water, and the assumption was made that the toxin is fully removed when the pH of the DH powder wash is near that of the control water. For the control water the pH is about 5.3, while the water from this experimental process is 6.0; before running the machine, the pH of the contaminated water is about 3.8, which is too acidic. This automated machine can save time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user. PMID:24783795
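The wash cycle described above — pump on for a 10-minute wash, drain the contaminated water, refill to the ultrasonic-sensor level, repeat for 7 h — can be sketched as a simple timed loop. Only the 10-minute wash time comes from the text; the drain and refill durations below are assumptions for illustration.

```python
def simulate_wash(total_min=7 * 60, wash_min=10, drain_min=2, refill_min=3):
    """Count complete wash cycles within a fixed run time.
    Only the 10-minute wash comes from the paper; the drain and refill
    durations are assumed values."""
    t, cycles = 0, 0
    while t + wash_min + drain_min + refill_min <= total_min:
        t += wash_min    # pump on: turbulence washes the tuber powder
        t += drain_min   # outlet solenoid valve drains toxin-laden water
        t += refill_min  # inlet valve refills to the ultrasonic-sensor level
        cycles += 1
    return cycles

print(simulate_wash())  # → 28
```

A real controller would drive the pump and valves from timer and level-sensor interrupts, but the cycle structure is the same.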

  3. Curating Virtual Data Collections

    Science.gov (United States)

    Lynnes, C.; Ramapriyan, H.; Leon, A.; Tsontos, V. M.; Liu, Z.; Shie, C. L.

    2015-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) contains a rich set of datasets and related services throughout its many elements. As a result, locating all the EOSDIS data and related resources relevant to a particular science theme can be daunting. This is largely because EOSDIS data's organizing principle is shaped more by the way the data are produced than by the expected end use. Virtual collections oriented around science themes can overcome this by presenting collections of data and related resources that are organized around the user's interest, not around the way the data were produced. Science themes can be: specific applications (uses) of the data, e.g., landslide prediction; geophysical events (e.g., Hurricane Sandy); or a specific science research problem. Virtual collections consist of annotated web addresses (URLs) that point to data and related resource addresses, thus avoiding the need to copy all of the relevant data to a single place. These URL addresses can be consumed by a variety of clients, ranging from basic URL downloaders (wget, curl) and web browsers to sophisticated data analysis programs such as the Integrated Data Viewer. Eligible resources include anything accessible via URL: data files (data file URLs); data subsets (OPeNDAP, webification or Web Coverage Service URLs); data visualizations (Web Map Service); data search results (OpenSearch Atom responses); and custom analysis workflows (e.g., a Giovanni analysis URL)

  4. Overview of the SOFIA Data Processing System: A generalized system for manual and automatic data processing at the SOFIA Science Center

    Science.gov (United States)

    Shuping, Ralph; Krzaczek, Robert; Vacca, William D.; Charcos-Llorens, Miguel; Reach, William T.; Alles, Rosemary; Clarke, Melanie; Melchiorri, Riccardo; Radomski, James T.; Shenoy, Sachindev S.; Sandel, David; Omelian, Eric

    2015-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprising a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. SOFIA is designed to execute observations at altitudes between 37,000 and 45,000 feet, above 99% of atmospheric water vapor. During routine operations, several instruments will be available to the astronomical community, including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. Once this post-processing is complete, the data can be used in scientific analysis and publications. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both automatic ("pipeline") and manual modes to process data from a variety of instruments. In this poster paper, we present an overview of the DPS concepts and architecture, as well as operational results from the first two SOFIA observing cycles (2013-2014).

  5. Automatic subject classification of textual documents using limited or no training data

    OpenAIRE

    Joorabchi, Arash

    2010-01-01

    With the explosive growth in the number of electronic documents available on the internet, intranets, and digital libraries, there is a greater need than ever for automatic systems capable of indexing and organising such large volumes of data. Automatic Text Classification (ATC) has become one of the principal means of enhancing the performance of information retrieval systems and organising digital libraries and other textual collections. Within this context, the use of ...

  6. Progress on Statistical Learning Systems as Data Mining Tools for the Creation of Automatic Databases in Fusion Environments

    International Nuclear Information System (INIS)

    The term 'Statistical Learning Systems' represents a wide set of methods to tackle specific problems related to classification (estimation of class decision boundaries), regression (determination of an unknown continuous function from noisy samples), and probability density estimation. Here, recent developments of learning systems in Fusion are reviewed. They have been focused on classification and regression problems as a specific way of creating ad-hoc databases of physical events. Classification and regression techniques have been used to determine the exact time instants in which events happen. In this way, databases around these times can be generated. Input data can be time-series data but also video-movies. Occasionally, the massive amount of data to be managed simultaneously forces the use of parallel computing techniques. Hybrid classification methods combining Support Vector Machines (SVM) and Bayesian statistics have been applied in JET for the automatic estimation of transition times between L/H and H/L confinement regimes. A universal regressor model based on SVM regression methods has been developed for the exact location of individual events. It has been applied to the JET database to determine the time instants of disruptions, ELMs and sawteeth activity. In addition, this regressor has been used with video-films to determine relevant temporal segments in JET discharges. Parallel codes have been put into operation to implement classification systems. The methods will be discussed and detailed examples will be given. This document is composed of an abstract followed by the presentation transparencies. (authors)
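The SVM-based locators described above need trained models and real waveforms. As a much simpler stand-in that conveys the same idea — pinpointing the sample at which a confinement transition occurs — the sketch below locates a mean shift in a synthetic signal by maximising the difference of left and right running means; this is explicitly not the authors' SVM regressor.

```python
def change_point(signal):
    """Locate the index where the signal's mean shifts, by maximising the
    gap between the left-segment and right-segment means (a simple
    stand-in for the SVM-based event locator described in the text)."""
    n = len(signal)
    total = sum(signal)
    best_i, best_score, left = 1, 0.0, 0.0
    for i in range(1, n):
        left += signal[i - 1]
        right = total - left
        score = abs(left / i - right / (n - i))
        if score > best_score:
            best_i, best_score = i, score
    return best_i

# Synthetic L->H-like transition: a step change at sample 60.
sig = [1.0] * 60 + [3.0] * 40
print(change_point(sig))  # → 60
```

Run over a sliding window of a full discharge, such a detector yields candidate event times around which database entries can be generated automatically.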

  7. An automatic measuring system of internal friction at low frequency

    International Nuclear Information System (INIS)

    An inverted torsion pendulum is automated by means of the Tectanel electronic system. Internal friction and the period of vibration are measured fully automatically as a function of temperature, and the data obtained are analysed with a computer. (Author)

  8. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    Science.gov (United States)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more applicable in open-pit mine slope safety monitoring. The Daye Iron Mine open-pit high-steep slope automatic monitoring system mainly consists of three modules, namely a GPS data processing module, a monitoring and warning module, and an emergency plans module. According to the rock mass structural features and the side slope stability evaluation, seven GPS deformation monitoring points were arranged on the scarp of Fault F9 at Daye Iron Mine; a combination of single-frequency static GPS receivers and data-transmission radio was adopted to carry out the observations, and the data processing mainly uses a three-transect interpolation method to solve the problems of discontinuity and effectiveness in the data series. Based on the displacement monitoring data from 1990 to 1996 of the Daye Iron Mine East Open Pit Shizi mountain Landslide A2, the displacement criterion, rate criterion, acceleration criterion, creep-curve tangent-angle criterion, etc., of landslide failure were studied. The results show that Landslide A2 is a collapse-type rock landslide whose movement has three phases, namely a creep stage, an accelerated stage, and a destruction stage. The failure criteria differ at different stages and at different positions, that is, at the rear, central, and front margins of the landslide. This has important guiding significance for putting forward a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation destruction with macroscopic evidence.
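The rate and acceleration criteria can be sketched by finite-differencing a displacement series and labelling each interval; the threshold and the data below are illustrative, not the Daye Iron Mine monitoring records.

```python
def stage_labels(displacement_mm, dt_days=1.0, accel_eps=0.01):
    """Label each interval 'creep' or 'accelerated' from finite-difference
    velocity and acceleration (the threshold accel_eps is illustrative)."""
    v = [(b - a) / dt_days for a, b in zip(displacement_mm, displacement_mm[1:])]
    a = [(v2 - v1) / dt_days for v1, v2 in zip(v, v[1:])]
    return ["accelerated" if x > accel_eps else "creep" for x in a]

# Synthetic record: steady creep, then accelerating movement toward failure.
disp = [0, 1, 2, 3, 4, 6, 9, 14, 22]
print(stage_labels(disp))
```

A warning module would escalate when consecutive intervals stay in the "accelerated" class, since sustained positive acceleration marks the transition out of the creep stage.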

  9. Data Collection Guidelines for Consistent Evaluation of Data from Verification and Monitoring Safeguard Systems

    International Nuclear Information System (INIS)

    One of the several activities the International Atomic Energy Agency (IAEA) inspectors perform in the verification process of Safeguards operations is the review and correlation of data from different sources. This process is often complex due to the different forms in which the data are presented. This paper describes some of the elements that are necessary to create a "standardized" structure for the verification of data. When properly collected and formatted, data can be analyzed with off-the-shelf software applications using customized macros to automate the commands for the desired analysis. The standardized data collection methodology is based on instrumentation guidelines as well as data structure elements, such as verifiable timing of data entry, automated data logging, identification codes, and others. The identification codes are used to associate data items with their sources and to correlate them with items from other data logging activities. The addition of predefined parameter ranges allows automated evaluation with the capability to provide a data summary, a cross-index of all data related to a specific event. Instances of actual databases are used as examples. The data collection guidelines described in this paper facilitate the use of data from a variety of instrumentation platforms and also allow the instrumentation itself to be more easily applied in subsequent monitoring applications
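The idea of identification codes plus predefined parameter ranges driving automated evaluation can be sketched as follows; the record layout, ID codes, and limits are hypothetical, not an actual safeguards schema.

```python
# Hypothetical parameter limits, following the paper's idea of predefined
# ranges attached to a standardized record structure.
LIMITS = {"dose_rate_usv_h": (0.0, 10.0), "temperature_c": (10.0, 40.0)}

def evaluate(records):
    """Flag out-of-range values and cross-index records by source ID code."""
    flagged, by_source = [], {}
    for rec in records:
        by_source.setdefault(rec["id_code"], []).append(rec)
        for key, (lo, hi) in LIMITS.items():
            if key in rec and not lo <= rec[key] <= hi:
                flagged.append((rec["id_code"], key, rec[key]))
    return flagged, by_source

recs = [
    {"id_code": "CAM-01", "dose_rate_usv_h": 2.5},
    {"id_code": "CAM-01", "dose_rate_usv_h": 12.0},
    {"id_code": "ENV-07", "temperature_c": 25.0},
]
flagged, by_source = evaluate(recs)
print(flagged)                    # → [('CAM-01', 'dose_rate_usv_h', 12.0)]
print(len(by_source["CAM-01"]))   # → 2
```

The `by_source` index is the cross-index the paper mentions: all records from one instrument can be pulled together for correlation with other data-logging activities.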

  10. Automatic bagout system

    International Nuclear Information System (INIS)

    Nuclear material entrained wastes are generated at the Plutonium Facility at Los Alamos National Laboratory. These wastes are removed from the glove box lines using the bagout method. This is a manual operation performed by technicians. An automated system is being developed to relieve the technicians from this task. The system will reduce the amount of accumulated radiation exposure to the worker. The primary components of the system consist of a six degree of freedom robot, a bag sealing device, and a small gantry robot. 1 ref., 5 figs

  11. 77 FR 39985 - Information Collection; Forest Industries and Residential Fuelwood and Post Data Collection Systems

    Science.gov (United States)

    2012-07-06

    ... Resources Planning Act of 1974 and the Forest and Rangeland Renewable Resources Research Act of 1978 require... addressed to: USDA, Forest Service, Attn: Ronald Piva, Northern Research Station, Forest Inventory and... Forest Service Information Collection; Forest Industries and Residential Fuelwood and Post...

  12. System for central monitoring, control, data acquisition, and automatic regulation for ARWQM

    OpenAIRE

    Marković Nataša Ž.; Džunić Jovana S.; Đorđević Đorđe R.; Gruber Günther

    2007-01-01

    This paper deals with the construction of a device for water quality monitoring on open water flows and the connection of devices on individual river basins into a central system for monitoring and supervision through a GPRS modem. The device was tested at the WWTP (Waste Water Treatment Plant) of the Čitluk coal mine, Sokobanja, and in the laboratory of the Faculty of Civil Engineering and Architecture, Niš.

  13. System for central monitoring, control, data acquisition, and automatic regulation for ARWQM

    Directory of Open Access Journals (Sweden)

    Marković Nataša Ž.

    2007-01-01

    This paper deals with the construction of a device for water quality monitoring on open water flows and the connection of devices on individual river basins into a central system for monitoring and supervision through a GPRS modem. The device was tested at the WWTP (Waste Water Treatment Plant) of the Čitluk coal mine, Sokobanja, and in the laboratory of the Faculty of Civil Engineering and Architecture, Niš.

  14. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemes accounting for geometric and semantic data has been largely motivated by the advances in mobile laser scanning technology over the last decade; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (CityGML) and mobile mapping technology to provide the features that compose the city model. The paper focuses on traffic signs, discussing their characterization using CityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the task of automatic detection and classification.

  15. System for Control,Data Collection and Processing in 8 mm Portable Microwave Radiometer—Scatterometer

    Institute of Scientific and Technical Information of China (English)

    李毅; 方振和; et al.

    2002-01-01

    In this paper we describe a system used for control, data collection, and data processing in an 8 mm portable microwave radiometer-scatterometer. We focus on the hardware and software design of the system, which is based on a PIC16F874 chip. The system has been successfully used in an 8 mm portable microwave radiometer-scatterometer; compared with other similar systems, its modularization, miniaturization, and intelligence are improved so as to meet portable instrument requirements.

  16. Learning Diagnostic Diagrams in Transport-Based Data-Collection Systems

    DEFF Research Database (Denmark)

    Tran, Vu The; Eklund, Peter; Cook, Chris

    2014-01-01

    Insights about service improvement in a transit network can be gained by studying transit service reliability. In this paper, a general procedure for constructing a transit service reliability diagnostic (Tsrd) diagram based on a Bayesian network is proposed to automatically build a behavioural model from Automatic Vehicle Location (AVL) and Automatic Passenger Counters (APC) data. Our purpose is to discover the variability of transit service attributes and their effects on traveller behaviour. A Tsrd diagram describes and helps to analyse factors affecting public transport by combining domain knowledge with statistical data.
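One building block of such a diagnostic diagram is a conditional probability table estimated from counts. The sketch below estimates P(delayed | peak-hour) from synthetic AVL-style observations; it is a single-table illustration, not the authors' Tsrd model, and the variable names are invented.

```python
from collections import Counter

def cpt(pairs):
    """Estimate P(delayed | peak) from (peak, delayed) observations --
    one conditional-probability-table entry of a diagnostic network."""
    joint, marg = Counter(), Counter()
    for peak, delayed in pairs:
        marg[peak] += 1
        if delayed:
            joint[peak] += 1
    return {k: joint[k] / marg[k] for k in marg}

# Synthetic AVL-style observations: (peak-hour flag, delayed flag).
obs = ([(True, True)] * 6 + [(True, False)] * 4
       + [(False, True)] * 2 + [(False, False)] * 8)
print(cpt(obs))  # → {True: 0.6, False: 0.2}
```

A full Bayesian network is a set of such tables, one per node, combined with the graph structure encoding the domain knowledge the abstract mentions.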

  17. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair B.

    2012-01-01

    This data collection recommendations report was written by Sandia National Laboratories to address the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. The report is intended to help develop a basic understanding of the data needed for reliability analysis from a Computerized Maintenance Management System (CMMS) and other data systems. It provides a rationale for why these data should be collected, a list of the data needed to support reliability and availability analysis, and specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and of analysis and reporting needs. The 'Motivation' section of this report provides a rationale for collecting and analyzing field data for reliability analysis. The benefits of this type of effort can include increased energy delivered, decreased operating costs, enhanced preventive maintenance schedules, solutions to issues with the largest payback, and identification of early failure indicators.
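Given CMMS work-order records of the recommended kind, basic reliability measures fall out directly. In the sketch below, the field names, the one-year analysis period, and the simple MTBF convention (uptime divided by corrective-failure count) are illustrative assumptions, not the report's prescribed schema.

```python
def reliability_summary(events, period_h=24 * 365.0):
    """Compute failure count, MTBF and availability from CMMS-style
    work-order records (hours; field names are illustrative)."""
    downtime = sum(e["downtime_h"] for e in events)
    failures = sum(1 for e in events if e["type"] == "corrective")
    uptime = period_h - downtime
    mtbf = uptime / failures if failures else float("inf")
    availability = uptime / period_h
    return failures, mtbf, availability

events = [
    {"type": "corrective", "downtime_h": 48.0},
    {"type": "preventive", "downtime_h": 12.0},
    {"type": "corrective", "downtime_h": 16.0},
]
f, mtbf, avail = reliability_summary(events)
print(f, round(mtbf, 1), round(avail, 3))  # → 2 4342.0 0.991
```

This is exactly the kind of automated analysis that becomes possible once downtime hours and work-order type are captured consistently in the CMMS, rather than buried in free-text comments.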

  18. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine reactor vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be able to be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment had not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal-processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed, and ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  19. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Nowadays, processing all the information in a fusion database is a much more important issue than acquiring more data. Although fusion devices typically produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time-consuming and unreliable activity. The development of automatic methods to create specialized databases ensures, first, the reduction of human effort to identify and locate physical events; second, the standardization of criteria (reducing the vulnerability to human errors); and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (which can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  20. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be able to be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment had not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal-processing hardware was analyzed. In this study, an ultrasonic signal processing system was designed, and ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  1. Development of a system for data collection and processing by telemetry

    International Nuclear Information System (INIS)

    The environmental impact of the nuclear industry is, obviously, a matter of the greatest concern. On that account, a large number of parameters must be recorded over long periods with a high level of confidence. The site selection of Brazilian nuclear power plants is conducted under this philosophy. Data acquisition of ocean-related parameters in remote, unexplored areas is rather demanding. In order to avoid a series of problems with data collection and processing, a telemetric system concept was developed. The electronic aspects of this system are mainly emphasized. For this purpose the system is split into two sub-systems: the former for data collection, signal conditioning and transmission, and the latter for signal reception and treatment. All parts of the system were tested in the laboratory before an integrated check, the corresponding results being encouraging. The whole equipment was installed one year ago at the seashore region of Peruibe, state of Sao Paulo, and has been operating adequately ever since. (Author)

  2. ON GEOMETRIC PROCESSING OF MULTI-TEMPORAL IMAGE DATA COLLECTED BY LIGHT UAV SYSTEMS

    Directory of Open Access Journals (Sweden)

    T. Rosnell

    2012-09-01

    Full Text Available Data collection under highly variable weather and illumination conditions around the year will be necessary in many applications of UAV imaging systems. This is a new feature in rigorous photogrammetric and remote sensing processing. We studied the performance of two georeferencing and point cloud generation approaches using image data sets collected in four seasons (winter, spring, summer and autumn) and under different imaging conditions (sunny, cloudy, different solar elevations). We used light quadrocopter UAVs equipped with consumer cameras. In general, matching of image blocks collected with high overlaps provided high-quality point clouds. All of the aforementioned factors influenced the point cloud quality. In wintertime, point cloud generation failed on uniform snow surfaces in many situations, and during the leaf-off season point cloud generation was not successful over deciduous trees. The images collected under cloudy conditions provided better point clouds than the images collected in sunny weather in shadowed regions and on tree surfaces. On homogeneous surfaces (e.g. asphalt) the images collected under sunny conditions outperformed cloudy data. The tested factors did not influence the general block adjustment results. The radiometric sensor performance (especially the signal-to-noise ratio) is a critical factor in all-weather data collection and point cloud generation; at the moment, high-quality, lightweight imaging sensors are still largely missing; sensitivity to wind is another potential limitation. There lies great potential in low-flying, low-cost UAVs, especially in applications requiring rapid aerial imaging for frequent monitoring.

  3. D3.2 Initial Specification of Data Collection and Analysis System

    DEFF Research Database (Denmark)

    Siksnys, Laurynas; Kaulakiene, Dalia; Pedersen, Torben Bach;

    2010-01-01

    … These flexible offers are collected, aggregated, and matched with production from RES. Afterwards, flexible offers are disaggregated and consumers/producers are informed when he or she should start energy consumption/production. We are estimating that in the future every household in Europe will be equipped with such technology and will be sending tens of flexible offers per day. In order to appropriately manage very large volumes of flexible offers, a reliable, distributed, and highly scalable computer system infrastructure is needed. Work Package 3 concerns data collection, aggregation and storage solutions for the MIRACLE system. This deliverable, which is a part of Work Package 3, specifies a data collection and management system, its components, and presents a methodology for flexible offers aggregation.

  4. Development of inspection data collection and evaluation system for large scale MOX fuel fabrication plant (2)

    International Nuclear Information System (INIS)

    The Inspection Data Collection and Evaluation System not only stores inspection data and operator declaration data collected from the various measurement equipment installed in the fuel fabrication processes of the large-scale MOX fuel fabrication plant, but also performs safeguards evaluation based on Near Real Time Accountancy (NRTA). The Nuclear Material Control Center developed a simulator to model the fuel fabrication process and generate measurement data by simulating in-process material. A verification evaluation system built into the simulator calculates various statistics and conducts NRTA statistical tests in order to verify the adequacy of material accountancy for the fabrication process. We are currently investigating the influence of process elements, such as the amount of unmeasured material, on the verification results; evaluating the availability of verification conditions under the current safeguards approach; and examining detection capability with respect to the various elements of material accountancy. We explain the simulated detection capabilities together with the variables that influence the results. (author)
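A standard ingredient of NRTA statistical testing is a sequential test on the stream of material-balance (MUF) values; a minimal Page-type CUSUM sketch is shown below. The allowance k, threshold h, and the synthetic MUF sequences are illustrative assumptions, not the project's actual test settings.

```python
import numpy as np

def cusum_alarm(muf, sigma, k=0.5, h=5.0):
    """One-sided Page CUSUM on standardized MUF values.

    k is the allowance (in sigma units), h the decision threshold.
    Returns the index of the first alarm, or None if no alarm is raised.
    """
    s = 0.0
    for i, m in enumerate(muf):
        s = max(0.0, s + m / sigma - k)
        if s > h:
            return i
    return None

# Balanced accountancy: small fluctuating MUF, no alarm expected.
clean = 0.4 * np.sin(np.arange(30.0))
# Persistent 1-sigma loss added on top: the CUSUM should trip quickly.
diverted = clean + 1.0

no_alarm = cusum_alarm(clean, sigma=1.0)
alarm_at = cusum_alarm(diverted, sigma=1.0)
```

A single large balance period or a slow persistent loss both drive the statistic upward, which is why CUSUM-style tests are popular for detecting protracted diversion.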

  5. JIGSAW: Acquisition, Display and Analysis system designed to collect data from Multiple Gamma-Ray detectors

    International Nuclear Information System (INIS)

    In this paper, the authors report on work performed to date on JIGSAW, a self-contained data acquisition, display and analysis system designed to collect data from multiple gamma-ray detectors. The data acquisition system utilizes commercially available VMEbus and NIM hardware modules and the VMEexec real-time operating system. A Unix-based software package, written in ANSI standard C with X11 graphics routines, allows the user to view the acquired spectra. Analysis of the histograms can be performed in the background during the run with the ROBFIT suite of curve-fitting routines.

  6. Automatic recovery from resource exhaustion exceptions by collecting leaked resources

    Institute of Scientific and Technical Information of China (English)

    Zi-ying DAI; Xiao-guang MAO; Li-qian CHEN; Yan LEI

    2014-01-01

    Despite the availability of garbage collectors, programmers must manually manage non-memory finite system resources such as file descriptors. Resource leaks can gradually consume all available resources and cause programs to raise resource exhaustion exceptions. However, programmers commonly provide no effective recovery approach for resource exhaustion exceptions, which often causes programs to halt without completing their tasks. In this paper, we propose to automatically recover programs from resource exhaustion exceptions caused by resource leaks. We transform programs to catch resource exhaustion exceptions, collect leaked resources, and then retry the failing code. A resource collector is designed to identify leaked resources and safely release them. We implement our approach for Java programs. Experimental results show that our approach can successfully handle resource exhaustion exceptions caused by reported resource leaks and allow programs to complete their tasks with an average execution time increase of 2.52% and negligible bytecode size increase.
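The recover-by-collecting idea can be sketched in a few lines: wrap the failing action, catch the exhaustion exception, ask a registry to close whatever was leaked, and retry. The original work instruments Java bytecode and uses a liveness analysis to find dead resources; the registry, the OSError, and the demo below are simplified stand-ins.

```python
import tempfile

class ResourceRegistry:
    """Tracks acquired file handles so leaked ones can be reclaimed on demand."""
    def __init__(self):
        self._open = []

    def acquire(self, path):
        f = open(path)
        self._open.append(f)
        return f

    def collect_leaked(self):
        """Close every handle the program failed to close; return the count.
        (Stand-in for the paper's analysis of which resources are truly dead.)"""
        n = sum(1 for f in self._open if not f.closed)
        for f in self._open:
            if not f.closed:
                f.close()
        self._open.clear()
        return n

def with_recovery(action, registry, retries=1):
    """Catch a resource-exhaustion error, collect leaks, and retry the action."""
    for attempt in range(retries + 1):
        try:
            return action()
        except OSError:                       # e.g. "too many open files"
            if attempt == retries or registry.collect_leaked() == 0:
                raise

# Demo: one leaked handle; the first attempt simulates descriptor exhaustion.
reg = ResourceRegistry()
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
leaked = reg.acquire(tmp.name)                # opened but never closed

attempts = []
def action():
    attempts.append(1)
    if len(attempts) == 1:
        raise OSError("too many open files")  # simulated exhaustion
    return "task completed"

result = with_recovery(action, reg)
```

The key design point mirrors the paper: recovery only retries when the collector actually freed something, so genuine (non-leak) exhaustion still propagates to the caller.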

  7. Analysis of ICPP process-monitoring-system data collected during August-October, 1981

    International Nuclear Information System (INIS)

    As part of the FY-1982 advanced safeguards development program, selected process data collected during August-October 1981 by the ICPP Process Monitoring Computer System (PMCS) were analyzed. This analysis is the first major effort of its kind using data from this VAX 11/780-based system. These data were from the first, second, and third processing cycles. Several process events were identified and isolated for analysis to conserve limited program resources. These included process input (G-Cell) batch transfers, continuous first-cycle feed activities, transfers into N-Cell intercycle storage, and continuous second-cycle feed activities. The analyses principally used Scanivalve plant precision data from tank bubbler probes, temperature data, and plant digital data. Some useful assessments of the process data are given, but they should be considered preliminary since not all of the collected data could be analyzed. Several data limitations are also noted and recommendations are given for system improvements. This analysis effort is believed to demonstrate the potential utility of the system for improved safeguards applications; however, further similar analysis efforts are needed to complete the demonstration and to characterize ICPP process data in general

  8. Research on Chinese Antarctic Data Directory System I——Collecting, processing, examining and submitting data directory

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the general framework of the ADDS (Antarctic Data Directory System) established by the SCAR-COMNAP ad hoc Planning Group on Antarctic data management, the CN-ADDS (Chinese Antarctic Data Directory System) project is under way; its research and activities keep to the methods and techniques available in ADDS development while allowing for the specific status of Antarctic data management in China. At present, authoring and submitting timely Antarctic data directories in China is one of the key issues that must be dealt with. This paper studies the technical procedure for collecting, processing, examining and submitting data directories. In addition, it discusses the efficient collection of data directories, which needs both administrative and technical support

  9. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.

    2009-09-01

    This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment, and is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. Written by Sandia National Laboratories, the report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
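The kind of analysis such CMMS data feeds can be illustrated with a few work-order records: from failure and restore timestamps over an observation window one can derive downtime, MTTR, MTBF, and availability. The record schema, timestamps, and window below are hypothetical.

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Hypothetical CMMS work orders for one turbine: (failure time, restored time).
work_orders = [
    ("2009-01-10 04:00", "2009-01-10 16:00"),
    ("2009-03-02 08:00", "2009-03-03 08:00"),
    ("2009-06-15 00:00", "2009-06-15 06:00"),
]
window_hours = 4380.0  # observation window: first half of the year

down_hours = sum(
    (datetime.strptime(up, FMT) - datetime.strptime(dn, FMT)).total_seconds() / 3600
    for dn, up in work_orders
)
n_failures = len(work_orders)
mttr = down_hours / n_failures                    # mean time to repair
mtbf = (window_hours - down_hours) / n_failures   # mean operating time between failures
availability = (window_hours - down_hours) / window_hours
```

This is exactly why the report stresses consistent timestamps and event coding in the CMMS: without machine-readable failure/restore pairs, none of these metrics can be computed automatically.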

  10. Automatic continuous monitoring system for dangerous sites and cargoes

    International Nuclear Information System (INIS)

    The problems of creation of automatic comprehensive continuous monitoring system for nuclear and radiation sites and cargoes of Rosatom Corporation, which carries out data collecting, processing, storage and transmission, including informational support to decision-making, as well as support to modelling and forecasting functions, are considered. The system includes components of two levels: site and industry. Currently the system is used to monitor over 8000 integrated parameters, which characterise the status of nuclear and radiation safety on Rosatom sites, environmental and fire safety

  11. Fault injection system for automatic testing system

    Institute of Scientific and Technical Information of China (English)

    王胜文; 洪炳熔

    2003-01-01

    Considering the deficiency of means for confirming the attribution of fault redundancy in research on Automatic Testing Systems (ATS), a fault-injection system has been proposed to study the fault redundancy of an automatic testing system through comparison. By means of a fault-embedded environmental simulation, faults are injected at the input level of the software under test. These faults may induce inherent failure modes, thus bringing about unexpected outputs, and the anticipated goal of the test is attained. The fault injector consists of a specially developed voltage signal generator, current signal generator and rear drive circuit, and the ATS can work normally by means of software simulation. The experimental results indicate that the fault-injection system can find deficiencies in the automatic testing software and identify the preference of fault redundancy. Moreover, some software deficiencies never exposed before can be identified by analyzing the testing results.

  12. CHLOE: a system for the automatic handling of spark pictures

    International Nuclear Information System (INIS)

    The system for automatic data handling uses commercially available or state-of-the-art components. The system is flexible enough to accept information from various types of experiments involving photographic data acquisition

  13. Automatic multi-modal intelligent seizure acquisition (MISA) system for detection of motor seizures from electromyographic data and motion data

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sándor; Wolf, Peter;

    2012-01-01

    … measures of reconstructed sub-bands from the discrete wavelet transformation (DWT) and the wavelet packet transformation (WPT). Based on the extracted features, all data segments were classified using a support vector machine (SVM) algorithm as simulated seizure or normal activity. A case study of the …
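The feature idea, sub-band energies separating seizure-like high-frequency activity from normal movement, can be sketched with a plain Haar DWT. The study itself used DWT/WPT sub-band reconstructions feeding an SVM; the numpy-only decomposition and the synthetic "calm" and "jerky" signals below are illustrative assumptions.

```python
import numpy as np

def haar_dwt(x):
    """One Haar DWT level: (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def subband_energies(x, levels=3):
    """Energy of each detail sub-band plus the final approximation."""
    feats = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.append(np.sum(d ** 2))
    feats.append(np.sum(a ** 2))
    return np.array(feats)

t = np.linspace(0.0, 1.0, 256, endpoint=False)
calm = np.sin(2 * np.pi * 2 * t)                  # slow, low-frequency trace
jerky = calm + np.sign(np.sin(2 * np.pi * 40 * t))  # high-frequency bursts on top

f_calm = subband_energies(calm)
f_jerky = subband_energies(jerky)
```

The first entries of the feature vector correspond to the finest (highest-frequency) detail bands, which is where motor-seizure-like activity concentrates its energy relative to normal movement.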

  14. Automatic force balance calibration system

    Science.gov (United States)

    Ferris, Alice T.

    1995-05-01

    A system for automatically calibrating force balances is provided. The invention uses a reference balance aligned with the balance being calibrated to provide superior accuracy while minimizing the time required to complete the calibration. The reference balance and the test balance are rigidly attached together with closely aligned moment centers. Loads placed on the system affect each balance equally, and the differences in the readings of the two balances can be used to generate the calibration matrix for the test balance. Since the accuracy of the test calibration is determined by the accuracy of the reference balance, and current technology allows reference balances to be calibrated to within +/-0.05%, the entire system has an accuracy of +/-0.2%. The entire apparatus is relatively small and can be mounted on a movable base for easy transport between test locations. The system can also accept a wide variety of reference balances, thus allowing calibration under diverse load and size requirements.

  15. The Earth Phenomena Observation System (EPOS): A Coordination Manager for Enhanced Hurricane Data Collection

    Science.gov (United States)

    Abramson, M.; Kolitz, S.; Adams, M.; Carter, D.; Robinson, E.

    2012-12-01

    The Earth Phenomena Observation System (EPOS) has been developed over the past decade to manage current and future visions of a Sensor Web of Earth Science data collection missions. The initial solution we developed was a centralized coordination approach to tasking for a selected set of existing and notional satellites and unmanned vehicles (air and surface ship) to dynamically respond to transient phenomena that have a significant impact on human life such as hurricanes and wildfires. More recently we extended this work into the EPOS Coordination Manager that coordinates asynchronous distributed dynamic replanning of multiple single-mission systems. This approach is more conducive to coordinating current and planned Earth observation collection systems which are owned by separate organizations and tend to be controlled independently of each other. We compute a complex yet intuitive value function that enables us to tractably solve a series of optimization problems that allocate requests to the single mission planners, or "sub-planners." We consider requests with time windows and priority levels, some of which require simultaneous observations by different sensors. Currently we are enhancing the EPOS Coordination Manager to aid science data collection for current NASA missions, including the Hurricane and Severe Storm Sentinel (HS3), the Airborne Tropical TRopopause EXperiment (ATTREX), and the Earth Observing Mission 1 (EO-1). For the last of these missions, EO-1, we have contributed to EO-1 operations through several planning and scheduling tools we developed over the past several years. Given a current set of collection plans by EO-1 and missions that contribute to hurricane data collection such as NASA's Hurricane and Severe Storm Sentinel (HS3), GOES (Geostationary Operational Environmental Satellite), Aqua, Terra, Cloudsat, Calipso, TRMM (Tropical Rainfall Measuring Mission), and NPP (National Polar-orbiting Partnership), EPOS will address enhanced data
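The coordination problem EPOS solves, allocating prioritized, time-windowed observation requests across independently controlled mission planners, can be caricatured with a greedy sketch. The real Coordination Manager optimizes a richer value function over asynchronous sub-planners; the request names, interval-conflict rule, and priority ordering below are invented for illustration.

```python
def allocate(requests, planners):
    """Greedy allocation: highest-priority requests first, each to the first
    planner whose schedule has no overlap with the request's time window."""
    # requests: (name, priority, start, end); planners: {name: [(start, end), ...]}
    assignment = {}
    for name, prio, s, e in sorted(requests, key=lambda r: -r[1]):
        for p, busy in planners.items():
            if all(e <= bs or s >= be for bs, be in busy):
                busy.append((s, e))
                assignment[name] = p
                break
    return assignment

requests = [
    ("hurricane_eye", 3, 10, 12),   # highest priority
    ("coastal_flood", 2, 11, 13),
    ("routine_survey", 1, 10, 14),  # too long to fit once the others are placed
]
planners = {"sat_A": [], "uav_B": []}
plan = allocate(requests, planners)
```

Even this toy version shows the central trade-off: once high-priority windows are committed, low-priority requests spanning the same interval go unserved, which is what a value-function-based coordinator negotiates more gracefully.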

  16. Development of Inspection Data Collection and Evaluation System (IDES) for J-MOX (1)

    International Nuclear Information System (INIS)

    The 'Inspection Data Collection and Evaluation System' stores inspection data and operator declaration data collected from the various measurement equipment installed in the fuel fabrication processes of the large-scale MOX fuel fabrication plant, and performs safeguards evaluation using these data. The Nuclear Material Control Center is now developing this system under a project commissioned by JSGO. By last fiscal year, we had developed a simulator that models the fuel fabrication process and generates data simulating in-process material inventory/flow and their measurement. In addition, we developed a verification evaluation system to calculate various statistics from the simulation data and conduct statistical tests such as NRTA in order to verify the adequacy of material accountancy for the fabrication process. We are currently investigating the adequacy of the evaluation itself and the effects on the evaluation of changing various process factors, including unmeasured inventories, as well as the adequacy of the current safeguards approach. In the presentation, we explain the developed system configuration and the calculation method of the simulation, and demonstrate some examples of the simulated results on material flow in the fabrication process together with a part of the analytical results. (author)

  17. Real-time environmental radiation monitoring system with automatic restoration of backup data in site detector via communication using radio frequency

    International Nuclear Information System (INIS)

    An environmental radiation monitoring system based on a high-pressure ionization chamber has been used for on-line gamma monitoring around KAERI (Korea Atomic Energy Research Institute); it transmits the dose data measured by the on-site ion chamber via radio frequency to a central processing computer, which stores the transmitted real-time data. Although communication using radio frequency has several advantages, such as effective and economical transmission, storage, and data processing, its main disadvantage is that data loss during transmission often happens because of unexpected communication problems. It is possible to restore the lost data off-line, e.g. by floppy disk, but the simultaneous processing and display of the current data as well as the backup data are very difficult in the present on-line system. In this work, a new electronic circuit board and operation software applicable to the conventional environmental radiation monitoring system are developed, and automatic synchronization of the ion chamber unit and the central processing computer is carried out every day. This system is able to automatically restore the backup data within 34 hours without additional equipment and also display the current data together with the transmitted backup data after checking the time flag
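The restore logic amounts to detecting gaps in the received time series and merging the site unit's stored records over them. A minimal sketch follows; the timestamps, 10-unit sampling step, and dose values are invented.

```python
def missing_intervals(record, t0, t1, step):
    """Timestamps expected in [t0, t1) but absent from the received record."""
    return [t for t in range(t0, t1, step) if t not in record]

def merge_backup(current, backup):
    """Fill gaps in the received dose-rate record from the detector's on-site
    backup, keeping live values wherever both exist."""
    merged = dict(backup)
    merged.update(current)        # live data wins on overlap
    return dict(sorted(merged.items()))

current = {0: 0.11, 10: 0.12, 40: 0.10}   # RF dropout between t=10 and t=40
backup = {20: 0.13, 30: 0.12}             # stored in the site detector unit

gaps = missing_intervals(current, 0, 50, 10)
restored = merge_backup(current, backup)
```

Checking a time flag before merging, as the paper describes, corresponds here to keying both records on the same timestamp grid so backfilled and live samples interleave correctly.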

  18. A Research on the Application of Automatic Essay Scoring System to University’s English Writing Education in the Era of Big Data: Taking Pigaiwang as an Example

    OpenAIRE

    Liu, Ying(College of Nuclear Science and Technology, Beijing Normal University, 100875, Beijing, China)

    2015-01-01

    …fields of life, such as business, behavior analysis, education and so on, people in the world are facing changes from stem to stern. In China, Pigaiwang is one of the most popular online automatic essay scoring systems among university students, based on big data and cloud services (Zhang, 2013). How to deal with these newly sprouted things is a big challenge for all concerned, and implementing a reform in university English writing education is also unavoidable. Therefore ...

  19. Implementation plan for automatic data processing equipment as part of the DYMAC advanced accountability system. Addendum 3 to applications of advanced accountability concepts in mixed oxide fabrication

    International Nuclear Information System (INIS)

    The Phase I study of the application of advanced accountability methods (DYMAC) in a uranium/plutonium mixed oxide facility was extended to include an implementation plan for the Automatic Data Processing System, as required by ERDA Manual Appendix 1830. The proposed system consists of a dual-control computer system with a minimum complement of peripheral equipment, which will be interfaced to the necessary measuring and display devices. Technical specifications for hardware and software system requirements are included, and cost estimates based on these specifications have been obtained

  20. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. Since, however, the conventional fixed-type robot manipulator is very huge, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed the reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on the reactor vessel mockup, and on the real reactor ready for Ulchin nuclear power plant unit 6 at Doosan Heavy Industries in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction of the inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution worth more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied to the automation of similar systems in nuclear power plants

  1. ANPS - AUTOMATIC NETWORK PROGRAMMING SYSTEM

    Science.gov (United States)

    Schroer, B. J.

    1994-01-01

    Development of some of the space program's large simulation projects -- like the project which involves simulating the countdown sequence prior to spacecraft liftoff -- requires the support of automated tools and techniques. The number of preconditions which must be met for a successful spacecraft launch and the complexity of their interrelationships account for the difficulty of creating an accurate model of the countdown sequence. Researchers developed ANPS for the NASA Marshall Space Flight Center to assist programmers attempting to model the pre-launch countdown sequence. Incorporating the elements of automatic programming as its foundation, ANPS aids the user in defining the problem and then automatically writes the appropriate simulation program in GPSS/PC code. The program's interactive user dialogue interface creates an internal problem specification file from user responses, which includes the time line for the countdown sequence, the attributes of the individual activities which are part of a launch, and the dependency relationships between the activities. The program's automatic simulation code generator receives the file as input and selects appropriate macros from the library of software modules to generate the simulation code in the target language GPSS/PC. The user can recall the problem specification file for modification to effect any desired changes in the source code. ANPS is designed to write simulations for problems concerning the pre-launch activities of space vehicles and the operation of ground support equipment, and has potential for use in developing network reliability models for hardware systems and subsystems. ANPS was developed in 1988 for use on IBM PC or compatible machines. The program requires at least 640 KB memory and one 360 KB disk drive, PC DOS Version 2.0 or above, and GPSS/PC System Version 2.0 from Minuteman Software. The program is written in Turbo Prolog Version 2.0. GPSS/PC is a trademark of Minuteman Software.
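The spec-file-to-simulation-code pipeline ANPS implements can be sketched generically: read a problem specification, then emit source lines by filling macros from a template library. The spec fields and the GPSS-like output strings below are illustrative inventions, not the real ANPS formats.

```python
# Template "macro library": one pattern per statement kind.
TEMPLATES = {
    "activity": "{name} ADVANCE {duration}",
    "depends":  "{name} WAITS_FOR {prereq}",
}

def generate(spec):
    """Emit simulation source from a problem-specification dict."""
    lines = [f"* countdown model: {spec['title']}"]
    for act in spec["activities"]:
        lines.append(TEMPLATES["activity"].format(**act))
        for pre in act.get("after", []):
            lines.append(TEMPLATES["depends"].format(name=act["name"], prereq=pre))
    return "\n".join(lines)

spec = {
    "title": "demo countdown",
    "activities": [
        {"name": "FUEL_LOAD", "duration": 30},
        {"name": "GUIDANCE_CHECK", "duration": 10, "after": ["FUEL_LOAD"]},
    ],
}
source = generate(spec)
```

Keeping the specification as data, as ANPS does with its internal problem specification file, is what lets the user edit the model and regenerate the code without touching the target-language source by hand.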

  2. Use of a viewdata system to collect data from a multicentre clinical trial in anaesthesia.

    OpenAIRE

    Waldron, H. A.; Cookson, R F

    1984-01-01

    The interactive electronic information storage and transmission system PRESTEL was assessed as a method of recording and collecting patient record forms from a multicentre trial in anaesthesia. PRESTEL terminals were provided in anaesthetic centres around Britain and all data handled by this public viewdata service, which connects users by telephone to a central computer. The trial was of a new analgesic supplement, alfentanil, and confirmed more rapid recovery of patients as compared with th...

  3. Detection of impulsive sources from an aerostat-based acoustic array data collection system

    Science.gov (United States)

    Prather, Wayne E.; Clark, Robert C.; Strickland, Joshua; Frazier, Wm. Garth; Singleton, Jere

    2009-05-01

    An aerostat based acoustic array data collection system was deployed at the NATO TG-53 "Acoustic Detection of Weapon Firing" Joint Field Experiment conducted in Bourges, France during the final two weeks of June 2008. A variety of impulsive sources including mortar, artillery, gunfire, RPG, and explosive devices were fired during the test. Results from the aerostat acoustic array will be presented against the entire range of sources.
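A common way to flag such impulsive events in acoustic channels is a short-term/long-term average (STA/LTA) ratio, which peaks at sharp onsets against the background level. The window lengths and synthetic signal below are illustrative, not the system's actual detector.

```python
import numpy as np

def sta_lta(x, ns=5, nl=50):
    """Short-term / long-term average ratio of signal magnitude;
    large values flag impulsive onsets."""
    x2 = np.abs(x)
    sta = np.convolve(x2, np.ones(ns) / ns, mode="same")
    lta = np.convolve(x2, np.ones(nl) / nl, mode="same")
    return sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(2)
sig = 0.1 * rng.standard_normal(1000)
sig[600:605] += 5.0                  # impulsive arrival (e.g. muzzle blast)

ratio = sta_lta(sig)
onset = int(np.argmax(ratio))
```

On a multi-microphone array, onset times like this one would then be differenced across channels to estimate a bearing to the source.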

  4. Automatic monitoring system for ''F'' installation

    International Nuclear Information System (INIS)

    The design and operating procedure of the first part of the automatic radiation monitoring system of the Laboratory of Nuclear Problems, JINR ('F' Installation) are described. The system consists of 50 data measuring lines, of which 30 are used for monitoring by means of radiation detectors, 12 control the state of branch circuits, and the others give auxiliary information on the accelerator performance. The data are handled and registered by a crate controller with a built-in microcomputer every few seconds. The monitoring results are output to a special light panel, audible signaling and a printer

  5. A mobile field-work data collection system for the wireless era of health surveillance

    Directory of Open Access Journals (Sweden)

    Marianne Forsell

    2011-02-01

    Full Text Available In many countries or regions the capacity of health care resources is below the needs of the population and new approaches for health surveillance are needed. Innovative projects, utilizing wireless communication technology, contribute to reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on field-work experiences and administrative requirements. The Smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components, and a SQL Server 2005 with Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming, and Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database, via a General Packet Radio Services (GPRS to a Local Area Network (LAN interface. The field-workers considered the here-described applications user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from field-work data registration to analysis, and is suitable for various applications. The advantages of wireless technology, and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  6. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to provide data that is contained on reporting forms to be delivered also on a diskette for uploading into data bases for various purposes, such as checking for contractual compliance, tracking quality assurance parameters, and, ultimately, for reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples, data entry, checking and transmission, data assessment and qualification for use, and report generation that will tie the analytical data quality back to the performance criteria defined prior to sample collection. Industry standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and to protect LANL's investment for the future

  7. An automatic system for crystal growth studies at constant supersaturation

    OpenAIRE

    March, J. G.; Costa-Bauzá, A.; F. Grases; Söhnel, O.

    1992-01-01

    An automatic system for growing crystals from seeded supersaturated solutions at constant supersaturation is described. Burette control and data acquisition are handled by computer. The system was tested with a study of the kinetics of calcium oxalate crystal growth.

  8. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O'Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  9. Digital signal processing for CdTe detectors using VXIbus data collection systems

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Daiji; Takahashi, Hiroyuki; Kurahashi, Tomohiko; Iguchi, Tetsuo; Nakazawa, Masaharu

    1996-07-01

    Recently, fast signal-digitizing techniques have been developed that can capture signal waveforms over very short time periods. In this paper, we analyzed each measured pulse digitized by an apparatus of this kind and tried to improve the energy resolution of a CdTe semiconductor detector. The resulting energy resolution for the {sup 137}Cs 662 keV photopeak was 13 keV. We also developed a fast data collection system based on the VXIbus standard, on which a counting rate of about 50 counts per second was obtained. (author)
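
    Pulse-by-pulse waveform analysis of this kind starts from digitized samples. A minimal sketch of the first step, baseline-corrected pulse-height extraction, is below; the function name and sample values are illustrative, not taken from the paper, and the actual analysis is more sophisticated.

```python
def pulse_height(samples, n_baseline=8):
    """Amplitude of one digitized detector pulse: the maximum sample minus
    the baseline, where the baseline is the mean of the first n_baseline
    (assumed pre-trigger) samples. Illustrative only -- real CdTe waveform
    processing also corrects for charge-collection effects."""
    baseline = sum(samples[:n_baseline]) / n_baseline
    return max(samples) - baseline

# synthetic pulse: flat baseline at ADC value 10, peak sample 74
wave = [10] * 8 + [10, 30, 74, 60, 40, 25, 15, 11]
height = pulse_height(wave)
```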

  10. Operating manual for the digital data-collection system for flow-control structures

    Science.gov (United States)

    Rorabaugh, J.I.; Rapp, W.L.

    1986-01-01

    This manual was written to help the user operate and maintain the digital data collection system for flow control structures. The system is used to measure daily discharge through river control dams. These dams commonly have tainter gates which are raised and lowered to keep the upper pool level relatively constant as the river flow changes. In order to measure the flow through such a structure, the positions of the tainter gates and the headwater and tailwater elevations must be known. From these data, the flow through the structure can be calculated. A typical digital data collection system is shown. Digitizing devices are mounted on the hoisting mechanism of each gate, as well as at the headwater and tailwater gages. Data from these digitizers are then routed by electrical cables to a central console where they are displayed and recorded on paper tape. If the dam has locks, a pressure-sensitive switch located in the lock activates a counter in the console which keeps track of the number of times the lock is drained and filled. (USGS)
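
    The discharge computation the console performs can be illustrated with the standard free-flow orifice equation for a gated spillway, Q = Cd · b · G · sqrt(2gH). The coefficient and dimensions below are hypothetical; a real dam uses rating coefficients calibrated per gate and flow condition.

```python
import math

def gate_discharge(opening_m, width_m, head_m, cd=0.7):
    """Free-flow discharge (m^3/s) through one tainter gate via the
    orifice equation Q = Cd * b * G * sqrt(2 * g * H). The cd value is
    a hypothetical placeholder, not a calibrated coefficient."""
    g = 9.81  # gravitational acceleration, m/s^2
    return cd * width_m * opening_m * math.sqrt(2.0 * g * head_m)

# Total structure flow: sum over all gate openings read by the digitizers.
openings = [0.5, 0.5, 0.0]  # metres; third gate closed
head = 3.2                  # headwater minus tailwater elevation, metres
total = sum(gate_discharge(a, 12.0, head) for a in openings)
```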

  11. GAIT-ER-AID: An Expert System for Analysis of Gait with Automatic Intelligent Pre-Processing of Data

    OpenAIRE

    Bontrager, EL.; Perry, J.; Bogey, R.; Gronley, J.; Barnes, L.; Bekey, G.; Kim, JW

    1990-01-01

    This paper describes the architecture and applications of an expert system designed to identify the specific muscles responsible for a given dysfunctional gait pattern. The system consists of two parts: a data analysis expert system (DA/ES) and a gait pathology expert system (GP/ES). The DA/ES processes raw data on joint angles, foot-floor contact patterns and EMG's from relevant muscles and synthesizes them into a data frame for use by the GP/ES. Various aspects of the intelligent data pre-p...

  12. Creating an iPhone Application for Collecting Continuous ABC Data

    Science.gov (United States)

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  13. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    International Nuclear Information System (INIS)

    A special liquid-nitrogen Dewar with double capacity for the sample-exchange robot has been created at AR-NE3A at the Photon Factory, allowing continuous fully automated data collection. In this work, this new system is described and the stability of its calibration is discussed. Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes, so users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. Investigation showed that four or more cassettes were indeed being used at AR-NE3A alone. To allow continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was therefore increased, doubling its capacity. To check the calibration with the new Dewar and cassette stand, calibration experiments were performed repeatedly. Compared with the current system, the parameters of the new system are shown to be stable.

  14. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Automation has brought many revolutions to existing technologies. One technology with considerable potential for development is the solar-powered automatic shrimp feeding system: solar power, a renewable energy source, can be an alternative solution to the energy crisis while reducing manpower when applied in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and then runs as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of electrical loads; it is powered by a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  15. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    Energy Technology Data Exchange (ETDEWEB)

    Boden, Thomas A [ORNL; Krassovski, Misha B [ORNL; Yang, Bai [ORNL

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interests in climate change data and information worldwide have grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification). CDIAC has provided data management

  16. Using global positioning systems in health research a practical approach to data collection and processing

    DEFF Research Database (Denmark)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-01-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality and allow more-meaningful comparisons across studies. Recommendations are outlined for each stage of data collection and analysis, and challenges that should be considered are indicated. This paper highlights the benefits of collecting GPS data over traditional self-report or estimated exposure measures. Information presented here will allow researchers to make an informed decision about incorporating this readily available technology into their studies. This work reflects the state of the art in 2011.
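
    One processing step such guidelines typically cover is removing GPS fixes that imply impossible travel speeds. The sketch below is illustrative, not from the paper: the function names and the 35 m/s cut-off are assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude fixes."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def drop_speed_outliers(points, max_speed_ms=35.0):
    """points: (t_seconds, lat, lon) tuples in time order. Drops any fix
    implying a speed above max_speed_ms relative to the last kept fix.
    The default threshold is an arbitrary illustrative cut-off."""
    if not points:
        return []
    kept = [points[0]]
    for t, lat, lon in points[1:]:
        t0, lat0, lon0 = kept[-1]
        dt = t - t0
        if dt > 0 and haversine_m(lat0, lon0, lat, lon) / dt <= max_speed_ms:
            kept.append((t, lat, lon))
    return kept
```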

  17. ASUKA Hydrographic Data Collection

    OpenAIRE

    Uchida, Hiroshi; Imawaki, Shiro; Ichikawa, Hiroshi

    2008-01-01

    Repeated hydrographic surveys across the Kuroshio and its recirculation south of Japan have been carried out by a group called ASUKA (Affiliated Surveys of the Kuroshio off Cape Ashizuri) since 1992. Conductivity-temperature-depth profiler (CTD), expendable CTD (XCTD), expendable bathythermograph (XBT), and digital bathythermograph (DBT) data obtained from 155 cruises were collected over a period of 16 years, from November 1992 to May 2008. A uniform data processing was applied to raw data from the...

  18. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, wrote a script in a scripting programming langu...

  19. GIS: Geographic Information System An application for socio-economical data collection for rural area

    CERN Document Server

    Nayak, S K; Kalyankar, N V

    2010-01-01

    India carries out planning through its Planning Commission, on the basis of information collected by traditional, tedious and manual methods that are too slow to sustain. We are now in the 21st century, and the last few decades have shown information technology progressing by leaps and bounds, completely changing the way of life in developed nations. The internet has changed established working practices, opened new vistas, and provided a platform for collaborative work spaces that go beyond global boundaries. We live in a global economy, with India moving towards a Liberalized Market-Oriented Economy (LMOE). Considering these things, and focusing on GIS, we propose a system for collecting socio-economic data and water resource management information for rural areas via the internet.

  20. Video Analytics Algorithm for Automatic Vehicle Classification (Intelligent Transport System)

    OpenAIRE

    Arta Iftikhar; Ali Javed

    2013-01-01

    Automated vehicle detection and classification is an important component of an intelligent transport system. Due to its significant importance in various fields such as traffic accident avoidance, toll collection, congestion avoidance, terrorist activity monitoring, and security and surveillance systems, the intelligent transport system has become an important field of study. Various technologies have been used for detecting and classifying vehicles automatically. Automated vehicle detection is broadly divi...

  1. Development of infrasound signal data collecting system

    Institute of Scientific and Technical Information of China (English)

    易南; 陈景藻; 李玲; 贾克勇

    2001-01-01

    Infrasound signals in the 0~20 Hz frequency range were collected by the infrasound signal data collecting system. A sound-collecting apparatus (microphone) transformed the signal into a corresponding voltage signal. The computer acquired and processed the infrasound signal in real time, analyzed its main frequency components and their intensities, displayed the results graphically, and automatically output curves and printed the final results.
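
    The frequency analysis described can be sketched as a direct DFT magnitude scan over the digitized voltage samples. This is illustrative only, not the authors' implementation; the function name and sampling rate are assumptions.

```python
import cmath
import math

def dominant_frequency(samples, fs):
    """Strongest frequency component (Hz) of a real signal via a direct
    DFT magnitude scan. Adequate for short, low-rate infrasound records:
    0-20 Hz content needs a sampling rate of only ~50 Hz."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):  # skip DC, keep positive frequencies
        mag = abs(sum(samples[i] * cmath.exp(-2j * math.pi * k * i / n)
                      for i in range(n)))
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

fs = 50.0                                   # samples per second
sig = [math.sin(2 * math.pi * 8 * i / fs)   # pure 8 Hz tone
       for i in range(100)]
peak_hz = dominant_frequency(sig, fs)
```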

  2. Programs for the automatic gamma-ray measurement with CANBERRA 8100/QUANTA system

    International Nuclear Information System (INIS)

    Some programs have been prepared for the automatic operation of the CANBERRA 8100/QUANTA System for the gamma-ray spectrum measurement. The main parts of these programs are: (1) to collect and record on magnetic disks the data of gamma-ray spectra automatically, while the recorded data are analyzed to estimate the nuclides which generate photopeaks of spectra and to calculate those concentrations; (2) to draw plotted diagrams of pulse height distributions of gamma-ray spectra data and other data by the additional digital plotter; and etc. (author)

  3. Automatic acquisition and classification system for agricultural network information based on Web data

    Institute of Scientific and Technical Information of China (English)

    段青玲; 魏芳芳; 张磊; 肖晓琰

    2016-01-01

    The purpose of this study is to obtain agricultural web information efficiently and to provide users with personalized service through the integration of agricultural resources scattered across different sites and the fusion of heterogeneous environmental data. The research in this paper improves several key information technologies: agricultural web data acquisition and extraction, text classification based on a support vector machine (SVM), and heterogeneous data collection based on the Internet of Things (IOT). We first add high-quality target seed sites to the system and record each website's URL (uniform resource locator) and category information. A web crawler program saves the original pages. De-noised web pages are obtained through an HTML parser and regular expressions that create custom Node Filter objects; the system then builds a document object model (DOM) tree before digging out the data area. According to filtering rules, the target data area can be identified among multiple data regions with repeated patterns, and the structured data can be extracted after property segmentation. Secondly, we construct a linear SVM classification model to classify agricultural text automatically. The procedure has four steps: we use the segmentation tool ICTCLAS to carry out word segmentation and part-of-speech (POS) tagging; we combine an agricultural key dictionary with a document-frequency adjustment rule to choose feature words; we build a feature vector and calculate an inverse document frequency (IDF) weight for each feature word; and lastly we design an adaptive SVM classifier. Finally, the perception data of different formats collected by the sensors are transmitted over a wireless sensor network to a designated server as the source data. A relational database conforming to the specified acquisition frequency is populated through data conversion and data filtering. The key step of
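
    The IDF weighting step of such a pipeline can be sketched as follows. The unsmoothed formula idf(t) = ln(N/df(t)) is one common convention, not necessarily the paper's; the classifier itself would be a linear SVM trained on these vectors, which is omitted here.

```python
import math
from collections import Counter

def idf_weights(docs):
    """IDF weight per term over tokenised documents: idf(t) = ln(N / df(t)).
    Unsmoothed variant; real systems often add smoothing terms."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))          # count each term once per document
    return {t: math.log(n / c) for t, c in df.items()}

def tfidf_vector(doc, idf):
    """Sparse tf-idf feature vector for one tokenised document."""
    tf = Counter(doc)
    return {t: count * idf.get(t, 0.0) for t, count in tf.items()}

# hypothetical tokenised agricultural snippets
docs = [["wheat", "pest"], ["wheat", "price"], ["pig", "feed"]]
idf = idf_weights(docs)
vec = tfidf_vector(["wheat", "wheat", "pest"], idf)
```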

  4. Profiling animal toxicants by automatically mining public bioassay data: a big data approach for computational toxicology.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high-throughput screening techniques has generated an enormous amount of publicly available bioassay data for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top-ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities.
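
    The paper's actual scoring system is not reproduced here; as an illustrative stand-in, a toy relevance score between one bioassay's actives and the animal-toxicity labels could simply measure label agreement over the tested compounds. All names and values below are hypothetical.

```python
def relevance_score(assay_actives, toxic, tested):
    """Toy relevance score between one bioassay and a toxicity endpoint:
    the fraction of tested compounds whose bioassay outcome (active /
    inactive) agrees with the toxicity label (toxic / non-toxic)."""
    agree = sum((c in assay_actives) == (c in toxic) for c in tested)
    return agree / len(tested)

tested = ["c1", "c2", "c3", "c4"]
score = relevance_score(assay_actives={"c1", "c2"},
                        toxic={"c1", "c3"},
                        tested=tested)
```

    Ranking all bioassays by such a score and keeping the top ones yields a response-profile panel of the kind the abstract describes.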

  5. An integrated system for managing multidisciplinary oceanographic data collected in the Mediterranean Sea during the basin-scale research project EU/MAST-MATER (1996-2000)

    Science.gov (United States)

    Maillard, C.; Balopoulos, E.; Giorgetti, A.; Fichaut, M.; Iona, A.; Larour, M.; Latrouite, A.; Manca, B.; Maudire, G.; Nicolas, P.; Sanchez-Cabeza, J.-A.

    2002-06-01

    An advanced computer and communication technology was used to develop an integrated system and software tools for managing a great diversity of oceanographic data collected in the Mediterranean Sea during 1996-2000. Data were obtained during 108 sea cruises, carried out within the framework of the large-scale international research project MATER (mass transfer and ecosystem response), which was financially supported by the Marine Science and Technology (MAST) Programme of the European Union (EU). Data collection involved the active participation of various research vessels and personnel coming from 58 different laboratories of 13 countries. Data formatting as well as automatic and visual data quality controls were implemented using internationally accepted standards and procedures. Various data inventories and meta-data information, accessible through the World Wide Web (WWW), are made available to the user community. A database was developed, which, along with meta-data and other data relevant to the project information, is made available to the user community in the form of a CD-ROM. The database consists of 5861 vertical profiles and 842 time series of basic physical and biogeochemical parameters collected in the seawater column as well as biogeochemical parameters from the analysis of 70 sediment cores. Furthermore, it includes 67 cruise data files of nonstandard additional biological and atmospheric parameters.

  6. Issues of data collection and use for quantifying the impacts of energy installations and systems

    International Nuclear Information System (INIS)

    The paper discusses several critical issues in the construction of models for assessing the impacts of energy installations and systems. Some of these are connected with the process of data collection and use; it is pointed out that different methods have to be applied according to the purpose envisaged for a particular study (e.g. plant licensing, environmental assessment or energy planning). Further concerns are discussed, related to the common need to aggregate data and, in relation to the actual client or target group of the work, to present results in a sufficiently generic fashion. The paper discusses aggregation over technologies, over sites, over time and over social settings. Regarding the actual technique used for impact calculations, the differences between externality calculations, extended risk analysis and life cycle analysis are described. Finally, the issue of quantification is illustrated by two very difficult but also very important examples: global climate impacts and impacts of nuclear accidents. (author). 9 refs, 1 fig., 2 tabs

  7. Collective Analysis of Qualitative Data

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Friberg, Karin

    2014-01-01

    What. Many students and practitioners do not know how to systematically process qualitative data once it is gathered—at least not as a collective effort. This chapter presents two workshop techniques, affinity diagramming and diagnostic mapping, that support collective analysis of large amounts of qualitative data. Affinity diagramming is used to make collective analyses and interpretations of qualitative data to identify core problems that need to be addressed in the design process. Diagnostic mapping supports collective interpretation and description of these problems and how to intervene in them. We explain the techniques through a case where they were used to analyze why a new electronic medical record system introduced life-threatening situations for patients. Why. Collective analyses offer all participants a voice, visualize their contributions, combine different actors' perspectives, and anchor...

  8. Overview of the SOFIA Data Processing System: A generalized system for manual and automatic data processing at the SOFIA Science Center

    CERN Document Server

    Shuping, R Y; Vacca, W D; Charcos-Llorens, M; Reach, W T; Alles, R; Clarke, M; Melchiorri, R; Radomski, J; Shenoy, S; Sandel, D; Omelian, E B

    2014-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprised of a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. During routine operations, several instruments will be available to the astronomical community including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both auto...

  9. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-02-01

    Development of operation support systems, such as automatic operating systems and anomaly diagnosis systems for the nuclear reactor, is very important in a practical nuclear ship, because the number of operators is limited and receiving outside support in case of an accident is very difficult under severe conditions. The goal of this development is a fully automatic control system covering the whole sequence of normal operation, from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on operating experience with the first Japanese nuclear ship, Mutsu. The automation technique was verified against Mutsu plant data from manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired operating values and the limits of parameter fluctuation, and by preparing operation programs for the principal equipment, such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown, and its verification using the Nuclear Ship Engineering Simulator System. (author)

  10. Feasibility Study for Ballet E-Learning: Automatic Composition System for Ballet "Enchainement" with Online 3D Motion Data Archive

    Science.gov (United States)

    Umino, Bin; Longstaff, Jeffrey Scott; Soga, Asako

    2009-01-01

    This paper reports on "Web3D dance composer" for ballet e-learning. Elementary "petit allegro" ballet steps were enumerated in collaboration with ballet teachers, digitally acquired through 3D motion capture systems, and categorised into families and sub-families. Digital data was manipulated into virtual reality modelling language (VRML) and fit…

  11. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
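
    Of the approaches the book covers, the greedy method is the simplest to sketch. Below is an illustrative greedy generator for pairwise (2-way) covering test sets; the brute-force candidate search makes it practical only for small parameter spaces, and it is not drawn from the book itself.

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedy construction of a pairwise covering test set.
    params: one list of values per factor. Repeatedly picks the
    candidate row covering the most still-uncovered value pairs."""
    uncovered = set()
    for (i, vi), (j, vj) in combinations(enumerate(params), 2):
        uncovered.update((i, a, j, b) for a, b in product(vi, vj))
    tests = []
    while uncovered:
        best = max(product(*params),
                   key=lambda row: sum((i, row[i], j, row[j]) in uncovered
                                       for i, j in combinations(range(len(row)), 2)))
        tests.append(best)
        for i, j in combinations(range(len(best)), 2):
            uncovered.discard((i, best[i], j, best[j]))
    return tests

suite = pairwise_tests([[0, 1], [0, 1], ["x", "y", "z"]])
```

    For three factors with 2, 2, and 3 values, the full Cartesian product has 12 rows, while the greedy pairwise suite is markedly smaller.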

  12. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    Science.gov (United States)

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network for geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard, for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate further such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
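
    Lag measurements like these can be made by cross-correlating a reference timing signal with the recorded channel and taking the best-matching offset. The sketch below works in the sample domain (multiply by the sample interval to get milliseconds) and is illustrative; the Survey's actual test procedure is not reproduced here.

```python
def best_lag(reference, delayed, max_lag):
    """Estimate, in samples, how far `delayed` lags behind `reference`
    by maximising the cross-correlation over lags in [-max_lag, max_lag]."""
    n = len(reference)
    def xcorr(lag):
        return sum(reference[i] * delayed[i + lag]
                   for i in range(n) if 0 <= i + lag < len(delayed))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# reference pulse and a copy delayed by 3 samples
ref = [0.0] * 10 + [1, 2, 3, 2, 1] + [0.0] * 10
late = [0.0] * 13 + [1, 2, 3, 2, 1] + [0.0] * 7
lag_samples = best_lag(ref, late, 5)
```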

  13. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  14. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
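
    Fourier descriptors of the kind used here as shape features can be computed from a closed 2-D contour as follows. The normalisation (dropping c_0 for translation invariance, dividing magnitudes by |c_1| for scale invariance) is one common convention and not necessarily the authors' exact formulation.

```python
import cmath
import math

def fourier_descriptors(contour, n_desc=8):
    """Shape features from a closed 2-D contour. Points become complex
    numbers; DFT coefficient magnitudes |c_k| are divided by |c_1|.
    Magnitudes are already insensitive to rotation and start point."""
    z = [complex(x, y) for x, y in contour]
    n = len(z)
    def coeff(k):
        return sum(z[i] * cmath.exp(-2j * math.pi * k * i / n)
                   for i in range(n)) / n
    norm = abs(coeff(1))
    return [abs(coeff(k)) / norm for k in range(2, 2 + n_desc)]

# a perfect circle has energy only in c_1, so its descriptors are ~0
circle = [(math.cos(2 * math.pi * i / 32), math.sin(2 * math.pi * i / 32))
          for i in range(32)]
fd = fourier_descriptors(circle)
```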

  15. Computer systems for automatic earthquake detection

    Science.gov (United States)

    Stewart, S.W.

    1974-01-01

U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put into operation, all recorded earthquakes had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.
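The record does not state which detection algorithm the system used; the short-term/long-term average (STA/LTA) ratio trigger is the classic technique for automatic event detection on station networks, and a minimal sketch looks like this (window lengths and threshold here are illustrative, not the USGS system's values):

```python
import numpy as np

def sta_lta_trigger(signal, sta_len, lta_len, threshold):
    """Classic short-term / long-term average ratio detector.
    Returns sample indices where the STA/LTA ratio exceeds the threshold."""
    abs_sig = np.abs(signal)
    sta = np.convolve(abs_sig, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(abs_sig, np.ones(lta_len) / lta_len, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)     # guard against division by zero
    return np.where(ratio > threshold)[0]

# Synthetic trace: background noise with an "event" burst near sample 500
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 1000)
trace[500:520] += 20.0
hits = sta_lta_trigger(trace, sta_len=10, lta_len=200, threshold=4.0)
```

A sudden signal raises the short-term average much faster than the long-term average, so the ratio spikes at event onset while slow background changes are ignored.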

  16. Coke oven automatic combustion control system

    Energy Technology Data Exchange (ETDEWEB)

    Shihara, Y.

    1981-01-01

    This article describes and discusses the development and application of an automatic combustion control system for coke ovens that has been used at the Yawata Works of the Nippon Steel Corporation, Japan. (In Japanese)

  17. Automatic calibration system for pressure transducers

    Science.gov (United States)

    1968-01-01

A fifty-channel automatic pressure transducer calibration system increases the quantity and accuracy of calibrations for test evaluation. The pressure transducers are installed in an environmental test chamber and manifolded to connect them to a pressure balance, so that a uniform pressure is applied to all of them.

  18. DataCollection Prototyping

    CERN Multimedia

    Beck, H.P.

    DataCollection is a subsystem of the Trigger, DAQ & DCS project responsible for the movement of event data from the ROS to the High Level Triggers. This includes data from Regions of Interest (RoIs) for Level 2, building complete events for the Event Filter and finally transferring accepted events to Mass Storage. It also handles passing the LVL1 RoI pointers and the allocation of Level 2 processors and load balancing of Event Building. During the last 18 months DataCollection has developed a common architecture for the hardware and software required. This involved a radical redesign integrating ideas from separate parts of earlier TDAQ work. An important milestone for this work, now achieved, has been to demonstrate this subsystem in the so-called Phase 2A Integrated Prototype. This prototype comprises the various TDAQ hardware and software components (ROSs, LVL2, etc.) under the control of the TDAQ Online software. The basic functionality has been demonstrated on small testbeds (~8-10 processing nodes)...

  19. Dynamic Automatic Noisy Speech Recognition System (DANSR)

    OpenAIRE

    Paul, Sheuli

    2014-01-01

In this thesis we studied and investigated a very common but long-standing noise problem and provided a solution to it. The task is to deal with different types of noise that occur simultaneously, which we call hybrid noise. Although there are individual solutions for specific types, one cannot simply combine them because each solution affects the whole speech signal. We developed an automatic speech recognition system, DANSR (Dynamic Automatic Noisy Speech Recognition System), for hybri...

  20. Data mining of geospatial data: combining visual and automatic methods

    OpenAIRE

    Demšar, Urška

    2006-01-01

Most of the largest databases currently available have a strong geospatial component and contain potentially valuable information. The discipline concerned with extracting this information and knowledge is data mining. Knowledge discovery is performed by applying automatic algorithms which recognise patterns in the data. Classical data mining algorithms assume that data are independently generated and identically distributed. Geospatial data are multidimensional, spatial...

  1. Monitoring fish communities at drifting FADs: an autonomous system for data collection in an ecosystems approach

    OpenAIRE

    Brehmer, Patrice; Sancho, Gorka; Josse, Erwan; Taquet, Marc; Georgakarakos, Stratis; Itano, David; Moreno, Gala; Palud, Pierre; Trygonis, Vasilis; Aumeeruddy, Riaz; Girard, Charlotte; Peignon, Christophe; Dalen, John; Dagorn, Laurent

    2009-01-01

    An increasing proportion of landings by tuna purse seine fishing vessels are taken around drifting Fish Aggregating Devices (FADs). Although these FADs and their use by the fishing industry to capture tropical tuna have been well documented, operative tools to collect data around them are now required. Acoustic, video, photographic and visual data were collected on fish aggregations around drifting FADs in offshore waters of the western Indian Ocean. Multibeam sonars, multifreq...

  2. Steam System Balancing and Tuning for Multifamily Residential Buildings in Chicagoland - Second Year of Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.; Ludwig, P.; Brand, L.

    2013-08-01

    Steam heated buildings often suffer from uneven heating as a result of poor control of the amount of steam entering each radiator. In order to satisfy the heating load to the coldest units, other units are overheated. As a result, some tenants complain of being too hot and open their windows in the middle of winter, while others complain of being too cold and are compelled to use supplemental heat sources. Building on previous research, CNT Energy identified 10 test buildings in Chicago and conducted a study to identify best practices for the methodology, typical costs, and energy savings associated with steam system balancing. A package of common steam balancing measures was assembled and data were collected on the buildings before and after these retrofits were installed to investigate the process, challenges, and the cost effectiveness of improving steam systems through improved venting and control systems. The test buildings that received venting upgrades and new control systems showed 10.2% savings on their natural gas heating load, with a simple payback of 5.1 years. The methodologies for and findings from this study are presented in detail in this report. This report has been updated from a version published in August 2012 to include natural gas usage information from the 2012 heating season and updated natural gas savings calculations.
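The 5.1-year figure quoted above is a simple payback: installed retrofit cost divided by annual energy-cost savings. A one-line sketch with hypothetical dollar amounts (the study's actual per-building costs are not reproduced in this record):

```python
def simple_payback(install_cost, annual_savings):
    """Simple payback period in years: installed cost / annual savings."""
    return install_cost / annual_savings

# Hypothetical numbers for illustration: a $10,200 venting/control retrofit
# that saves $2,000 per year on gas pays back in 5.1 years, the same shape
# of result as the study's reported average.
years = simple_payback(10_200, 2_000)  # -> 5.1
```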

  3. An observing system for the collection of fishery and oceanographic data

    Directory of Open Access Journals (Sweden)

    P. Falco

    2007-05-01

The Fishery Observing System (FOS) was developed as a first and basic step towards fish stock abundance nowcasting/forecasting within the framework of the EU research program Mediterranean Forecasting System: Toward an Environmental Prediction (MFSTEP). The study of the relationship between abundance and environmental parameters also represents a crucial point towards forecasting. Eight fishing vessels were progressively equipped with FOS instrumentation to collect fishery and oceanographic data. The vessels belonged to different harbours of the Central and Northern Adriatic Sea. For this pilot application, anchovy (Engraulis encrasicolus, L.) was chosen as the target species. Geo-referenced catch data, associated with in-situ temperature and depth, were the FOS products, but other parameters were associated with catch data as well. MFSTEP numerical circulation models provide many of these data. In particular, salinity was extracted from re-analysis data of numerical circulation models. Satellite-derived sea surface temperature (SST) and chlorophyll were also used as independent variables. Catch and effort data were used to estimate an abundance index (CPUE, Catch per Unit of Effort). Considering that catch records were gathered by different fishing vessels with different technical characteristics and operating on different fish densities, a standardized value of CPUE was calculated. A spatial and temporal average CPUE map was obtained together with a monthly mean time series in order to characterise the variability of anchovy abundance during the period of observation (October 2003–August 2005). In order to study the relationship between abundance and oceanographic parameters, Generalized Additive Models (GAM) were used. Preliminary results revealed a complex scenario: the southern sector of the domain is characterised by a stronger relationship than the central and northern sector where the interactions between the environment and the anchovy
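The CPUE index described above is simply catch divided by effort. The study's exact standardization procedure is not detailed in this record, but one simple illustrative approach is to normalize each record by its vessel's mean CPUE so that vessels with different fishing power become comparable:

```python
import numpy as np

def cpue(catch_kg, effort_hours):
    """Catch per unit of effort: the basic abundance index."""
    return np.asarray(catch_kg, dtype=float) / np.asarray(effort_hours, dtype=float)

def standardize_by_vessel(cpue_values, vessel_ids):
    """Divide each record by its vessel's mean CPUE (one simple approach;
    the FOS study's actual standardization is not specified here)."""
    cpue_values = np.asarray(cpue_values, dtype=float)
    ids = np.asarray(vessel_ids)
    out = np.empty_like(cpue_values)
    for v in set(vessel_ids):
        mask = ids == v
        out[mask] = cpue_values[mask] / cpue_values[mask].mean()
    return out

# Two hauls each from two hypothetical vessels "A" and "B"
raw = cpue([120, 80, 300, 270], [4, 4, 6, 6])        # -> [30, 20, 50, 45] kg/h
std = standardize_by_vessel(raw, ["A", "A", "B", "B"])
```

After standardization, values above 1 indicate hauls better than that vessel's average, regardless of the vessel's absolute catching power.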

  4. Automatic Aircraft Collision Avoidance System and Method

    Science.gov (United States)

    Skoog, Mark (Inventor); Hook, Loyd (Inventor); McWherter, Shaun (Inventor); Willhite, Jaimie (Inventor)

    2014-01-01

The invention is a system and method of compressing a DTM to be used in an Auto-GCAS system using a semi-regular geometric compression algorithm. In general, the invention operates by first selecting the boundaries of the three-dimensional map to be compressed and dividing the three-dimensional map data into regular areas. Next, a type of free-edged, flat geometric surface is selected which will be used to approximate the terrain data of the three-dimensional map. The flat geometric surface is used to approximate the terrain data for each regular area. The approximations are checked to determine whether they fall within selected tolerances. If the approximation for a specific regular area is within the specified tolerance, the data are saved for that regular area. If the approximation for a specific area falls outside the specified tolerances, the regular area is divided and a flat geometric surface approximation is made for each of the divided areas. This process is repeated recursively until all of the regular areas are approximated by flat geometric surfaces. Finally, the compressed three-dimensional map data are provided to the aircraft's automatic ground collision avoidance system.
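The subdivide-until-within-tolerance procedure described in the patent can be sketched as a quadtree of least-squares plane fits (an illustrative reconstruction, not the patented implementation; the patent's "free-edged, flat geometric surface" is modeled here as a plane z = ax + by + c):

```python
import numpy as np

def compress(tile, tol, x0=0, y0=0, out=None):
    """Recursively approximate a square elevation tile with flat planes.
    Each accepted patch is stored as (x0, y0, size, [a, b, c])."""
    if out is None:
        out = []
    n = tile.shape[0]
    # Least-squares plane z = a*x + b*y + c over the whole tile
    ys, xs = np.mgrid[0:n, 0:n]
    A = np.stack([xs.ravel(), ys.ravel(), np.ones(n * n)], axis=1)
    coef, *_ = np.linalg.lstsq(A, tile.ravel(), rcond=None)
    err = np.max(np.abs(A @ coef - tile.ravel()))
    if err <= tol or n <= 2:            # within tolerance: keep this plane
        out.append((x0, y0, n, coef))
    else:                               # otherwise split into four sub-tiles
        h = n // 2
        for dy in (0, h):
            for dx in (0, h):
                compress(tile[dy:dy + h, dx:dx + h], tol, x0 + dx, y0 + dy, out)
    return out

# A perfectly planar 8x8 terrain tile compresses to a single patch
flat = np.fromfunction(lambda y, x: 2.0 * x + 3.0 * y, (8, 8))
planes = compress(flat, tol=0.1)
```

Rough terrain forces deeper subdivision, so storage cost adapts to terrain complexity, which is the point of the semi-regular scheme.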

  5. Data collection techniques.

    Science.gov (United States)

    Morgan, G A; Harmon, R J

    2001-08-01

    We have provided an overview of techniques used to assess variables in the applied behavioral sciences. Most of the methods are used by both quantitative/positivist and qualitative/constructivist researchers but to different extents. Qualitative researchers prefer more open-ended, less structured data collection techniques than do quantitative researchers. Direct observation of participants is common in experimental and qualitative research; it is less common in so-called survey research, which tends to use self-report questionnaires. It is important that investigators use instruments that are reliable and valid for the population and purpose for which they will be used. Standardized instruments have manuals that provide norms and indexes of reliability and validity. However, if the populations and purpose on which these data are based are different from yours, it may be necessary for you to develop your own instrument or provide new evidence of reliability and validity. PMID:11501698

  6. Automatic program debugging for intelligent tutoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Murray, W.R.

    1986-01-01

This thesis explores the process by which student programs can be automatically debugged in order to increase the instructional capabilities of these systems. This research presents a methodology and implementation for the diagnosis and correction of nontrivial recursive programs. In this approach, recursive programs are debugged by repairing induction proofs in the Boyer-Moore Logic. The potential of a program debugger to automatically debug widely varying novice programs in a nontrivial domain is proportional to its capabilities to reason about computational semantics. By increasing these reasoning capabilities a more powerful and robust system can result. This thesis supports these claims by examining related work in automated program debugging and by discussing the design, implementation, and evaluation of Talus, an automatic debugger for LISP programs. Talus relies on its abilities to reason about computational semantics to perform algorithm recognition, infer code teleology, and to automatically detect and correct nonsyntactic errors in student programs written in a restricted, but nontrivial, subset of LISP.

  7. Scheduling Garbage Collection in Embedded Systems

    OpenAIRE

    Henriksson, Roger

    1998-01-01

    The complexity of systems for automatic control and other safety-critical applications grows rapidly. Computer software represents an increasing part of the complexity. As larger systems are developed, we need to find scalable techniques to manage the complexity in order to guarantee high product quality. Memory management is a key quality factor for these systems. Automatic memory management, or garbage collection, is a technique that significantly reduces the complex problem of correct memo...

  8. Automatic processing of macromolecular crystallography X-ray diffraction data at the ESRF.

    Science.gov (United States)

    Monaco, Stéphanie; Gordon, Elspeth; Bowler, Matthew W; Delagenière, Solange; Guijarro, Matias; Spruce, Darren; Svensson, Olof; McSweeney, Sean M; McCarthy, Andrew A; Leonard, Gordon; Nanao, Max H

    2013-06-01

    The development of automated high-intensity macromolecular crystallography (MX) beamlines at synchrotron facilities has resulted in a remarkable increase in sample throughput. Developments in X-ray detector technology now mean that complete X-ray diffraction datasets can be collected in less than one minute. Such high-speed collection, and the volumes of data that it produces, often make it difficult for even the most experienced users to cope with the deluge. However, the careful reduction of data during experimental sessions is often necessary for the success of a particular project or as an aid in decision making for subsequent experiments. Automated data reduction pipelines provide a fast and reliable alternative to user-initiated processing at the beamline. In order to provide such a pipeline for the MX user community of the European Synchrotron Radiation Facility (ESRF), a system for the rapid automatic processing of MX diffraction data from single and multiple positions on a single or multiple crystals has been developed. Standard integration and data analysis programs have been incorporated into the ESRF data collection, storage and computing environment, with the final results stored and displayed in an intuitive manner in the ISPyB (information system for protein crystallography beamlines) database, from which they are also available for download. In some cases, experimental phase information can be automatically determined from the processed data. Here, the system is described in detail. PMID:23682196

  9. An automatic system for multielement solvent extractions

    International Nuclear Information System (INIS)

The automatic system described is suitable for multi-element separations by solvent extraction techniques with organic solvents heavier than water. The analysis is run automatically by a central control unit and includes steps such as pH regulation and reduction or oxidation. As an example, the separation of radioactive Hg2+, Cu2+, Mo6+, Cd2+, As5+, Sb5+, Fe3+, and Co3+ by means of diethyldithiocarbamate complexes is reported. (Auth.)

  10. Temperature Profile Data Collected by Participating Ships in NOAA's Shipboard Environmental Data Acquisition System Program from 17 June 2000 to 23 February 2001 (NODC Accession 0000417)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — XBT and other data were collected from the COLUMBUS COROMANDEL and other platforms participating in NOAA's Shipboard Environmental Data Acquisition System (SEAS)...

  11. Automatic molecular collection and detection by using fuel-powered microengines

    Science.gov (United States)

    Han, Di; Fang, Yangfu; Du, Deyang; Huang, Gaoshan; Qiu, Teng; Mei, Yongfeng

    2016-04-01

We design and fabricate a simple self-powered system to collect analyte molecules in fluids for surface-enhanced Raman scattering (SERS) detection. The system is based on catalytic Au/SiO/Ti/Ag-layered microengines by employing rolled-up nanotechnology. Pronounced SERS signals are observed on microengines with more carrier molecules compared with the same structure without automatic motions. Electronic supplementary information (ESI) available: Experimental procedures, characterization, SERS enhancement factor calculation and videos. See DOI: 10.1039/c6nr00117c

  12. Robust indexing for automatic data collection

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2003-12-09

    We present improved methods for indexing diffraction patterns from macromolecular crystals. The novel procedures include a more robust way to verify the position of the incident X-ray beam on the detector, an algorithm to verify that the deduced lattice basis is consistent with the observations, and an alternative approach to identify the metric symmetry of the lattice. These methods help to correct failures commonly experienced during indexing, and increase the overall success rate of the process. Rapid indexing, without the need for visual inspection, will play an important role as beamlines at synchrotron sources prepare for high-throughput automation.

  13. Art painting Data Collection

    OpenAIRE

    Martinez Saez, Jodi

    2011-01-01

The classification of art painting images is a computer vision application that is growing considerably. The goal of this technology is to classify an art painting image automatically in terms of artistic style, technique used, or author. For this purpose, the image is analyzed by extracting visual features. Many articles related to these problems have been published, but in general the proposed solutions are focused on a very specific field. In particular, algorithms a...

  14. Automatic seismic support design of piping system by an object oriented expert system

    International Nuclear Information System (INIS)

The seismic support design of piping systems for nuclear power plants requires many experienced engineers and many man-hours, because the seismic design conditions are very severe, the bulk volume of the piping systems is huge and the design procedures are very complicated. We have therefore developed a piping seismic design expert system, which utilizes the piping design database of a 3-dimensional CAD system and automatically determines the piping support locations and support styles. The database of this system contains the maximum allowable seismic support span lengths for straight piping and the span-length reduction factors for bends, branches, concentrated masses in the piping, and so forth. The system automatically produces the support design according to design knowledge extracted and collected from expert design engineers, using design information such as piping specifications, which give diameters and thicknesses, and piping geometric configurations. The automatic seismic support design provided by this expert system achieves a reduction in design man-hours, improvement of design quality, verification of design results, optimization of support locations and prevention of input duplication. In developing this system, we had to derive the design logic from expert design engineers, and it could not be simply expressed descriptively. We also had to write programs for different kinds of design knowledge. For these reasons we adopted the object-oriented programming paradigm (Smalltalk-80), which is suitable for combining programs and carrying out the design work.
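The knowledge base described above, maximum straight-pipe span lengths plus reduction factors for bends, branches and concentrated masses, amounts to a table lookup. A sketch with purely hypothetical values (real allowable spans come from a plant's seismic design criteria, not from this table):

```python
# Hypothetical span table and factors, for illustration only.
MAX_SPAN_M = {("50A", "Sch40"): 3.6, ("100A", "Sch40"): 4.8}
REDUCTION = {"straight": 1.0, "bend": 0.75, "branch": 0.8, "mass": 0.65}

def allowable_span(size, schedule, feature="straight"):
    """Allowable support spacing = straight-pipe span x feature reduction
    factor, mirroring the expert system's table-lookup knowledge base."""
    return MAX_SPAN_M[(size, schedule)] * REDUCTION[feature]

# A bend in 100A Sch40 pipe shortens the allowable span: 4.8 * 0.75 = 3.6 m
span = allowable_span("100A", "Sch40", feature="bend")
```

The expert system layers rule-based placement logic on top of such lookups; the tables themselves are the easily codified part of the design knowledge.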

  15. Automatic dam concrete placing system; Dam concrete dasetsu sagyo no jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Yoneda, Y.; Hori, Y.; Nakayama, T.; Yoshihara, K.; Hironaka, T. [Okumura Corp., Osaka (Japan)

    1994-11-15

An automatic concrete placing system was developed for concrete dam construction. This system consists of the following five subsystems: a wireless data transmission system, an automatic dam concrete mixing system, a consistency determination system, an automatic dam concrete loading and transporting system, and a remote concrete bucket opening and closing system. The system includes the following features: mixing amounts by mixing ratio and mixing intervals can be specified from the concrete placing site using a wireless handy terminal; concrete is mixed automatically in a batcher plant; a transfer car is started and concrete is charged into a bucket automatically; the properties of the mixed concrete are determined automatically; and labor costs are reduced, work efficiency improved and safety enhanced. The introduction of the system has resulted in unattended operation from aggregate draw-out to the bunker line, a manpower saving of five persons, and a 10% reduction in cycle time. 11 figs., 2 tabs.

  16. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  17. Automatic laser tracking and ranging system.

    Science.gov (United States)

    Cooke, C R

    1972-02-01

An automatic laser tracking and ranging system has been developed for use with cooperative retroreflective targets. Target position is determined with high precision at ranges out to 19 km and sample rates up to one hundred measurements per second. The data are recorded on magnetic tape in the form of azimuth, elevation, range, and standard time and are computer-compatible. The system is fully automatic with the exception of the initial acquisition sequence, which is performed manually. This eliminates the need for expensive and time-consuming photographic data reduction. Also, position is uniquely determined by a single instrument. To provide convenient operation at remote sites, the system is van-mounted and operates off a portable power generator. The transmitter is a flash-pumped Q-spoiled Nd:YAG laser developing 1 MW peak power in a 10-mrad beam at a rate of 100 pps. The beam, which is coaxial with the receiver, is directed to the target by an azimuth-elevation mirror mount. The return beam is imaged onto separate ranging and tracking receivers. The ranging receiver measures time of flight of the 25-nsec laser pulse with range accuracies of +/-15 cm. The tracking receiver uses a quadrant photodiode followed by matched log video amplifiers and achieves a tracking accuracy of +/-0.1 mrad. An optical dynamic range of 30 dB is provided to minimize error due to scintillation. Also, 80 dB of optical dynamic range is provided by adjustable neutral density filters to compensate for changes in target range. PMID:20111495
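The quoted +/-15 cm range accuracy follows directly from the time-of-flight relation R = c*t/2: 15 cm of one-way range corresponds to about one nanosecond of round-trip timing precision. A short numerical check (the 19 km figure is the system's stated maximum range; the timing arithmetic is generic):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_tof(t_seconds):
    """One-way range from a round-trip laser time of flight: R = c*t/2."""
    return C * t_seconds / 2.0

# A target at the system's 19 km maximum range returns the pulse after
# about 127 microseconds.
t = 2 * 19_000.0 / C
r = range_from_tof(t)          # -> 19000.0 m

# +/-15 cm of range corresponds to roughly 1 ns of round-trip timing:
dt = 2 * 0.15 / C              # ~1.0e-9 s
```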

  18. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

The equipment manufacturing industry is a strategic industry for a country, and its core is the CNC machine tool. Therefore, enhancing independent research on CNC machine technology, especially open CNC systems, is of great significance. This paper presents some key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First, the integrated system architecture and the flexible and efficient workflow for performing the intelligent automatic turning process are illustrated. Second, innovative methods for workpiece feature recognition and expression and for process planning of the NC machining are put forward. Third, the automatic cutting-tool selection and cutting-parameter optimization solutions are generated with an integrated inference combining rule-based and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proved that the presented solutions are valid, practical and efficient.

  19. Development of automatic laser welding system

    International Nuclear Information System (INIS)

Lasers are a new production tool for high-speed, low-distortion welding, and their application to automatic welding lines is increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, the welding of airplane engines and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  20. A versatile automatic TLD system under development

    International Nuclear Information System (INIS)

This paper describes an automatic TLD personnel monitoring system intended to replace the film badges used by the Radiological Service Unit TNO. The basis of the system is a versatile automatic TLD reader in which the detectors are heated with hot nitrogen gas. After a short description of the reader and some experimental results, a prototype of a TLD badge designed for automatic processing is presented. In this badge, which is waterproof and cannot be opened by the wearer, up to four detectors can be mounted, covered by appropriate filters. A variety of TLDs may be used, such as discs, hot-pressed chips and rods. The system is completed with a finger-ring dosemeter, the detector holder of which can be separated from the ring for other applications (e.g. dosimetry in X-ray diagnostics). (author)

  1. Feedback Improvement in Automatic Program Evaluation Systems

    Science.gov (United States)

    Skupas, Bronius

    2010-01-01

    Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams and contest systems. However, use of automated program evaluation encounters problems: some evaluations are not clear for the students and the system messages do not show reasons for lost points.…

  2. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  3. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Due to the development of speech technology, there is now increased interest in expert speaker verification systems which have high reliability and low labour intensiveness thanks to the automation of data processing for the expert analysis. System Description. We present a description of a novel system analyzing the similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features and melodic characteristics. The characteristic feature of the proposed system, based on a fusion of methods, is the weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. The system's advantage is the possibility of rapid analysis of recordings, since the data preprocessing and decision-making processes are automated. We describe the individual methods as well as their fusion for combining their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also established experimentally that the formant method is the most reliable of all the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.

  4. Multiple on-line data collection and processing for radioimmunoassay using a micro-computer system.

    OpenAIRE

    Carter, N. W.; Davidson, D.; Lucas, D F; Griffiths, P.D.

    1980-01-01

    A micro-computer system is described which has been designed to perform on-line data capture from up to seven radioisotope counters of different types in parallel with interactive results processing and subsequent transmission to a laboratory computer-based data management system.

  5. Automatic exploitation system for photographic dosemeters

    International Nuclear Information System (INIS)

The Laboratory of Dosimetry Exploitation (LED) has developed equipment for the fully automatic processing of photographic film dosemeters. The system identifies the films by bar codes and measures doses with a completely automatic reader. The emulsions to be processed are assembled into a ribbon and developed in a circulation machine. Film blackening is measured on a reading plate with fourteen reading points, through which the emulsion ribbon circulates. Doses are calculated with the usual method using dedicated computer codes. A comparison of 2000 dosemeters has shown that the manual and automatic methods give the same results. The system has been in operation at the LED since July 1995. (N.C.)

  6. First semiannual report: Rocky Flats Small Wind Systems Test Center activities. Volume II. Experimental data collected from small wind energy conversion systems

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-09-28

    Volume II of the First Semiannual Report of the Rocky Flats Small Wind Systems Test Center (WSTC) describes the nine (9) small wind energy conversion systems (SWECS) tested as of June 30, 1978 and provides the significant quantitative and qualitative data collected to that date. Meteorological data collected at Rocky Flats are also provided and described.

  7. Automatic data processing and crustal modeling on Brazilian Seismograph Network

    Science.gov (United States)

    Moreira, L. P.; Chimpliganond, C.; Peres Rocha, M.; Franca, G.; Marotta, G. S.; Von Huelsen, M. G.

    2014-12-01

    The Brazilian Seismograph Network (RSBR) is a joint project of four Brazilian research institutions, supported by Petrobras, whose main goals are to monitor seismic activity, generate seismic hazard alerts and provide data for research on Brazilian tectonics and structure. Each institution operates and maintains its own seismic network, sharing data over a virtual private network. These networks have seismic stations transmitting raw data in real time (or near real time) to their respective data centers, where the seismogram files are then shared with the other institutions. Currently RSBR has 57 broadband stations, some operating since 1994, transmitting data through mobile phone data networks or satellite links. Station management, data acquisition and storage, and earthquake data processing at the Seismological Observatory of the University of Brasilia are performed automatically by SeisComP3 (SC3). However, SC3 processing is limited to event detection, location and magnitude. An automatic crustal modeling system was therefore designed to process raw seismograms and generate 1D S-velocity profiles. This system automatically calculates receiver function (RF) traces, the Vp/Vs ratio (h-k stacking) and surface wave dispersion (SWD) curves. These traces and curves are then used to calibrate lithospheric seismic velocity models with a joint inversion scheme. An analyst can review the results, change processing parameters and select or discard the RF traces and SWD curves used in the model calibration. The results obtained from this system will be used to generate and update a quasi-3D crustal model of Brazil's territory.
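
    The h-k stacking step mentioned above can be sketched as a small grid search: for each candidate crustal thickness H and Vp/Vs ratio k, the arrival times of the Ps conversion and its crustal multiples are predicted, and the receiver-function amplitudes at those times are stacked. The following is an illustrative, pure-Python sketch run on a synthetic trace; the velocities, weights and grids are assumed for the example and are not taken from the RSBR system:

```python
import math

def hk_stack(rf, dt, p, vp, h_grid, k_grid, w=(0.6, 0.3, 0.1)):
    """Zhu-Kanamori style h-k grid search (simplified sketch).

    rf: receiver-function trace sampled every dt seconds,
    p: ray parameter (s/km), vp: assumed crustal P velocity (km/s)."""
    def amp(t):                                     # trace amplitude at time t
        i = int(round(t / dt))
        return rf[i] if 0 <= i < len(rf) else 0.0
    qp = math.sqrt(1.0 / vp ** 2 - p ** 2)          # vertical P slowness
    best, best_hk = float("-inf"), (None, None)
    for h in h_grid:
        for k in k_grid:
            qs = math.sqrt((k / vp) ** 2 - p ** 2)  # vertical S slowness (vs = vp/k)
            s = (w[0] * amp(h * (qs - qp))          # Ps conversion
                 + w[1] * amp(h * (qs + qp))        # PpPs multiple
                 - w[2] * amp(2.0 * h * qs))        # PpSs + PsPs multiple (negative polarity)
            if s > best:
                best, best_hk = s, (h, k)
    return best_hk

# Synthetic receiver function with arrivals for H = 35 km, Vp/Vs = 1.75.
dt, p, vp, h0, k0 = 0.05, 0.06, 6.5, 35.0, 1.75
qp = math.sqrt(1.0 / vp ** 2 - p ** 2)
qs = math.sqrt((k0 / vp) ** 2 - p ** 2)
arrivals = [(h0 * (qs - qp), 1.0), (h0 * (qs + qp), 0.5), (2.0 * h0 * qs, -0.3)]
rf = [sum(a * math.exp(-0.5 * ((i * dt - t0) / 0.3) ** 2) for t0, a in arrivals)
      for i in range(500)]
h_est, k_est = hk_stack(rf, dt, p, vp,
                        [25.0 + 0.5 * i for i in range(41)],
                        [1.60 + 0.01 * i for i in range(31)])
```

    On this synthetic trace the stack is maximised near the true values (35 km and 1.75); a real processing chain would stack receiver functions from many events and ray parameters before the joint inversion.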

  8. Study on intermediate frequency power supply automatic monitor system

    International Nuclear Information System (INIS)

    A new design for an automatic monitoring system for the intermediate frequency power supply, based on a communication server, is put forward, and its working principle and key techniques are explained in detail. The system uses the protocol conversion function of the serial communication server and realizes data collection with dual-machine backup and redundancy. The new network adopts opto-isolated communication connectors and diagnostic techniques to increase immunity to interference; alarm messages are transmitted immediately and then repeated cyclically, overcoming the slow response of the original monitoring network and strengthening both the speed of the monitoring system and the reliability of alarm reporting. Operating experience with the new system shows that its functions are more complete than those of the original monitor, that it is more convenient to use, more stable and reliable, and faster in reporting alarms, which also facilitates post-fault analysis; moreover, the system retains strong capacity and value for expansion. (authors)

  9. 49 CFR Appendix H to Part 40 - DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form H Appendix H to Part 40 Transportation Office of the Secretary..., App. H Appendix H to Part 40—DOT Drug and Alcohol Testing Management Information System (MIS)...

  10. Automatic Waterjet Positioning Vision System

    OpenAIRE

    Dziak, Damian; Jachimczyk, Bartosz; Jagusiak, Tomasz

    2012-01-01

    The goals of this work are the design and implementation of a new vision system, integrated with the waterjet machine. This system combines two commercial webcams mounted on a dedicated industrial platform. The main purpose of the vision system is to detect the position and rotation of a workpiece placed on the machine table. The object recognition algorithm used consists of edge detection, standard math processing functions and noise filters. The Hough transform technique is used to extract lin...

  11. The application of RTX51 tiny in nuclear data collection system

    International Nuclear Information System (INIS)

    This paper introduces a nuclear multi-channel analysis system: a 512-channel γ-ray energy spectrum data acquisition system built around the P89C668 microprocessor. Applying the RTX51 Tiny real-time kernel in the design of the multi-channel system greatly improves its stability and reliability; at the same time, the introduction of multi-tasking improves software modularity and reduces the complexity of software development. (authors)

  12. Recent developments in the Los Alamos National Laboratory Plutonium Facility Waste Tracking System-automated data collection pilot project

    International Nuclear Information System (INIS)

    The waste management and environmental compliance group (NMT-7) at the Los Alamos National Laboratory has initiated a pilot project for demonstrating the feasibility and utility of automated data collection as a solution for tracking waste containers at the Los Alamos National Laboratory Plutonium Facility. This project, the Los Alamos Waste Tracking System (LAWTS), tracks waste containers during their lifecycle at the facility. LAWTS is a two-tiered system consisting of a server/workstation database and reporting engine and a hand-held data terminal-based client program for collecting data directly from tracked containers. New containers may be added to the system from either the client unit or from the server database. Once containers are in the system, they can be tracked through one of three primary transactions: Move, Inventory, and Shipment. Because LAWTS is a pilot project, it also serves as a learning experience for all parties involved. This paper will discuss many of the lessons learned in implementing a data collection system in the restricted environment. Specifically, the authors will discuss issues related to working with the PPT 4640 terminal system as the data collection unit. They will discuss problems with form factor (size, usability, etc.) as well as technical problems with wireless radio frequency functions. They will also discuss complications that arose from outdoor use of the terminal (barcode scanning failures, screen readability problems). The paper will conclude with a series of recommendations for proceeding with LAWTS based on experience to date

  13. Automatic Road Sign Inventory Using Mobile Mapping Systems

    Science.gov (United States)

    Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.

    2016-06-01

    The periodic inspection of certain infrastructure features plays a key role for road network safety and preservation, and for developing optimal maintenance planning that minimize the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology in order to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As point cloud resolution is insufficient, RGB imagery is used projecting the 3D points in the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95%, and F-score greater than 90%. In this way, inventory data is obtained in a fast, reliable manner, and it can be applied to improve the maintenance planning of the road network, or to feed a Spatial Information System (SIS), thus, road sign information can be available to be used in a Smart City context.
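
    The image-projection step described above can be sketched with the standard pinhole camera model: each 3D point of a detected sign is transformed into the camera frame and divided by its depth, and the bounding box of the projected pixels delimits the RGB region to analyse. The intrinsics, pose and point values below are invented for illustration; this is not the LYNX processing chain:

```python
def project_points(points, K, R, t):
    """Project 3D world points (metres) to pixels with a pinhole camera:
    x_cam = R @ x_world + t, then u = fx*x/z + cx, v = fy*y/z + cy.
    Points behind the camera (z <= 0) are skipped."""
    (fx, _, cx), (_, fy, cy), _ = K
    pixels = []
    for X, Y, Z in points:
        x = R[0][0] * X + R[0][1] * Y + R[0][2] * Z + t[0]
        y = R[1][0] * X + R[1][1] * Y + R[1][2] * Z + t[1]
        z = R[2][0] * X + R[2][1] * Y + R[2][2] * Z + t[2]
        if z > 0:
            pixels.append((fx * x / z + cx, fy * y / z + cy))
    return pixels

def bounding_box(pixels):
    """Axis-aligned box (u_min, v_min, u_max, v_max) of the projections."""
    us, vs = zip(*pixels)
    return min(us), min(vs), max(us), max(vs)

# Invented example: two opposite corners of a sign 10 m in front of the camera.
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # identity rotation
t = [0.0, 0.0, 0.0]
box = bounding_box(project_points([(-0.3, -0.3, 10.0), (0.3, 0.3, 10.0)], K, R, t))
```

    For this invented example the box is (610, 330, 670, 390) pixels; a real pipeline would use the calibrated intrinsics and extrinsics of each MMS camera and the image whose time stamp is closest to the scanned points.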

  14. 29 CFR 42.21 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Data collection. 42.21 Section 42.21 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.21 Data collection. (a) For each protective statute, ESA... completed based on complaints. (g) The National Committee shall review the data collection systems of...

  15. Real time psychrometric data collection

    International Nuclear Information System (INIS)

    Eight Mine Weather Stations (MWS) installed at the Waste Isolation Pilot Plant (WIPP) to monitor the underground ventilation system are helping to simulate real-time ventilation scenarios. Seasonal weather extremes can result in variations of Natural Ventilation Pressure (NVP) which can significantly affect the ventilation system. The eight MWSs (which previously collected and stored temperature, barometric pressure and relative humidity data for subsequent NVP calculations) were upgraded to provide continuous real-time data to the site-wide Central Monitoring System. This data can now be utilized by the ventilation engineer to create real-time ventilation simulations and trends which assist in the prediction and mitigation of NVP and psychrometric related events
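
    To first order, the NVP derived from such measurements is the difference in weight between the air columns of the downcast and upcast shafts. Below is a minimal dry-air sketch using the ideal gas law; a full psychrometric calculation would also use the relative-humidity readings, and the depth and surface values are illustrative, not WIPP data:

```python
def air_density(pressure_pa, temp_c):
    """Dry-air density from the ideal gas law, rho = P / (R_dry * T)."""
    R_DRY = 287.055  # specific gas constant of dry air, J/(kg K)
    return pressure_pa / (R_DRY * (temp_c + 273.15))

def natural_ventilation_pressure(depth_m, p_surface_pa, t_down_c, t_up_c):
    """First-order NVP: weight difference between the downcast and upcast
    air columns over a shaft of the given depth."""
    g = 9.80665  # standard gravity, m/s^2
    rho_down = air_density(p_surface_pa, t_down_c)
    rho_up = air_density(p_surface_pa, t_up_c)
    return g * depth_m * (rho_down - rho_up)

# Illustrative winter case: cold air in the downcast shaft, warm in the upcast.
nvp = natural_ventilation_pressure(depth_m=655.0, p_surface_pa=101325.0,
                                   t_down_c=2.0, t_up_c=27.0)
```

    With these invented values the colder, denser downcast column yields an NVP of roughly 0.7 kPa assisting the fans; a summer reversal of the temperatures flips its sign, which is why continuous real-time trends are valuable to the ventilation engineer.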

  16. Hindi Digits Recognition System on Speech Data Collected in Different Natural Noise Environments

    Directory of Open Access Journals (Sweden)

    Babita Saxena

    2015-02-01

    Full Text Available This paper presents a baseline digits speech recognizer for the Hindi language. The recording environment differs between speakers, since the data were collected in their respective homes. The differing environments include vehicle horn noises in some road-facing rooms, internal background noises such as opening doors in other rooms, and silence elsewhere. All these recordings are used for training the acoustic model. The acoustic model is trained on audio data from 8 speakers. The vocabulary size of the recognizer is 10 words. The HTK toolkit is used for building the acoustic model and evaluating the recognition rate of the recognizer. The efficiency of the recognizer developed on the recorded data is reported at the end of the paper, and possible directions for future research are suggested.

  17. Automatic Irrigation System using WSNs

    OpenAIRE

    Ravinder Singh Dhanoa1; Ravinder Singh

    2014-01-01

    During the entire project, I went through various electronic equipment. I learned about the 8051 controller, contact-type sensors, comparators and a little about other electrical equipment. Irrigation systems are as old as man himself, since agriculture is the foremost occupation of civilized humanity. Irrigating large areas of plants is an onerous job. In order to overcome this problem, many irrigation scheduling techniques have been developed, which are mainly based on mon...

  18. Wireless System and Method for Collecting Motion and Non-Motion Related Data of a Rotating System

    Science.gov (United States)

    Woodard, Stanley E. (Inventor); Taylor, Bryant D. (Inventor)

    2011-01-01

    A wireless system for collecting data indicative of a tire's characteristics uses at least one open-circuit electrical conductor in a tire. The conductor is shaped such that it can store electrical and magnetic energy. In the presence of a time-varying magnetic field, the conductor resonates to generate a harmonic response having a frequency, amplitude and bandwidth. A magnetic field response recorder is used to (i) wirelessly transmit the time-varying magnetic field to the conductor, and (ii) wirelessly detect the harmonic response and the frequency, amplitude and bandwidth, associated therewith. The recorder is adapted to be positioned in a location that is fixed with respect to the tire as the tire rotates.

  19. All-optical automatic pollen identification: Towards an operational system

    Science.gov (United States)

    Crouzy, Benoît; Stella, Michelle; Konzelmann, Thomas; Calpini, Bertrand; Clot, Bernard

    2016-09-01

    We present results from the development and validation campaign of an optical pollen monitoring method based on time-resolved scattering and fluorescence. Focus is first set on supervised learning algorithms for pollen-taxa identification and on the determination of aerosol properties (particle size and shape). The identification capability provides a basis for a pre-operational automatic pollen season monitoring performed in parallel to manual reference measurements (Hirst-type volumetric samplers). Airborne concentrations obtained from the automatic system are compatible with those from the manual method for total pollen, and the automatic device provides real-time data reliably (one week of interruption over five months). In addition, although the calibration dataset still needs to be completed, we are able to follow the grass pollen season. The high sampling rate of the automatic device allows us to go beyond the commonly presented daily values, and we obtain statistically significant hourly concentrations. Finally, we discuss the remaining challenges for obtaining an operational automatic monitoring system and how the generic validation environment developed for the present campaign could be used for further tests of automatic pollen monitoring devices.

  20. Learning Diagnostic Diagrams in Transport-Based Data-Collection Systems

    DEFF Research Database (Denmark)

    Tran, Vu The; Eklund, Peter; Cook, Chris

    Insights about service improvement in a transit network can be gained by studying transit service reliability. In this paper, a general procedure for constructing a transit service reliability diagnostic (Tsrd) diagram based on a Bayesian network is proposed to automatically build a behavioural m...

  1. Automatic liquid-liquid extraction system

    International Nuclear Information System (INIS)

    This invention concerns an automatic liquid-liquid extraction system ensuring great reproducibility on a number of samples, stirring and decanting of the two liquid phases, then the quantitative removal of the entire liquid phase present in the extraction vessel at the end of the operation. This type of system has many applications, particularly in carrying out analytical processes comprising a stage for the extraction, by means of an appropriate solvent, of certain components of the sample under analysis

  2. Automatic remote correcting system for MOOCS

    OpenAIRE

    Rochat, Pierre-Yves

    2014-01-01

    An automatic correcting system was designed to correct the programming exercises of a Massive Open Online Course (MOOC) about microcontrollers followed by thousands of students. Built around the MSP430G LaunchPad, it has corrected more than 30'000 submissions in 7 weeks. This document provides general information about the system, the results obtained during a MOOC on the Coursera.org platform, extensions made to remote experiments, and future projects.

  3. Visual Algorithm Simulation Exercise System with Automatic Assessment: TRAKLA2

    Directory of Open Access Journals (Sweden)

    Lauri MALMI

    2004-10-01

    Full Text Available Interaction and feedback are key factors supporting the learning process. Therefore many automatic assessment and feedback systems have been developed for computer science courses during the past decade. In this paper we present a new framework, TRAKLA2, for building interactive algorithm simulation exercises. Exercises constructed in TRAKLA2 are viewed as learning objects in which students manipulate conceptual visualizations of data structures in order to simulate the working of given algorithms. The framework supports randomized input values for the assignments, as well as automatic feedback and grading of students' simulation sequences. Moreover, it supports automatic generation of model solutions as algorithm animations and the logging of statistical data about the interaction process as students solve exercises. The system has been used in two universities in Finland for several courses involving over 1000 students. Student response has been very positive.

  4. Gamma-ray spectrometry data collection and reduction by simple computing systems.

    Science.gov (United States)

    Op de Beeck, J

    1975-12-01

    The review summarizes the present state of the involvement of relatively small computing devices in the collection and processing of gamma-ray spectrum data. An economic and utilitarian point of view has been chosen with regard to data collection in order to arrive at practically valuable conclusions in terms of feasibility of possible configurations with respect to their eventual application. A unified point of view has been adopted with regard to data processing by developing an information theoretical approach on a more or less intuitive level in an attempt to remove the largest part of the virtual disparity between the several processing methods described in the literature. A synoptical introduction to the most important mathematical methods has been incorporated, together with a detailed theoretical description of the concept gamma-ray spectrum. In accordance with modern requirements, the discussions are mainly oriented towards high-resolution semiconductor detector-type spectra. The critical evaluation of the processing methods reviewed is done with respect to a set of predefined criteria. Smoothing, peak detection, peak intensity determination, overlapping peak resolving and detection and upper limits are discussed in great detail. A preferred spectrum analysis method combining powerful data reduction properties with extreme simplicity and speed of operation is suggested. The general discussion is heavily oriented towards activation analysis application, but other disciplines making use of gamma-ray spectrometry will find the material presented equally useful. Final conclusions are given pointing to future developments and shifting their centre of gravity towards improving the quality of the measurements rather than expanding the use of tedious and sophisticated mathematical techniques requiring the limits of available computational power. PMID:769794
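
    The smoothing and peak-detection stages discussed in the review can be illustrated in a few lines. This toy sketch uses a moving-average smooth and flags local maxima that exceed the nearby baseline by a multiple of the counting-statistics uncertainty (sigma ≈ sqrt(counts)); it is a deliberately simplified stand-in for the methods the review compares:

```python
import math

def detect_peaks(spectrum, window=5, significance=3.0):
    """Toy gamma-spectrum peak search: moving-average smoothing, then flag
    local maxima exceeding the nearby baseline by `significance` times the
    counting-statistics uncertainty (sigma ~ sqrt(counts))."""
    n = len(spectrum)
    half = window // 2
    smooth = [sum(spectrum[max(0, i - half):min(n, i + half + 1)]) /
              (min(n, i + half + 1) - max(0, i - half)) for i in range(n)]
    peaks = []
    for i in range(window, n - window):
        if smooth[i] >= smooth[i - 1] and smooth[i] > smooth[i + 1]:
            baseline = 0.5 * (smooth[i - window] + smooth[i + window])
            if smooth[i] - baseline > significance * math.sqrt(max(baseline, 1.0)):
                peaks.append(i)
    return peaks

# Synthetic 128-channel spectrum: flat background plus one photopeak at channel 50.
spectrum = [100.0 + 500.0 * math.exp(-0.5 * ((i - 50) / 1.5) ** 2)
            for i in range(128)]
peaks = detect_peaks(spectrum)
```

    Production spectrometry codes replace the moving average with, for example, Savitzky-Golay filtering and fit peak and background shapes explicitly, but the structure of the workflow (smooth, locate, test significance) is the same.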

  5. Gamma-ray spectrometry data collection and reduction by simple computing systems

    International Nuclear Information System (INIS)

    The review summarizes the present state of the involvement of relatively small computing devices in the collection and processing of gamma-ray spectrum data. An economic and utilitarian point of view has been chosen with regard to data collection in order to arrive at practically valuable conclusions in terms of feasibility of possible configurations with respect to their eventual application. A unified point of view has been adopted with regard to data processing by developing an information theoretical approach on a more or less intuitive level in an attempt to remove the largest part of the virtual disparity between the several processing methods described in the literature. A synoptical introduction to the most important mathematical methods has been incorporated, together with a detailed theoretical description of the concept gamma-ray spectrum. In accordance with modern requirements, the discussions are mainly oriented towards high-resolution semiconductor detector-type spectra. The critical evaluation of the processing methods reviewed is done with respect to a set of predefined criteria. Smoothing, peak detection, peak intensity determination, overlapping peak resolving and detection and upper limits are discussed in great detail. A preferred spectrum analysis method combining powerful data reduction properties with extreme simplicity and speed of operation is suggested. The general discussion is heavily oriented towards activation analysis application, but other disciplines making use of gamma-ray spectrometry will find the material presented equally useful. Final conclusions are given pointing to future developments and shifting their centre of gravity towards improving the quality of the measurements rather than expanding the use of tedious and sophisticated mathematical techniques requiring the limits of available computational power. (author)

  6. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulatory items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic pipeline inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which steers the robot exactly along the inspection path. We expect our system to contribute to reduced inspection time, enhanced performance, and effective management of inspection results. The system developed by this project can be put to practical use for inspection work after field tests. (author)

  7. A Risk Assessment System with Automatic Extraction of Event Types

    Science.gov (United States)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.

  8. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has a...

  9. 15 CFR 911.4 - Use of the NOAA Data Collection Systems.

    Science.gov (United States)

    2010-01-01

    ..., system access mode, and, in the case of government agencies, cost-effectiveness. (c)(1) Except as... an operational nature, and to those requiring the unique capabilities of the Argos DCS, such as...) of this section must be based on such factors as satellite coverage, accuracy, data...

  10. Challenges in Developing Data Collection Systems in a Rapidly Evolving Higher Education Environment

    Science.gov (United States)

    Borden, Victor M. H.; Calderon, Angel; Fourie, Neels; Lepori, Benedetto; Bonaccorsi, Andrea

    2013-01-01

    Efforts to develop common higher education data standards are expanding both within and across countries. The institutional research (IR) community plays a critical role in assuring that these efforts capture the diverse manifestations of the postsecondary and tertiary education systems and promote responsible comparisons. This chapter provides…

  11. RFID: A Revolution in Automatic Data Recognition

    Science.gov (United States)

    Deal, Walter F., III

    2004-01-01

    Radio frequency identification, or RFID, is a generic term for technologies that use radio waves to automatically identify people or objects. There are several methods of identification, but the most common is to store a serial number that identifies a person or object, and perhaps other information, on a microchip that is attached to an antenna…

  12. Automatic programmers for solid set sprinkler irrigation systems

    OpenAIRE

    Zapata Ruiz, Nery; Salvador Esteban, Raquel; Cavero Campo, José; Lecina Brau, Sergio; Playán Jubillar, Enrique

    2012-01-01

    Introduction: The application of new technologies to the control and automation of irrigation processes has become very important in the last decade. Although automation of irrigation execution (irrigation programmers) is now widespread, the automatic generation and execution of irrigation schedules is receiving growing attention due to the possibilities offered by the telemetry / remote control systems currently being installed in collective pressurized networks. In this paper, a protot...

  13. Reliability data collection in the UK: the NCSR scheme

    International Nuclear Information System (INIS)

    Some of the general aspects of reliability data collection are discussed and illustrated with an example of a recent Safety and Reliability Systems (SRS) project concerned with high economic risk in the field of advanced manufacturing technology. Any data collection scheme must aim to provide the information required to enable the correct decisions to be taken in order to reach specified objectives. These objectives should be well defined and documented at the outset. In the current context they are improvements in the reliability of complex technical systems to improve economy of operation and/or safety. It may be for existing plant or for proposed systems in the design stage. The case study chosen contained examples of plant equipment in fuel reprocessing and active handling lines, for example robots, computer controlled cutting tools and automatic guided vehicles. New classification methods and the integration of several different computer applications packages were needed. (author)

  14. Should mortality data for the elderly be collected routinely in emergencies? The practical challenges of age-disaggregated surveillance systems.

    Science.gov (United States)

    du Cros, Philipp; Venis, Sarah; Karunakara, Unni

    2013-11-01

    Data on the elderly are rarely collected in humanitarian emergencies. During a refugee crisis in South Sudan, Médecins Sans Frontières developed a prospective mortality surveillance system collecting data for those aged ≥50 years and found that the elderly were dying at five times the rate of those aged 5-49 years. Practical and ethical issues arose. Were reported ages accurate? Since no baseline exists, what does the mortality rate mean? Should programmatic changes be made without evidence that these would reduce the elderly mortality rate? We outline issues to be addressed to enable informed decisions on response to elderly populations in emergency settings. PMID:24114674

  15. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variations in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic analysis of the association between variations in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search for SNPs in existing databases (e.g. dbSNP); (iii) the association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different
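
    The kind of SNP-response association described above is typically tested on a 2x2 contingency table (SNP carrier vs. non-carrier against responder vs. non-responder). Below is a minimal stdlib-only sketch using the Pearson chi-square statistic with one degree of freedom; the statistics offered by the actual tool may differ, and the cohort numbers are invented:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    e.g. rows = SNP carrier / non-carrier, cols = responder / non-responder."""
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return n * (a * d - b * c) ** 2 / den if den else 0.0

def associated(a, b, c, d, critical=3.841):
    """True when the association is significant at the 5% level (1 d.o.f.)."""
    return chi_square_2x2(a, b, c, d) > critical

# Invented cohort: 40 SNP carriers (30 respond) vs 40 non-carriers (12 respond).
chi2 = chi_square_2x2(30, 10, 12, 28)
```

    Here chi2 is about 16.2, well above the 3.841 critical value, so the invented SNP would be flagged as associated with drug response; for tables with small cell counts, Fisher's exact test is the usual alternative.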

  16. Making sense of the shadows: priorities for creating a learning healthcare system based on routinely collected data.

    Science.gov (United States)

    Deeny, Sarah R; Steventon, Adam

    2015-08-01

    Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering 'routinely collected data'-the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation.Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their 'data shadow'. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data. PMID:26065466

  17. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    Science.gov (United States)

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
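
The steps above (extract a data pattern, generate a representation, save) can be pictured with a small sketch. This is a hypothetical illustration, not the patented implementation: the function name, the descriptor layout, and the assumption that collective writes form a regular strided pattern are all mine.

```python
def extract_pattern(offsets, sizes):
    """Return a compact (start, stride, size, count) descriptor if the
    per-process write offsets form a regular strided pattern, else None."""
    if len(offsets) < 2 or len(set(sizes)) != 1:
        return None
    stride = offsets[1] - offsets[0]
    if any(b - a != stride for a, b in zip(offsets, offsets[1:])):
        return None
    # One small descriptor replaces per-write metadata for every process.
    return {"start": offsets[0], "stride": stride,
            "size": sizes[0], "count": len(offsets)}

# 4 collective processes each writing a 1 MiB block at a 4 MiB stride:
desc = extract_pattern([0, 4, 8, 12], [1, 1, 1, 1])
```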

  18. DATA INFORMATION SYSTEM TO PROMOTE THE ORGANIZATION OF DATA COLLECTIONS – MODELING CONSIDERATIONS WITH THE UNIFIED MODELING LANGUAGE (UML)

    Directory of Open Access Journals (Sweden)

    Eduardo Batista de Moraes Barbosa

    2011-05-01

    Full Text Available It can be argued that technological developments (e.g., measuring instruments like software, satellites and computers), as well as the cheapening of storage media, allow organizations to produce and acquire a great amount of data in a short time. Due to the data volume, research organizations become potentially vulnerable to the impacts of the information explosion. An adopted solution is the use of information system tools to assist data documentation, retrieval and analysis. In the scientific scope, these tools are developed to store different metadata (data about data) patterns. During the development process of these tools, the adoption of standards such as the Unified Modeling Language (UML) stands out, whose diagrams assist the different scopes of software modeling. The objective of this study is to present an information system tool that assists organizations in data documentation through the use of metadata and that highlights the software modeling process through the UML. The Standard for Digital Geospatial Metadata, widely used for dataset cataloging by scientific organizations around the world, will be approached, along with dynamic and static UML diagrams such as use cases, sequence and classes. The development of information system tools can be a way to promote scientific data organization and dissemination. However, the modeling process requires special attention during the development of the interfaces that will stimulate the use of the information system tools.

  19. A Framework for Detecting Fraudulent Activities in EDO State Tax Collection System Using Investigative Data Mining

    Directory of Open Access Journals (Sweden)

    Okoro F. M

    2016-05-01

    Full Text Available The Edo State Inland Revenue Service is overwhelmed with gigabytes of disk capacity containing data about taxpayers in the state. The data stored on the database increase in size at an alarming rate. This has resulted in a data-rich but information-poor situation, where there is a widening gap between the explosive growth of data and its types and the ability to analyze and interpret it effectively; hence the need for a new generation of automated and intelligent tools and techniques, known as investigative data mining, to look for patterns in data. These patterns can lead to new insights, competitive advantages for business, and tangible benefits for the State Revenue Service. This research work focuses on designing an effective fraud detection and deterrence architecture using investigative data mining techniques. The proposed system architecture is designed to reason using an Artificial Neural Network and machine learning algorithms in order to detect and deter fraudulent activities. We recommend that the architectural framework be developed using Object Oriented Programming and Agent Oriented Programming languages.

  20. Automatic data processing of nondestructive testing results

    International Nuclear Information System (INIS)

    The ADP system for the documentation of inservice inspection results of nuclear power plants is described. The same system can be used during the whole operational lifetime of the plant. To make this possible, the ADP system has to be independent of the type of hardware, data recording and software. The computer programs are written in the Fortran IV programming language. The results of nondestructive testing are recorded in an inspection register by ADP methods. Different outputs can be utilized for the planning, performance and reporting of inservice inspections. (author)

  1. An Automatic Indirect Immunofluorescence Cell Segmentation System

    OpenAIRE

    2014-01-01

    Indirect immunofluorescence (IIF) with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA) in systemic autoimmune diseases. The ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection for fluorescence patterns in an IIF image can assist physicians, without relevant experience, in making correct diagnosis. How to segment the cells from an IIF image is essential in developing an...

  2. Kerman Photovoltaic Power Plant R&D data collection computer system operations and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, P.B.

    1994-06-01

    The Supervisory Control and Data Acquisition (SCADA) system at the Kerman PV Plant monitors 52 analog, 44 status, 13 control, and 4 accumulator data points in real-time. A Remote Terminal Unit (RTU) polls 7 peripheral data acquisition units that are distributed throughout the plant once every second, and stores all analog, status, and accumulator points that have changed since the last scan. The R&D Computer, which is connected to the SCADA RTU via a RS-232 serial link, polls the RTU once every 5-7 seconds and records any values that have changed since the last scan. A SCADA software package called RealFlex runs on the R&D computer and stores all updated data values taken from the RTU, along with a time-stamp for each, in a historical real-time database. From this database, averages of all analog data points and snapshots of all status points are generated every 10 minutes and appended to a daily file. These files are downloaded via modem by PVUSA/Davis staff every day, and the data is placed into the PVUSA database.
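
The 10-minute roll-up step described above can be sketched as follows. This is an illustrative reconstruction, not RealFlex code; the (epoch_seconds, value) sample format is an assumption.

```python
from collections import defaultdict

def ten_minute_averages(samples):
    """Average irregularly spaced (epoch_seconds, value) samples into
    10-minute bins, mimicking the R&D computer's periodic roll-up."""
    bins = defaultdict(list)
    for t, v in samples:
        bins[t // 600].append(v)          # 600 s = 10 minutes
    return {b * 600: sum(vs) / len(vs) for b, vs in sorted(bins.items())}

# Samples arriving every few seconds land in the same bin until
# 600 s have elapsed, then start a new bin:
avgs = ten_minute_averages([(0, 1.0), (300, 3.0), (700, 5.0)])
```

Appending one such dictionary per interval to a daily file reproduces the shape of the data downloaded by the PVUSA/Davis staff.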

  3. From Automatic to Adaptive Data Acquisition

    DEFF Research Database (Denmark)

    Chang, Marcus

    2009-01-01

    Sensornets have been used for ecological monitoring the past decade, yet the main driving force behind these deployments are still computer scientists. The denser sampling and added modalities offered by sensornets could drive these fields in new directions, but not until the domain scientists become familiar with sensornets and use them as any other instrument in their toolbox. We explore three different directions in which sensornets can become easier to deploy, collect data of higher quality, and offer more flexibility, and we postulate that sensornets should be instruments for domain scientists. As a tool to ease designing and deploying sensornets, we developed a methodology to characterize mote performance and predict the resource consumption for applications on different platforms, without actually having to execute them. This enables easy comparison of different platforms. In order to reduce...

  4. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system

  5. An Automatic Indirect Immunofluorescence Cell Segmentation System

    Directory of Open Access Journals (Sweden)

    Yung-Kuan Chan

    2014-01-01

    Full Text Available Indirect immunofluorescence (IIF with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA in systemic autoimmune diseases. The ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection for fluorescence patterns in an IIF image can assist physicians, without relevant experience, in making correct diagnosis. How to segment the cells from an IIF image is essential in developing an automatic inspection system for ANA testing. This paper focuses on the cell detection and segmentation; an efficient method is proposed for automatically detecting the cells with fluorescence pattern in an IIF image. Cell culture is a process in which cells grow under control. Cell counting technology plays an important role in measuring the cell density in a culture tank. Moreover, assessing medium suitability, determining population doubling times, and monitoring cell growth in cultures all require a means of quantifying cell population. The proposed method also can be used to count the cells from an image taken under a fluorescence microscope.

  6. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    A system for the semi-automatic survey of drawings is presented. Its design has been oriented to the reduction of the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between successive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor permits the identification of starting points and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the described method is that precision depends practically only on the plotter performance, the sensor resolution affecting only the stroke thickness and the minimum distance between two strokes. (author)

  7. Recent advances in the automatic alignment system for the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, K; Awwal, A; Kalantar, D; Leach, R; Lowe-Webb, R; McGuigan, D; Kamm, V

    2010-12-08

    The automatic alignment system for the National Ignition Facility (NIF) is a large-scale parallel system that directs all 192 laser beams along the 300-m optical path to a 50-micron focus at the target chamber in less than 50 minutes. The system automatically commands 9,000 stepping motors to adjust mirrors and other optics based upon images acquired from high-resolution digital cameras viewing beams at various locations. Forty-five control loops per beamline request image processing services running on a LINUX cluster to analyze these images of the beams and references, and automatically steer the beams toward the target. This paper discusses the upgrades to the NIF automatic alignment system to handle new alignment needs and evolving requirements as related to the various types of experiments performed. As NIF becomes a continuously operated system and more experiments are performed, performance monitoring is increasingly important for maintenance and commissioning work. Data collected during operations are analyzed for laser tuning and for targeting maintenance work. Handling evolving alignment and maintenance needs is expected for the planned 30-year operational life of NIF.

  8. Design and microfabrication of new automatic human blood sample collection and preparation devices

    OpenAIRE

    Tran, Minh Nhut

    2015-01-01

    For self-sampling or collection of blood by health personnel related to point-of-care diagnostics in health rooms, it may often be necessary to perform automatic collection of blood samples. The most important operation when handling whole blood is to combine automatic sample collection with optimal mixing of anticoagulation liquid and weak fixatives. In particular before doing any transport of a sample or point-of-care nucleic acid diagnostics (PO...

  9. A Robot Based Automatic Paint Inspection System

    Science.gov (United States)

    Atkinson, R. M.; Claridge, J. F.

    1988-06-01

    The final inspection of manufactured goods is a labour intensive activity. The use of human inspectors has a number of potential disadvantages; it can be expensive, the inspection standard applied is subjective and the inspection process can be slow compared with the production process. The use of automatic optical and electronic systems to perform the inspection task is now a growing practice but, in general, such systems have been applied to small components which are accurately presented. Recent advances in vision systems and robot control technology have made possible the installation of an automated paint inspection system at the Austin Rover Group's plant at Cowley, Oxford. The automatic inspection of painted car bodies is a particularly difficult problem, but one which has major benefits. The pass line of the car bodies is ill-determined, the surface to be inspected is of varying surface geometry and only a short time is available to inspect a large surface area. The benefits, however, are due to the consistent standard of inspection which should lead to lower levels of customer complaints and improved process feedback. The Austin Rover Group initiated the development of a system to fulfil this requirement. Three companies collaborated on the project; Austin Rover itself undertook the production line modifications required for body presentation, Sira Ltd developed the inspection cameras and signal processing system and Unimation (Europe) Ltd designed, supplied and programmed the robot system. Sira's development was supported by a grant from the Department of Trade and Industry.

  10. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
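
The first exercise (a calibration line plus quality parameters) can be sketched as below. The system itself is Matlab-based, so this Python version is only an illustration; the 3.3·s/slope detection-limit estimate is a common textbook formula, not necessarily the one used by Goodle GMS.

```python
import numpy as np

def calibration_quality(conc, signal):
    """Least-squares calibration line plus common quality parameters:
    slope, intercept, r^2, and a 3.3*s/slope limit-of-detection estimate."""
    conc, signal = np.asarray(conc, float), np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    fitted = slope * conc + intercept
    ss_res = np.sum((signal - fitted) ** 2)
    ss_tot = np.sum((signal - signal.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    s_resid = np.sqrt(ss_res / (len(conc) - 2))   # residual std deviation
    lod = 3.3 * s_resid / slope                    # IUPAC-style estimate
    return slope, intercept, r2, lod

# Hypothetical calibration data (concentration vs. instrument signal):
slope, intercept, r2, lod = calibration_quality(
    [0, 1, 2, 3, 4], [0.02, 1.01, 1.98, 3.05, 4.00])
```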

  11. Study of a method for automatic abnormal-value discrimination and error correction in a real-time data acquisition system

    Institute of Scientific and Technical Information of China (English)

    王永国; 胡娇娇; 季学枫

    2013-01-01

    With the progress of science and technology, the research methods for studying the growth process of pigs at scale are becoming increasingly modern. Traditionally, manual methods of data collection were mostly used to study pig growth; these are not only troublesome but also easily cause a stress reaction in the pigs, affecting their normal growth. With the application of various sensors and modern communication technology in the pig industry, data collection is becoming more scientific and convenient. However, due to the particularity of the animals being measured, phenomena such as crowding, arching and colliding objectively occur during data acquisition, which can bias the collected data and affect subsequent analysis and research on pig growth traits; the collected data must therefore be corrected. This paper proposes combining a classical algorithm with a neural network to automatically identify and correct the collected data. Matlab simulation and practical application in the '9SC-05 automatic breeding and feed selection measurement equipment system for pigs' developed by the Anhui Bodhi Fruit company show that the method offers high error-correction accuracy, high speed and good adaptability.
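
The classical half of such a correction can be sketched with a rolling median/MAD filter over the sensor readings. This is an illustration only: the record's actual algorithm, its neural-network stage, and its thresholds are not reproduced here.

```python
def correct_outliers(series, window=5, k=3.0):
    """Flag and replace gross outliers (e.g. weight spikes when pigs crowd
    the scale) using a rolling median and MAD threshold; the neural-network
    stage described in the record is omitted from this classical-only sketch."""
    out = list(series)
    half = window // 2
    for i in range(len(series)):
        nb = series[max(0, i - half): i + half + 1]
        med = sorted(nb)[len(nb) // 2]
        mad = sorted(abs(x - med) for x in nb)[len(nb) // 2]
        if mad and abs(series[i] - med) > k * mad:
            out[i] = med            # replace the suspect reading
    return out

# A crowding spike (92.0 kg) in otherwise steady ~50 kg weight readings:
cleaned = correct_outliers([50.1, 50.2, 50.3, 92.0, 50.4, 50.5, 50.6])
```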

  12. Towards Automatic Capturing of Manual Data Processing Provenance

    OpenAIRE

    Wombacher, Andreas; Huq, Mohammad R.

    2011-01-01

    Often data processing is not implemented by a workflow system or an integration application but is performed manually by humans along the lines of a more or less specified procedure. Collecting provenance information during manual data processing cannot be automated. Further, manual collection of provenance information is error prone and time consuming. Therefore, we propose to infer provenance information based on the read and write access of users. The derived provenance information is comp...

  13. The automatic system of the personal dose control

    International Nuclear Information System (INIS)

    Full text: The automatic system of personal dose control (ASPDC) is a subsystem of the automatic radiation control system for a nuclear power plant (NPP). The goal of ASPDC is to provide complete on-line control of the internal and external occupational exposure, which is necessary to determine the dependence of the NPP personnel's health on the accumulated dose and to demonstrate the safety and controllability of ionizing radiation sources. This goal is achieved by implementing three types of monitoring in ASPDC: current, cumulative and emergency monitoring. The dose of external exposure is measured with electronic personal dosimeters AT-2503 (dose equivalent Hp(10) and dose rate equivalent of x-rays and gamma-rays) and with thermoluminescent dosimeters PDC-301. The dose of internal exposure is determined using the whole-body counter AT-1316. Current monitoring allows controlling the exposure dose per one visit to the restricted-access area (RAA). The exposure dose gained since the beginning of the current year is monitored on entrance to the RAA. The dose and dose rate of exposure are recorded during work and on exit from the RAA. The exposure dose accumulated during the stay in the RAA is also recorded upon exiting the RAA. Electronic dosimeters AT-2503 are used for the on-line monitoring. These dosimeters feature a digital display and audio and visual alarms, and allow minimizing the occupational exposure. Coupled with reading devices and a personal computer, they form an automated workstation AW AT-2503. Cumulative monitoring enables controlling the cumulative exposure dose accrued during the current year. It considers both internal and external exposure, and monitors neutron, beta and gamma irradiation. A system of passive thermoluminescent dosimeters PDC-301 and a whole-body counter AT-1316 are used to perform cumulative monitoring. Emergency monitoring facilitates exposure dose control during hazardous or recovery activities at NPP. The notable features of ASPDC

  14. Data Collection and New Technology

    OpenAIRE

    Olubunmi Philip Aborisade

    2013-01-01

    Interview has become a popular method of data collection in qualitative research. This article examines the different interview methods for collecting data (e.g., structured interviews, group interviews, unstructured, etc.), as well as the various methods for analyzing interview data (e.g., interpretivism, social anthropology, collaborative social research). It also evaluates the interview types and analysis methods in qualitative research and the new technology for conducting interviews such...

  15. The housekeeping data acquisition system in space detection

    International Nuclear Information System (INIS)

    The housekeeping data acquisition system in space detection is introduced. It is based on the micro-controller 80C196. The system can automatically collect data such as temperature, high-voltage power supply readings and event counts, and communicates data via a 1553B bus interface

  16. Data collection and examinations supporting assessment of factors inducing cracking in pressurized components of LWR systems

    International Nuclear Information System (INIS)

    The major objective of the project is to bring up to date and further verify existing information and knowledge of factors and conditions inducing cracking (strain-induced crack corrosion, stress crack corrosion, fatigue). For this purpose, systematic examinations have been carried out, and available results of inspections and in-service performance data of components of relevance to nuclear power plant safety have been reviewed. With respect to stress crack corrosion sensitivity and safety of reactor pressure vessel steels used in BWR systems, a major task was to evaluate and comment on an existing VGB report dealing with these aspects. The results of this review show unrestricted agreement with the general statement of the VGB report. (orig./CB)

  17. Automatic focusing system of BSST in Antarctic

    Science.gov (United States)

    Tang, Peng-Yi; Liu, Jia-Jing; Zhang, Guang-yu; Wang, Jian

    2015-10-01

    Automatic focusing (AF) technology plays an important role in modern astronomical telescopes. Based on the focusing requirements of BSST (Bright Star Survey Telescope) in Antarctica, an AF system was set up. In this design, functions in OpenCV are used to find stars, and the focus metric can be computed from the star area, HFD or FWHM, selectable by the user. A curve-fitting method is used to find the focus position as the camera moves. The whole design is suitable for an unattended small telescope.
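
The curve-fitting step can be sketched with a parabola fit to the focus metric versus focuser position. The abstract does not specify the fitting function, so the quadratic model and the FWHM numbers below are assumptions.

```python
import numpy as np

def best_focus(positions, fwhm):
    """Fit a parabola to star FWHM measured at several focuser positions
    and return the position minimising the fit (vertex at -b / 2a)."""
    a, b, c = np.polyfit(positions, fwhm, 2)
    return -b / (2.0 * a)

# Roughly V-shaped measurements symmetric around the true focus at 100:
pos = [80, 90, 100, 110, 120]
fw = [6.0, 3.5, 2.0, 3.5, 6.0]
focus = best_focus(pos, fw)
```

A telescope controller would then command the focuser to the returned position and optionally refine with a narrower sweep.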

  18. Automatic Battery Swap System for Home Robots

    OpenAIRE

    Juan Wu; Guifang Qiao; Jian Ge; Hongtao Sun; Guangming Song

    2012-01-01

    This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off‐line recharging and on‐line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm‐sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap stati...

  19. Automatic system for detecting pornographic images

    Science.gov (United States)

    Ho, Kevin I. C.; Chen, Tung-Shou; Ho, Jun-Der

    2002-09-01

    Due to the dramatic growth of network and multimedia technology, people can more easily obtain a variety of information using the Internet. Unfortunately, this also makes the diffusion of illegal and harmful content much easier. So it becomes an important topic for the Internet society to protect and safeguard Internet users, especially children, from content that may be encountered while surfing on the Net. Among such content, pornographic images cause the most serious harm. Therefore, in this study, we propose an automatic system to detect still colour pornographic images. Starting from this result, we plan to develop an automatic system to search for or to filter pornographic images. Almost all pornographic images share one common characteristic: the ratio of the size of the skin region to the non-skin region is high. Based on this characteristic, our system first converts the colour space from RGB to HSV so as to segment all possible skin-colour regions from the scene background. We also apply texture analysis to the selected skin-colour regions to separate skin regions from non-skin regions. Then we group the adjacent pixels located in skin regions. If the ratio is over a given threshold, the given image is classified as possibly pornographic. Based on our experiments, less than 10% of non-pornographic images are classified as pornography, and over 80% of the most harmful pornographic images are classified correctly.
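
The skin-ratio test at the core of the method can be sketched as follows. The HSV thresholds and the 0.4 decision threshold are illustrative assumptions, not the paper's values, and the paper's texture-analysis stage is omitted.

```python
import colorsys

def skin_ratio(rgb_pixels):
    """Fraction of pixels whose HSV values fall in an (illustrative)
    skin-colour range: hue below 50 deg, moderate saturation, bright enough."""
    skin = 0
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if h * 360 < 50 and 0.15 < s < 0.75 and v > 0.35:
            skin += 1
    return skin / len(rgb_pixels)

def flag_image(rgb_pixels, threshold=0.4):
    """Classify as possibly pornographic when the skin ratio exceeds the
    threshold (texture analysis from the paper is omitted here)."""
    return skin_ratio(rgb_pixels) > threshold

# 9 skin-toned pixels plus 1 blue pixel: skin ratio 0.9, so flagged.
suspect = flag_image([(224, 172, 105)] * 9 + [(0, 0, 255)])
```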

  20. Water Column Sonar Data Collection

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The collection and analysis of water column sonar data is a relatively new avenue of research into the marine environment. Primary uses include assessing biological...

  1. Distributed privacy preserving data collection

    KAUST Repository

    Xue, Mingqiang

    2011-01-01

    We study the distributed privacy preserving data collection problem: an untrusted data collector (e.g., a medical research institute) wishes to collect data (e.g., medical records) from a group of respondents (e.g., patients). Each respondent owns a multi-attributed record which contains both non-sensitive (e.g., quasi-identifiers) and sensitive information (e.g., a particular disease), and submits it to the data collector. Assuming T is the table formed by all the respondent data records, we say that the data collection process is privacy preserving if it allows the data collector to obtain a k-anonymized or l-diversified version of T without revealing the original records to the adversary. We propose a distributed data collection protocol that outputs an anonymized table by generalization of quasi-identifier attributes. The protocol employs cryptographic techniques such as homomorphic encryption, private information retrieval and secure multiparty computation to ensure the privacy goal in the process of data collection. Meanwhile, the protocol is designed to leak limited but non-critical information to achieve practicability and efficiency. Experiments show that the utility of the anonymized table derived by our protocol is on par with the utility achieved by traditional anonymization techniques. © 2011 Springer-Verlag.
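
The anonymity property the protocol aims for can be checked locally with a short sketch. This illustrates k-anonymity over generalized quasi-identifiers only; the cryptographic machinery (homomorphic encryption, private information retrieval, secure multiparty computation) is not reproduced, and the field names are invented.

```python
from collections import Counter

def is_k_anonymous(table, quasi_ids, k):
    """True if every combination of quasi-identifier values occurs in at
    least k records of the (already generalized) table."""
    groups = Counter(tuple(row[q] for q in quasi_ids) for row in table)
    return all(n >= k for n in groups.values())

# Generalizing exact ages and zip codes makes the two records identical
# on the quasi-identifiers, achieving 2-anonymity:
records = [
    {"age": "30-39", "zip": "021**", "disease": "flu"},
    {"age": "30-39", "zip": "021**", "disease": "cold"},
]
ok = is_k_anonymous(records, ["age", "zip"], k=2)
```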

  2. 24 CFR 902.60 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Data collection. 902.60 Section 902.60 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.60 Data collection. (a) Fiscal Year reporting...

  3. 20 CFR 653.109 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Data collection. 653.109 Section 653.109 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR SERVICES OF THE EMPLOYMENT SERVICE SYSTEM Services for Migrant and Seasonal Farmworkers (MSFWs) § 653.109 Data collection....

  4. 34 CFR 303.540 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.540 Section 303.540 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... DISABILITIES State Administration Reporting Requirements § 303.540 Data collection. (a) Each system...

  5. Automatic Railway Power Line Extraction Using Mobile Laser Scanning Data

    Science.gov (United States)

    Zhang, Shanxin; Wang, Cheng; Yang, Zhuang; Chen, Yiping; Li, Jonathan

    2016-06-01

    Research on power line extraction technology using mobile laser point clouds has important practical significance for railway power line patrol work. In this paper, we present a new method for automatically extracting railway power lines from MLS (Mobile Laser Scanning) data. Firstly, according to the spatial structure characteristics of the power lines and the trajectory, the significant data are segmented piecewise. Then, a self-adaptive spatial region-growing method is used to extract power lines parallel with the rails. Finally, PCA (Principal Component Analysis) combined with information entropy theory is used to judge whether a section of the power line is a junction or not and which type of junction it belongs to. A least-squares fitting algorithm is introduced to model the power line. An evaluation of the proposed method over a complicated railway point cloud acquired by a RIEGL VMX450 MLS system shows that the proposed method is promising.
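
The PCA-based junction test can be sketched as an eigenvalue-dominance check on a point segment: a straight span of wire has one dominant principal direction, while a junction does not. The 0.95 dominance ratio is an invented placeholder, and the paper's information-entropy component is omitted.

```python
import numpy as np

def is_linear_segment(points, ratio=0.95):
    """PCA-style test: a power-line section is 'linear' (no junction) when
    the largest eigenvalue of the 3-D covariance dominates the total."""
    pts = np.asarray(points, float)
    cov = np.cov(pts.T)
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending eigenvalues
    return bool(eig[0] / eig.sum() >= ratio)

# Points scattered tightly along one axis classify as a straight span:
line = [(x, 0.01 * (x % 2), 0.0) for x in range(20)]
straight = is_linear_segment(line)
```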

  6. QCS: Driving automatic data analysis programs for TFTR

    International Nuclear Information System (INIS)

    QCS (Queue Control System) executes on the VAX Cluster, driving programs which provide automatic analysis of per-shot data for the TFTR experiment at PPPL. QCS works in conjunction with site-specific programs to provide these noninteractive user programs with shot numbers for which all necessary conditions have been satisfied to permit processing. A typical condition is the existence of a particular data file or set of files for a shot. The user provides a boolean expression of the conditions upon which a shot number should be entered into a private Queue. The user program requests a ''ready-to-process'' shot number through a call to a specially provided function. If the specified Queue is empty, the program hibernates until another shot number is available
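
The boolean admission condition can be pictured with a short sketch (the file-inventory names are invented; QCS itself ran on a VAX cluster and is not reproduced here):

```python
def ready_shots(shot_numbers, condition):
    """Return shots whose boolean data-availability condition holds,
    mimicking how QCS admits a shot number into a user's private queue."""
    return [s for s in shot_numbers if condition(s)]

# Hypothetical condition: both the magnetics and interferometer files exist.
available = {"mag_101", "ifo_101", "mag_102"}        # fake file inventory
cond = lambda s: f"mag_{s}" in available and f"ifo_{s}" in available
queue = ready_shots([101, 102, 103], cond)
```

A user program would then block (hibernate, in QCS terms) whenever the returned queue is empty and wake when a new shot satisfies the condition.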

  7. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis
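
The report's central quantities, an event count and a demand or exposure time, combine into a rate estimate. As a toy illustration (the 90% normal-approximation interval is a standard textbook formula, not taken from the report):

```python
def event_rate(event_count, exposure_time):
    """Point estimate and naive 90% interval for a Poisson event rate
    from a count and an exposure (demand or operating) time."""
    rate = event_count / exposure_time
    half = 1.645 * (event_count ** 0.5) / exposure_time  # normal approx.
    return rate, max(0.0, rate - half), rate + half

# 5 pump failures observed over 1000 operating hours:
rate, lo, hi = event_rate(5, 1000.0)
```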

  8. Development of optical automatic positioning and wafer defect detection system

    International Nuclear Information System (INIS)

    The data of a wafer with defects can provide engineers with very important information and clues to improve the yield rate and quality in manufacturing. This paper presents a microscope automatic positioning and wafer detection system with a human-machine interface based on image processing and fuzzy inference algorithms. In the proposed system, an XY table is used to move the position of each die on 6 inch or 8 inch wafers. Then, a high-resolution CCD and a set of two-axis optical linear encoders are used to accurately measure the position on the wafer. Finally, the developed human-machine interface is used to display the current position of an actual wafer in order to complete automatic positioning, and a wafer map database can be created. In the process of defect detection, the CCD is used for image processing, and during preprocessing it is required to filter noise, acquire the defect characteristics, define the defective template, and then take the characteristic points of the defective template as the reference input for fuzzy inference. A high-accuracy optical automatic positioning and wafer defect detection system is thus constructed. This study focused on automatic detection of spots, scratches, and bruises, and attempted to reduce the time to detect defective die and improve the accuracy of determining the defects of semiconductor devices. (paper)

  9. Development of optical automatic positioning and wafer defect detection system

    Science.gov (United States)

    Tien, Chuen-Lin; Lai, Qun-Huang; Lin, Chern-Sheng

    2016-02-01

Data on wafer defects provide engineers with important information and clues for improving manufacturing yield and quality. This paper presents a microscope automatic positioning and wafer defect detection system with a human-machine interface based on image processing and fuzzy inference algorithms. In the proposed system, an XY table moves to the position of each die on 6-inch or 8-inch wafers. A high-resolution CCD and a set of two-axis optical linear encoders then accurately measure the position on the wafer. Finally, the developed human-machine interface displays the current position on the actual wafer to complete automatic positioning, and a wafer map database can be created. In the defect detection process, the CCD image is preprocessed to filter noise, acquire the defect characteristics and define the defect template, whose characteristic points then serve as the reference input for fuzzy inference. A high-accuracy optical automatic positioning and wafer defect detection system is thus constructed. This study focused on automatic detection of spots, scratches, and bruises, and attempted to reduce the time to detect defective dies and improve the accuracy of determining the defects of semiconductor devices.
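The fuzzy-inference step described in this record can be illustrated with a toy rule base. The sketch below is a generic Mamdani-style fragment, not the authors' implementation; the two features (defect area and contrast), the membership ramps and the thresholds are all invented for illustration.

```python
# Hypothetical fuzzy scoring of a defect candidate extracted from a die image.
# All breakpoints below are illustrative assumptions, not the paper's values.

def up(x, a, b):
    """Ramp membership: 0 below a, rising linearly to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def defect_score(area_px, contrast):
    """Combine two defect features into a [0, 1] severity score."""
    # Rule 1: large area AND high contrast -> strong defect evidence.
    r1 = min(up(area_px, 10, 80), up(contrast, 0.2, 0.6))
    # Rule 2: very high contrast alone is suspicious (thin scratches).
    r2 = up(contrast, 0.7, 0.9)
    # Mamdani-style OR of the rule activations.
    return max(r1, r2)

print(defect_score(100, 0.8))  # clear defect -> 1.0
```

In a real system the score would be compared against a defect/no-defect cut-off calibrated on labeled dies.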

  10. Using a participatory evaluation design to create an online data collection and monitoring system for New Mexico's Community Health Councils.

    Science.gov (United States)

    Andrews, M L; Sánchez, V; Carrillo, C; Allen-Ananins, B; Cruz, Y B

    2014-02-01

We present the collaborative development of a web-based data collection and monitoring plan for thirty-two county councils within New Mexico's health council system. The monitoring plan, a key component in our multiyear participatory statewide evaluation process, was co-developed with the end users: representatives of the health councils. Guided by the Institute of Medicine's Community Health Improvement Process framework, we first developed a logic model that delineated processes and intermediate systems-level outcomes in council development, planning, and community action. Through the online system, health councils reported data on intermediate outcomes, including policy changes and funds leveraged. The system captured data that were common across the health council system, yet was also flexible so that councils could report their unique accomplishments at the county level. A main benefit of the online system was the ability to assess intermediate outcomes across the health council system. Developing the system was not without challenges, including creating processes to ensure participation across a large rural state, building shared understanding of intermediate outcomes and indicators, and overcoming technological issues. Even with these challenges, however, the benefits of committing to participatory processes far outweighed the costs. PMID:24184843

  11. AUTOMATIC CONTROL SYSTEM FOR TORQUE NATIONAL STANDARD

    Directory of Open Access Journals (Sweden)

    J. Galván-Mancilla

    2004-04-01

Full Text Available The continuous development of technology and the increase of its complexity demand wider measurement intervals, greater exactness and a greater diversity of the standards used to establish the units or measuring systems. Torque metrology is of great importance, and the magnitude is in common use in industry, technical development and research. The realization, quantification and dissemination of this magnitude are tasks assigned to the Torque Laboratory of the Metrology National Center (CENAM) in Mexico. For the dissemination of this magnitude, the Torque National Standard relies on a system which, in its original design, was operated manually, resulting in a high consumption of man-hours during a calibration. This work presents the automation of the standard and the benefits of the automatic control system.

  12. Collection of Cancer Stage Data by Classifying Free-text Medical Reports

    OpenAIRE

    McCowan, Iain A.; Moore, Darren C.; Nguyen, Anthony N.; Bowman, Rayleen V; Clarke, Belinda E.; Duhig, Edwina E.; Fry, Mary-Jane

    2007-01-01

    Cancer staging provides a basis for planning clinical management, but also allows for meaningful analysis of cancer outcomes and evaluation of cancer care services. Despite this, stage data in cancer registries is often incomplete, inaccurate, or simply not collected. This article describes a prototype software system (Cancer Stage Interpretation System, CSIS) that automatically extracts cancer staging information from medical reports. The system uses text classification techniques to train s...

  13. Modeling of a data exchange process in the Automatic Process Control System on the base of the universal SCADA-system

    Science.gov (United States)

    Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.

    2016-04-01

In the present paper the authors discuss some ways of solving energy-saving problems in mechanical engineering. In the authors' opinion, one way of solving this problem is the integrated modernization of the power engineering objects of mechanical engineering companies, intended to increase the efficiency of energy supply control and to improve the commercial accounting of electric energy. The authors have proposed the use of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, we have built a mathematical model of the data exchange process between the measuring transformers and a universal SCADA system. The results of modeling show that the discussed equipment meets the requirements of the Standard and that the use of a universal SCADA system for these purposes is preferable and economically reasonable. For the modeling, the authors used the following software: MasterScada, Master OPC_DI_61850, OPNET.

  14. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

Within the scope of fabrication and in-service inspection of PWR steam generator tube bundles, FRAMATOME developed an automatic eddy current testing system, the TS600. Based on a minicomputer, the TS600 can digitize, store and process data in various ways, making it possible to perform several kinds of inspection: conventional in-service inspection, roll-area profilometry, etc. The TS600 can also be used to develop new methods of examination

  15. Quality assurance for screening mammography data collection systems in 22 countries.

    NARCIS (Netherlands)

    Klabunde, C.N.; Sancho-Garnier, H.; Broeders, M.E.A.C.; Thoresen, S.; Rodrigues, V.J.; Ballard-Barbash, R.

    2001-01-01

    OBJECTIVES: To document the mammography data that are gathered by the organized screening programs participating in the International Breast Cancer Screening Network (IBSN), the nature of their procedures for data quality assurance, and the measures used to assess program performance and impact. MET

  16. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosive. Innovative algorithms for likely-background calculation, a system based on neural networks for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data elaboration.

  17. Data Collection and New Technology

    Directory of Open Access Journals (Sweden)

    Olubunmi Philip Aborisade

    2013-05-01

Full Text Available Interviews have become a popular method of data collection in qualitative research. This article examines the different interview methods for collecting data (e.g., structured interviews, group interviews, unstructured interviews), as well as the various methods for analyzing interview data (e.g., interpretivism, social anthropology, collaborative social research). It also evaluates the interview types and analysis methods in qualitative research, along with the new technologies for conducting interviews, such as e-mail, telephone, Skype, webcam, Facebook chat, etc., to ascertain how they limit interviewees in giving a full picture of moral and ethical issues.

  18. 76 FR 58301 - Proposed Extension of Existing Information Collection; Automatic Fire Sensor and Warning Device...

    Science.gov (United States)

    2011-09-20

    ... Sensor and Warning Device Systems; Examination and Test Requirements ACTION: Notice of request for public... Coal Mining. OMB 1219-0145 has been renamed Automatic Fire Sensor and Warning Device Systems... to a task in July 2011; OMB 1219-0073 subsumed Sec. 75.1103-5(a)(2)(ii) Automatic fire sensor...

  19. Data collection system. Volume 1, Overview and operators manual; Volume 2, Maintenance manual; Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, R.B.; Bauder, M.E.; Boyer, W.B.; French, R.E.; Isidoro, R.J.; Kaestner, P.C.; Perkins, W.G.

    1993-09-01

Sandia National Laboratories (SNL) Instrumentation Development Department was tasked by the Defense Nuclear Agency (DNA) to record data on Tektronix RTD720 Digitizers on the HUNTERS TROPHY field test conducted at the Nevada Test Site (NTS) on September 18, 1992. This report contains an overview and description of the computer hardware and software that were used to acquire, reduce, and display the data. The document is divided into two volumes: an overview and operators manual (Volume 1) and a maintenance manual (Volume 2).

  20. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

Full Text Available The watershed, as the basic territorial unit for planning and management of water resources, requires proper delimitation of its catchment or drainage area. Faced with the lack of geographic information on the micro watersheds of the Casacay river hydrographic unit, this research aimed at the automatic delimitation of micro watersheds using Geographic Information Systems (GIS) techniques and data from the Shuttle Radar Topography Mission (SRTM) at 30 meters spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro watersheds were obtained with their respective codification, allowing continuity with the watershed standardization adopted by Ecuador's Water Secretariat. With the results of the investigation, the watershed information will be updated in greater detail, promoting the execution of tasks and activities related to the integrated management of the hydrographic unit studied
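Automatic delimitation of this kind usually starts from a flow-direction grid derived from the elevation data. The following is a minimal sketch of the standard D8 steepest-descent step on a toy grid; it is textbook logic, not code from the study, and the tiny DEM is invented.

```python
# D8 flow direction: each cell drains to its steepest-downslope neighbour.
def d8_direction(dem, r, c):
    """Return (dr, dc) toward the steepest downslope neighbour, or None for a pit/outlet."""
    best, step = 0.0, None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5   # diagonal neighbours are farther
                drop = (dem[r][c] - dem[rr][cc]) / dist
                if drop > best:
                    best, step = drop, (dr, dc)
    return step

dem = [[3.0, 2.0],
       [2.0, 1.0]]                # toy DEM sloping toward the lower-right corner
print(d8_direction(dem, 0, 0))   # (1, 1): drains diagonally downslope
```

Following these directions downstream from every cell yields the drainage network from which basins are delimited and then coded with the Pfafstetter scheme.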

  1. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a computer. Thus much time can be saved and the risk of wrong data entry is lowered. The program was easy to use, and no significant problems were encountered. The necessary hardware is a standard IBM-compatible desktop computer, Mitotoyu Digimatic (TM) calipers and a Mitotoyu Digimatic MUX-10 Multiplexer (TM).
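The direct caliper-to-database idea can be sketched in modern terms: measurement strings arriving from the instrument are parsed and written straight into a database with no manual transcription step. This is a hypothetical reconstruction; the "record_id,mm" line format and the SQLite backend are assumptions, not the BASIC/dBASE setup of the original.

```python
import sqlite3

def store_measurements(lines, conn):
    """Parse 'record_id,mm' measurement lines and insert them into a table."""
    conn.execute("CREATE TABLE IF NOT EXISTS anthro (record_id TEXT, mm REAL)")
    for line in lines:
        rec_id, mm = line.strip().split(",")
        conn.execute("INSERT INTO anthro VALUES (?, ?)", (rec_id, float(mm)))
    conn.commit()

# Simulated caliper output; in practice these lines would arrive over the
# instrument's serial/multiplexer interface rather than a Python list.
conn = sqlite3.connect(":memory:")
store_measurements(["S01,182.4", "S02,175.9"], conn)
```

Because each reading goes straight from the instrument into a typed database column, transcription errors are eliminated, which is exactly the benefit the record describes.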

  2. An automatic monitoring system of leak current for testing TGC detectors based on LabVIEW

    International Nuclear Information System (INIS)

An automatic monitoring system of leak current for testing TGC detectors under high voltage was set up using the graphical LabVIEW platform and an NI 4351 data acquisition card. The leak current is automatically monitored and recorded by this system, and the time and value of the leak current are displayed instantly. Good monitoring efficiency and precision were obtained. (authors)

  3. AUTOMATICALLY CONVERTING TABULAR DATA TO RDF: AN ONTOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Kumar Sharma

    2015-07-01

Full Text Available Information residing in relational databases and delimited file systems is inadequate for reuse and sharing over the web. These file systems do not adhere to commonly accepted principles for maintaining data harmony. For these reasons, web resources suffer from a lack of uniformity as well as from heterogeneity and redundancy. Ontologies have been widely used to solve such problems, as they help in extracting knowledge out of any information system. In this article, we focus on extracting concepts and their relations from a set of CSV files. These files are treated as individual concepts and grouped into a particular domain, called the domain ontology. Furthermore, this domain ontology is used for capturing CSV data, represented in RDF format while retaining the links among files or concepts. Datatype and object properties are automatically detected from header fields, which reduces the need for user involvement in generating mapping files. A detailed analysis has been performed on baseball tabular data, and the result shows a rich set of semantic information
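The header-driven mapping can be miniaturized as follows. In this sketch a column whose header ends in "_id" is taken to be an object property (a link to another resource) and every other column a datatype property; that naming convention, the base URI, and the N-Triples output are assumptions made for the example, not the paper's actual detection rules.

```python
import csv
import io

BASE = "http://example.org/"  # assumed base URI for the generated resources

def csv_to_ntriples(concept, text):
    """Turn CSV rows into N-Triples, inferring property kinds from headers."""
    triples = []
    for i, row in enumerate(csv.DictReader(io.StringIO(text))):
        subj = f"<{BASE}{concept}/{i}>"
        for col, val in row.items():
            pred = f"<{BASE}prop/{col}>"
            if col.endswith("_id"):               # object property -> resource link
                triples.append(f"{subj} {pred} <{BASE}{val}> .")
            else:                                  # datatype property -> literal
                triples.append(f'{subj} {pred} "{val}" .')
    return triples

triples = csv_to_ntriples("player", "name,team_id\nRuth,team/3\n")
```

Each CSV file thus becomes one concept, and the "_id" columns preserve the links between files that the record emphasizes.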

  4. An efficient automatic firearm identification system

    Science.gov (United States)

    Chuan, Zun Liang; Liong, Choong-Yeun; Jemain, Abdul Aziz; Ghani, Nor Azura Md.

    2014-06-01

An automatic firearm identification system (AFIS) is highly demanded in forensic ballistics to replace the traditional approach, which uses a comparison microscope and is relatively complex and time consuming. Thus, several AFIS have been developed for commercial and testing purposes. However, those AFIS are still unable to overcome some of the drawbacks of the traditional firearm identification approach. The goal of this study is to introduce another efficient and effective AFIS. A total of 747 firing pin impression images captured from five different pistols of the same make and model are used to evaluate the proposed AFIS. It was demonstrated that the proposed AFIS is capable of producing a firearm identification accuracy rate of over 95.0% with an execution time of less than 0.35 seconds per image.

  5. Reliability of the TJ-II Power Supply System: collection and analysis of the operational experience data

    International Nuclear Information System (INIS)

An effort to develop a fusion-specific component reliability database is being carried out by international organizations such as EURATOM and the International Energy Agency (IEA). Moreover, several fusion-related devices are involved in the collection of operational experience. In this framework, TJ-II is performing a reliability assessment of its main systems, initiating the evaluation process with the Power Supply System (PSS). The PSS was chosen because it is one of the most critical systems in TJ-II. During a TJ-II pulse, the provision of magnetic fields requires a total power of almost 50 MW. This power is supplied by the national grid and a 140 MVA flywheel generator (100 MJ, 15 kV, 100 Hz). Since the TJ-II loads require direct current, a set of thyristor converters, transformers and circuit breakers is used for each coil system. Power requirements range from 5 kA (100 V) to 32 kA (1000 V). The heating systems (ECRH and NBI) also load the PSS with an additional 15 MVA. Failure data for these main components and for components of auxiliary systems (rectifier cooling, uninterrupted power supply system, control system ...) have been collected from PSS operation notes and personnel interviews. Data related to general operation, campaign schedules, and maintenance have been collected from TJ-II engineering annotations, the TJ-II web-based electronic board and TJ-II campaign archives. A database with date, time, pulse number, failure description, failure mode and other related information has been implemented. Failures and malfunctions in the PSS have been identified and processed, including information on failure modes and, where possible, causes of the failures. About 1700 failures and malfunctions were identified in the period May 1998 - Dec 2004 (1309 of them on operational days and 381 during tests or maintenance tasks). Most malfunctions come from spurious signals over the circuit breaker and from the generation system. Main

  6. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-01-01

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology. PMID:20375445
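The generation idea can be shrunk to a few lines: build the accessor class, including its validity checks, from an abstract attribute specification instead of writing it by hand. The toy "Peak" model below is invented for illustration and has nothing like the scope of the real UML-driven Memops framework.

```python
# A specification maps attribute names to their required types; in Memops the
# equivalent information comes from a UML data model rather than a dict.
SPEC = {"Peak": {"height": float, "assignment": str}}

def make_class(name, attrs):
    """Generate a class whose constructor enforces the specification."""
    def __init__(self, **kw):
        for attr, typ in attrs.items():
            val = kw[attr]
            if not isinstance(val, typ):      # generated validity check
                raise TypeError(f"{attr} must be {typ.__name__}")
            setattr(self, attr, val)
    return type(name, (), {"__init__": __init__})

Peak = make_class("Peak", SPEC["Peak"])
p = Peak(height=1.5, assignment="HN")   # passes the generated checks
```

Changing the specification regenerates the class, so the access code can never drift out of sync with the data model, which is the consistency property the abstract stresses.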

  7. Collective Enrichment of OpenStreetMap Spatial Data Through Vehicles Equipped with Driver Assistance Systems

    OpenAIRE

    Sachdeva, Arjun

    2015-01-01

    Navigation systems are one of the most commonly found electronic gadgets in modern vehicles nowadays. Alongside navigation units this technology is made readily available to individuals in everyday devices such as a mobile phone. Digital maps which come preloaded on these devices accommodate within them an extensive dataset of spatial information from around the globe which aids the driver in achieving a well guided driving experience. Apart from being essential for navigation this sensor inf...

  8. QuaDoSta - a freely configurable system which facilitates multi-centric data collection for healthcare and medical research

    Directory of Open Access Journals (Sweden)

    Albrecht, Ulrike

    2007-07-01

Full Text Available This article describes QuaDoSta (quality assurance, documentation and statistics), a flexible documentation system as well as a data collection and networking platform for medical facilities. The user can freely define the required documentation masks, which are easily expandable and can be adapted to individual requirements without the need for additional programming. To avoid duplication, data transfer interfaces can be configured flexibly to external sources such as the patient management systems used in surgeries or hospital information systems. The projects EvaMed (Evaluation Anthroposophical Medicine) and the Network Oncology are two scientific research projects which have been successfully established as nationally active networks on the basis of QuaDoSta. The EvaMed Network serves as a modern pharmacovigilance project for the documentation of adverse drug events. All prescription data are electronically recorded to assess the relative risk of drugs. The Network Oncology was set up as a documentation system in four hospitals and seven specialist oncology practices, where a complete record of all oncological therapies is carried out to uniform standards on the basis of the 'basic documentation for tumour patients' (BDT) developed by the German Cancer Society. The QuaDoSta solution made it possible to cater for the specific requirements of the presented projects. The following features of the system proved to be highly advantageous: flexible setup of catalogues, user-friendly customisation and extensions, complete dissociation of system setup and documentation content, multi-centre networkability, and configurable data transfer interfaces.

  9. Automatic data distribution for massively parallel processors

    OpenAIRE

    García Almiñana, Jordi

    1997-01-01

Massively Parallel Processor systems provide the computational power required to solve most large-scale High Performance Computing applications. Machines with physically distributed memory are a cost-effective way to achieve this performance; however, these systems are very difficult to program and tune. In a distributed-memory organization each processor has direct access to its local memory, and indirect access to the remote memories of other processors. But the cost of accessing a local m...

  10. Platform attitude data acquisition system

    Digital Repository Service at National Institute of Oceanography (India)

    Afzulpurkar, S.

A system for automatic acquisition of underwater platform attitude data has been designed, developed and tested in the laboratory. It is a microcontroller-based system interfacing a dual-axis inclinometer, a high-resolution digital compass...

  11. Design of a real-time tax-data monitoring intelligent card system

    Science.gov (United States)

    Gu, Yajun; Bi, Guotang; Chen, Liwei; Wang, Zhiyuan

    2009-07-01

To solve the current problem of low efficiency in domestic oil stations' information management, a real-time tax-data monitoring system has been developed to automatically access the tax data of oil pumping machines, realizing real-time automatic data collection, display and storage. The monitoring system uses non-contact intelligent cards or the network to directly collect data that cannot be artificially modified, thus sealing loopholes and improving the automation level of tax collection. It performs real-time collection and management of oil station information, finds problems promptly, and achieves automatic management of the entire process covering oil sales, accounting and reporting. It can also perform remote queries of an oil station's operation data. This system has broad application prospects and economic value.

  12. Automation of plasma-process fultext bibliography databases. An on-line data-collection, data-mining and data-input system

    International Nuclear Information System (INIS)

Searching for relevant data, information retrieval, data extraction and data input are time- and resource-consuming activities in most data centers. Here we develop a Linux system automating the process in the case of bibliography, abstract and fulltext databases. The present system is an open-source, free-software, low-cost solution that connects the target and provider databases in cyberspace through various web publishing formats. The abstract/fulltext relevance assessment is interfaced to external software modules. (author)

  13. Automatic Generation of Symbolic Model for Parameterized Synchronous Systems

    Institute of Scientific and Technical Information of China (English)

    Wei-Wen Xu

    2004-01-01

With the purpose of making the verification of parameterized systems more general and easier, in this paper a new and intuitive language, PSL (Parameterized-system Specification Language), is proposed to specify a class of parameterized synchronous systems. From a PSL script, an automatic method is proposed to generate a constraint-based symbolic model. The model can concisely represent the collections of global states symbolically by counting the number of processes in a given state. Moreover, a theorem has been proved that there is a simulation relation between the original system and its symbolic model. Since abstract and symbolic techniques are exploited in the symbolic model, the state-explosion problem of traditional verification methods is efficiently avoided. Based on the proposed symbolic model, a reachability analysis procedure has been implemented in ANSI C++ on the UNIX platform. Thus a complete tool for verifying parameterized synchronous systems has been obtained and tested on several cases. The experimental results show that the method is satisfactory.
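The counting representation described above can be sketched directly: a global state is a vector of process counts per local state, and a transition moves one process between local states when its guard holds. The two-state mutual-exclusion protocol below is an invented example, not a PSL model, and the sketch is in Python rather than the tool's C++.

```python
from collections import deque

def reachable(rules, start):
    """BFS over counter vectors; each rule is (from_state, to_state, guard)."""
    seen, queue = {start}, deque([start])
    while queue:
        cnt = queue.popleft()
        for src, dst, guard in rules:
            if cnt[src] > 0 and guard(cnt):
                nxt = list(cnt)
                nxt[src] -= 1       # one process leaves src ...
                nxt[dst] += 1       # ... and enters dst
                nxt = tuple(nxt)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Local states: 0 = idle, 1 = critical; entry is guarded by "critical is empty".
rules = [(0, 1, lambda c: c[1] == 0), (1, 0, lambda c: True)]
states = reachable(rules, (3, 0))   # three identical processes, all idle
```

The reachable set stays tiny no matter how many identical processes are counted, which is the point of the abstraction: checking that every reachable vector satisfies `c[1] <= 1` verifies mutual exclusion.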

  14. Human-system Interfaces for Automatic Systems

    Energy Technology Data Exchange (ETDEWEB)

    OHara, J.M.; Higgins,J. (BNL); Fleger, S.; Barnes V. (NRC)

    2010-11-07

Automation is ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Automation is applied to a wide range of functions, including monitoring and detection, situation assessment, response planning, and response implementation. Automation has become a 'team player' supporting personnel in nearly all aspects of system operation. In light of its increasing use and importance in new and future plants, guidance is needed to conduct safety reviews of the operator's interface with automation. The objective of this research was to develop such guidance. We first characterized the important HFE aspects of automation along six dimensions: levels, functions, processes, modes, flexibility, and reliability. Next, we reviewed literature on the effects of all of these aspects of automation on human performance and on the design of human-system interfaces (HSIs). Then, we used the technical basis established from the literature to identify general principles for human-automation interaction and to develop review guidelines. The guidelines cover the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, our study identified several topics for additional research.

  15. Automatic plant start-up system for nuclear power station

    International Nuclear Information System (INIS)

An automatic plant start-up system using a process computer has been applied to Unit No. 1 of the Onagawa Nuclear Power Station, Tohoku Electric Power Co., Inc. This is the world's first commercial system of its kind for LWRs. Turbine start-up and power control by reactor recirculation flow are automated to reduce operators' labor and to improve the efficiency and accuracy of plant operation. The test data and the results of practical operation have proved that the performance of the system is satisfactory. Major functions, the configuration, and the results of performance tests at the factory and on site are presented here. (author)

  16. 2013 International Conference on Mechatronics and Automatic Control Systems

    CERN Document Server

    2014-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanic manufacturing systems and automation. The book presents papers from the 2013 International Conference on Mechatronics and Automatic Control Systems held in Hangzhou, China on August 10-11, 2013. .

  17. Automatic Extraction of Metadata from Scientific Publications for CRIS Systems

    Science.gov (United States)

    Kovacevic, Aleksandar; Ivanovic, Dragan; Milosavljevic, Branko; Konjovic, Zora; Surla, Dusan

    2011-01-01

    Purpose: The aim of this paper is to develop a system for automatic extraction of metadata from scientific papers in PDF format for the information system for monitoring the scientific research activity of the University of Novi Sad (CRIS UNS). Design/methodology/approach: The system is based on machine learning and performs automatic extraction…

  18. The BENTO Box: Development and field-testing of a new satellite-linked data collection system for multiparameter volcano monitoring

    Science.gov (United States)

    Roman, D. C.; Behar, A.; Elkins-Tanton, L. T.

    2014-12-01

Predicting volcanic activity requires continuous monitoring for signals of magmatic unrest in harsh, often remote environments. BENTO is a next-generation monitoring system, currently in prototype testing, that is highly portable, low-cost, rapidly deployable, and entirely autonomous. Such a system could be used to provide critical monitoring and data collection capabilities during rapid-onset eruptions, or to provide a crude baseline monitor at large numbers of remote volcanoes to 'flag' the onset of unrest so that costlier resources such as specialized instrumentation can be deployed in the appropriate place at the appropriate time. The BENTO 1 (low-rate data) prototype currently comprises off-the-shelf volcanic gas sensors (SO2, CO2, Fl, Cl, and Br), a weather station (temperature, wind speed, wind direction, rainfall, humidity, pressure), and telemetry via Iridium modem. In baseline mode, BENTO 1 takes a measurement from all of its sensors every two hours and automatically sends the measurements through Iridium to a server that posts them to a dedicated and customizable web page. The measurement interval and other sensor parameters (pumping time, sensor constants) can be adjusted directly or remotely (through the Iridium network) as needed. Currently, BENTO 1 is deployed at Mt. Etna, Italy; Telica Volcano, Nicaragua; Hengill Volcano, Iceland; and Hekla Volcano, Iceland. The BENTO 2 (high-rate) system is motivated by a need to avoid having to telemeter raw seismic data, which at 20-100 Hz/channel is far too voluminous for cost- and power-effective transmission through satellite networks such as Iridium. Our solution is to regularly transmit only state-of-health information and descriptions of the seismic data (e.g., 'triggered' seismic event rates and amplitudes), rather than the data itself. The latter can be accomplished through on-board data analysis and reduction at the installation site. Currently, it is possible to request specific time segments of raw
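The on-board reduction step described for BENTO 2 can be illustrated with the classic STA/LTA trigger, a standard way of counting seismic events; only the event count, a few bytes, would then be telemetered instead of the raw trace. The window lengths, threshold, and synthetic data below are assumptions for illustration, not the instrument's actual parameters.

```python
def count_triggers(samples, sta_n=5, lta_n=50, threshold=4.0):
    """Count events where the short-term average exceeds threshold x long-term average."""
    events, armed = 0, True
    for i in range(lta_n, len(samples)):
        sta = sum(abs(x) for x in samples[i - sta_n:i]) / sta_n   # short-term average
        lta = sum(abs(x) for x in samples[i - lta_n:i]) / lta_n   # long-term average
        ratio = sta / lta if lta else 0.0
        if ratio > threshold and armed:
            events += 1          # rising edge: count one new event
            armed = False
        elif ratio < threshold:
            armed = True         # re-arm once the transient has passed
    return events

quiet = [1.0] * 200                                   # background noise only
burst = [1.0] * 100 + [20.0] * 10 + [1.0] * 100      # one strong transient
```

Summaries like this event count, together with peak amplitudes, are exactly the kind of "description of the seismic data" that fits comfortably within an Iridium message budget.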

  19. Discrete Model Reference Adaptive Control System for Automatic Profiling Machine

    OpenAIRE

    Peng Song; Guo-kai Xu; Xiu-chun Zhao

    2012-01-01

An automatic profiling machine is a movement system with a high degree of parameter variation and a high frequency of transient processes, and it requires accurate and timely control. In this paper, the discrete model reference adaptive control system of an automatic profiling machine is discussed. Firstly, the model of the automatic profiling machine is presented according to the parameters of the DC motor. Then the design of the discrete model reference adaptive control is proposed, and the control rules...

  20. The comparison of milk production and quality in cows from conventional and automatic milking systems

    Directory of Open Access Journals (Sweden)

    Renata Touov

    2014-12-01

    Full Text Available The objective of this study was to evaluate the effects of two different types of milking systems (conventional parlour vs. automatic milking system) and the season of the year on the composition and hygienic quality of milk from Czech Fleckvieh cows. A total of 500 cows were involved: 200 in the conventional and 300 in the automatic milking system. Bulk milk samples were collected for 12 months from July 2010 to June 2011. The following milk components and quality indicators were determined: % of fat, % of protein, % of lactose, % of fat-free dry matter (FFDM), % of casein, urea content, somatic cell count (SCC), total germ count (TGC) and milk freezing point (FP). The data were processed and evaluated with MS Excel and the statistical software SAS 9.1. Significantly higher (P<0.05-0.01) contents of fat, protein, FFDM and casein and increased TGC were observed in the automatic milking system, whereas SCC and FP were significantly lower (P<0.01). The highest contents of fat, protein and casein, and the lowest lactose content, were found in the winter season. The highest contents of FFDM, urea and SCC were observed in autumn, whereas TGC was highest in summer (P<0.05-0.01). Only FP was not influenced by the season.

  1. Automatic spectral transmittance measurement system for DWDM filters

    Science.gov (United States)

    Chang, Gao-Wei; Heish, Ming-Yu

    2003-08-01

    For many years, fiber-optic communication has been an essential part of the development of our modern society. Its significance comes from the increasing demands for real-time image transmission, multimedia communication, distance learning, video-conferencing, video telephony, cable TV, etc. This paper develops an automatic transmittance measurement system for DWDM (dense wavelength division multiplexing) filters. In this system, a grating-based monochromator is devised to generate a collection of monochromatic light at various wavelengths, instead of using an expensive tunable laser. With this approach, the cost of the proposed system is about an order of magnitude lower than that of systems with the same functions. In addition, we simulate the spectral filtering to investigate the resolving power of the system. Our simulations give quite satisfactory results.
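    At its core such a measurement reduces to a ratio of detector readings at each monochromator wavelength. A sketch under that assumption (the paper's actual signal chain is not described in the abstract):

```python
import numpy as np

def transmittance_spectrum(i_filter, i_reference, floor=1e-12):
    """Spectral transmittance of the device under test at each monochromator
    wavelength: detector reading with the DWDM filter in the beam divided by
    the reference reading without it, returned as a ratio and in dB."""
    t = np.asarray(i_filter, float) / np.maximum(np.asarray(i_reference, float), floor)
    return t, 10.0 * np.log10(np.maximum(t, floor))
```

A reading half the reference level corresponds to about -3 dB of insertion loss.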

  2. Fast Automatic Precision Tree Models from Terrestrial Laser Scanner Data

    Directory of Open Access Journals (Sweden)

    Mathias Disney

    2013-01-01

    Full Text Available This paper presents a new method for quickly and automatically constructing precision tree models from point clouds of the trunk and branches obtained by terrestrial laser scanning. The input of the method is a point cloud of a single tree scanned from multiple positions. The surface of the visible parts of the tree is robustly reconstructed by making a flexible cylinder model of the tree. The thorough quantitative model also records the topological branching structure. In this paper, every major step of the whole model reconstruction process, from the input to the finished model, is presented in detail. The model is constructed by a local approach in which the point cloud is covered with small sets corresponding to connected patches in the tree surface. The neighbor-relations and geometrical properties of these cover sets are used to reconstruct the details of the tree and, step by step, the whole tree. The point cloud and the sets are segmented into branches, after which the branches are modeled as collections of cylinders. From the model, the branching structure and size properties, such as volume and branch size distributions, can be approximated for the whole tree or any of its parts. The approach is validated using both measured and modeled terrestrial laser scanner data from real trees and detailed 3D models. The results show that the method allows easy extraction of various tree attributes from terrestrial or mobile laser scanning point clouds.
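    One building block of such cylinder modelling is fitting a circle to a cross-sectional slice of branch points. A least-squares (Kasa) circle fit, shown here as an illustrative sketch rather than the authors' exact procedure:

```python
import numpy as np

def fit_circle(xy):
    """Algebraic (Kasa) least-squares circle fit to 2-D points, e.g. a thin
    slice through a branch. Solves x^2 + y^2 = 2*a*x + 2*b*y + c in the
    least-squares sense, where (a, b) is the centre and c = r^2 - a^2 - b^2."""
    xy = np.asarray(xy, float)
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    b = (xy ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), r
```

Stacking such fits along a branch axis yields the cylinder radii the model is built from.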

  3. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Science.gov (United States)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information about the terrain surface; LiDAR is becoming one of the important tools in the geosciences for studying objects on the earth's surface. Classification of LiDAR data to extract ground, vegetation, and buildings is a very important step needed in numerous applications such as 3D city modelling, extraction of derived data for geographical information systems (GIS), mapping, navigation, etc. Regardless of what the scan data will be used for, an automatic process is greatly needed to handle the large amount of data collected, because manual processing is time consuming and very expensive. This paper presents an approach for automatic classification of aerial LiDAR data into five classes (buildings, trees, roads, linear objects and soil) using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height variation analysis are adopted to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used, on the one hand, to identify the upper and lower parts of each building in an urban scene, needed to model building façades, and, on the other hand, to extract the point cloud of uniform surfaces, which contains roofs, roads and ground, used in the second phase of classification. A second algorithm is developed to segment the uniform surfaces into building roofs, roads and ground; this second phase of classification is likewise based on topological relationships and height variation analysis. The proposed approach has been tested on two areas: the first a housing complex and the second a primary school. It led to successful classification of the building, vegetation and road classes.
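    A toy version of the height-variation test used for the preliminary segmentation might look like this; the cell size and threshold are illustrative, not values from the paper:

```python
import numpy as np

def label_cells_by_height_variation(points, cell=1.0, max_dz=0.2):
    """Rasterise an (N, 3) point cloud into ground cells and call a cell
    'uniform' (roof/road/ground candidate) when the spread of z inside it
    is small, 'non-uniform' (vegetation, edges) otherwise."""
    points = np.asarray(points, float)
    keys = np.floor(points[:, :2] / cell).astype(int)
    labels = {}
    for key in {tuple(k) for k in keys}:
        z = points[(keys == key).all(axis=1), 2]
        labels[key] = "uniform" if z.max() - z.min() <= max_dz else "non-uniform"
    return labels
```

Uniform cells would then be passed to the second classification phase, non-uniform cells to the vegetation branch.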

  4. Automatic data acquisition and processing with the APEX goniometer, PDP 11/03 and IBM 370 computer, with application to surface texture studies of magnox fuel cladding

    International Nuclear Information System (INIS)

    This report is written in two parts and is in the form of a working manual enabling the user to operate the described system and make modifications to suit individual requirements. Part 1 describes the general procedures required for automatic data acquisition and processing incorporating the APEX goniometer and the PDP11/03 and IBM370/168 computers. A listing of the program illustrating the retrieval of data from the PDP11/03 floppy disc system is also included. Part 2 describes in detail the application of automatic data collection to texture studies of magnox fuel cladding. It is designed to enable the user to understand the method of data collection and the use of the computer facilities at Harwell, including obtaining a graphical display via the GHOST system. This section incorporates a listing of the display program and the results obtained from the magnox fuel cladding. (author)

  5. Meteorological observatory for Antarctic data collection

    International Nuclear Information System (INIS)

    In recent years, a great number of automatic weather stations were installed in Antarctica, with the aim of closely examining the weather and climate of this region and improving the coverage of measuring points on the Antarctic surface. In 1987 the Italian Antarctic Project started to set up a meteorological network in an area not completely covered by other countries. Some of the activities performed by the meteorological observatory are presented, concerning technical functions such as maintenance of the AWSs and the execution of radio soundings, as well as scientific purposes such as validation and elaboration of collected data. Finally, some climatological considerations on the thermal behaviour of the Antarctic troposphere, such as the 'coreless winter', and on the wind field, including katabatic flows in North Victoria Land, are described

  6. RAMS data collection under Arctic conditions

    International Nuclear Information System (INIS)

    Reliability, availability, maintainability and supportability (RAMS) analysis is an important step in the design and operation of production processes and technology. Historical data such as times between failures and times to repair play an important role in such analysis. The data must reflect the conditions that the equipment has experienced during its operating time. To understand precisely the conditions experienced, all factors influencing the failure and repair processes of a production facility in an Arctic environment need to be identified and collected in the database. However, little attention is paid to collecting the effects of influence factors in RAMS databases. Hence, the aim of this paper is to discuss the challenges of the available methods of data collection and to suggest a methodology for data collection that considers the effect of environmental conditions. Application of the methodology will make the historical RAMS data of a system more applicable and useful for the design and operation of the system in different types of operational environments. - Highlights: • The challenges related to use of the available RAMS data are discussed. • It is important to collect information about operational conditions in RAMS data. • A methodology for RAMS data collection considering environmental conditions is suggested. • Information about influence factors will make the results of RAMS analysis more applicable

  7. Reactor protection system with automatic self-testing and diagnostic

    International Nuclear Information System (INIS)

    A reactor protection system is disclosed having four divisions, with quad-redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on the need for a reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 divisional vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection of and discrimination against failed sensors allows the reactor protection system to automatically enter a known state when sensor failures occur. Cross-communication of sensor readings allows comparison of four theoretically 'identical' values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self-testing and diagnostic monitoring, from sensor input through the output relay logic, virtually eliminate the need for manual surveillance testing. This provides the ability for each division to cross-check all divisions and to sense failures of the hardware logic. 16 figs
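    The two voting levels described above can be sketched directly; the exact spare-selection logic is an assumption, since the abstract states only "2/3 with spare" and "2/4":

```python
def divisional_trip(trips, failed):
    """Software 2-out-of-3-with-spare vote inside one division: sensors
    flagged as failed are excluded, the fourth healthy channel acts as the
    spare, and the division trips when at least 2 of the 3 voting channels
    demand a scram."""
    healthy = [t for t, f in zip(trips, failed) if not f]
    voting = healthy[:3]          # three channels vote; any extra is the spare
    return sum(voting) >= 2

def reactor_scram(division_trips):
    """Hardware-logic-panel 2-out-of-4 vote across the divisional signals."""
    return sum(division_trips) >= 2
```

Discriminating against a failed sensor thus never deprives the division of a quorum, which is how false scrams are reduced without losing availability.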

  8. Automatic Battery Swap System for Home Robots

    Directory of Open Access Journals (Sweden)

    Juan Wu

    2012-12-01

    Full Text Available This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off-line recharging and on-line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm-sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap station wirelessly through ZigBee. The influences of battery case deflection and robot docking deflection on the battery swap operations have been investigated. The experimental results show that it takes an average time of 84.2 s to complete the battery swap operations. The home robot does not have to wait several hours for the batteries to be fully charged. The proposed battery swap system proves efficient in home robot applications that need the robots to work continuously over a long period.

  9. PLC Based Automatic Multistoried Car Parking System

    Directory of Open Access Journals (Sweden)

    Swanand S. Vaze

    2014-12-01

    Full Text Available This project work presents the study and design of a PLC-based automatic multistoried car parking system. Multistoried car parking is an arrangement used to park a large number of vehicles in the least possible space. Implementing such an arrangement at full scale requires highly advanced instrumentation. In this project a prototype of such a model is made, designed to accommodate twelve cars at a time. Availability of parking space is detected by an optical proximity sensor placed on each pallet. A motor-controlled elevator is used to lift the cars. Elevator status is indicated by an LED placed on the ground floor. Control of the platforms and checking of vacancies are done by the PLC. For retrieving a car, a keyboard is interfaced with the model for selection of the required platform. Automation is done to reduce the space requirement and also to reduce human errors, which in turn results in the highest security and greatest flexibility. Due to these advantages, this system can be used in hotels, railway stations and airports where car crowding is greatest.

  10. Automatic Traffic Monitoring from an Airborne Wide Angle Camera System

    OpenAIRE

    Rosenbaum, Dominik; Charmette, Baptiste; Kurz, Franz; Suri, Sahil; Thomas, Ulrike; Reinartz, Peter

    2008-01-01

    We present an automatic traffic monitoring approach using data of an airborne wide angle camera system. This camera, namely the "3K-Camera", was recently developed at the German Aerospace Center (DLR). It has a coverage of 8 km perpendicular to the flight direction at a flight height of 3000 m, with a resolution of 45 cm, and is capable of taking images at a frame rate of up to 3 fps. Based on georeferenced images obtained from this camera system, a near real-time processing chain containing roa...

  11. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical and experimental study of the effects of an automatic shading device on the energetic performance of a dimmable lighting system and a cooling system. Some equations related to fenestration optical and thermal properties are rebuilt, while others are newly derived under a theoretical approach. In order to collect field data, the energy demand, among other variables, was measured in two distinct stories of the Test Tower with the same fenestration features. New data were gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the evaluation of the energetic performance of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  12. Automatic Arabic Hand Written Text Recognition System

    Directory of Open Access Journals (Sweden)

    I. A. Jannoud

    2007-01-01

    Full Text Available Despite the considerable development of pattern recognition science and its applications in the last decade of the twentieth century and in this century, text recognition remains one of the most important problems in pattern recognition. To the best of our knowledge, little work has been done in the area of Arabic text recognition compared with that for Latin, Chinese and Japanese text. The main difficulty encountered when dealing with Arabic text is the cursive nature of Arabic writing in both printed and handwritten forms. An Automatic Arabic Hand-Written Text Recognition (AHTR) system is proposed. An efficient segmentation stage is required in order to divide a cursive word or sub-word into its constituent characters. After a word has been extracted from the scanned image, it is thinned and its baseline is calculated by analysis of the horizontal density histogram. The pattern is then followed along the baseline and the segmentation points are detected. Thus, after the segmentation stage, the cursive word is represented by a sequence of isolated characters, and the recognition problem reduces to that of classifying each character. A set of features is extracted from each individual character, and a minimum distance classifier is used. Some approaches for processing the characters and a post-processing step are added to enhance the results. Recognized characters are appended directly to an editable word file.
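    The final classification step, a minimum distance classifier over character features, can be sketched as follows; the feature set itself is not specified in the abstract, so the vectors and class names here are placeholders:

```python
import numpy as np

def train_prototypes(features, labels):
    """One prototype (mean feature vector) per character class."""
    return {lab: np.mean([f for f, l in zip(features, labels) if l == lab], axis=0)
            for lab in set(labels)}

def classify(feature, prototypes):
    """Minimum-distance rule: assign the class whose prototype is nearest
    in Euclidean distance to the segmented character's feature vector."""
    feature = np.asarray(feature, float)
    return min(prototypes, key=lambda lab: np.linalg.norm(feature - prototypes[lab]))
```

Each isolated character produced by the segmentation stage would be mapped to a feature vector and passed through `classify`.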

  13. Data mining based study on quality of water level data of Three Gorges Reservoir Automatic Dispatching System%基于数据挖掘的三峡水库调度自动化系统水位数据质量研究

    Institute of Scientific and Technical Information of China (English)

    杨旭; 刘宇

    2011-01-01

    The Three Gorges Reservoir Automatic Dispatching System is responsible for collecting and analyzing nearly 200 telemetry water level readings, and at present data can be collected and transmitted at intervals of 30 s to 10 min. However, owing to equipment and communication faults, abnormal values occur in the system and data are sometimes missing, which affects data quality. Manually searching for erroneous values in such a mass of data is impractical. To solve this problem, this paper introduces integrity and effectiveness rates to measure data quality and presents a feasibility analysis based on data mining technology, so as to provide a reference for solving similar problems.
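    Illustrative definitions of the two quality rates (integrity and effectiveness) described above; the paper's exact formulas and plausibility bands are assumptions here:

```python
def data_quality(expected_slots, received, valid_range):
    """Integrity = fraction of expected telemetry slots actually received;
    effectiveness = fraction of received water-level values inside a
    plausibility band. Both are illustrative definitions."""
    integrity = len(received) / expected_slots
    lo, hi = valid_range
    valid = [v for v in received if lo <= v <= hi]
    effectiveness = len(valid) / len(received) if received else 0.0
    return integrity, effectiveness
```

For a 10-minute acquisition cycle there are 144 expected slots per day, so a day with 140 received values, two of them implausible, scores 140/144 integrity and 138/140 effectiveness.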

  14. The automatic calibration of Korean VLBI Network data

    CERN Document Server

    Hodgson, Jeffrey A; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-01-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence and hence sensitivity at higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited for automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against VLBI data that have been manually reduced. We find that the pipelined data using the phase transfer produce better results than a manually reduced dataset not using the phase transfer. Additionally, we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.

  15. The Automatic Calibration of Korean VLBI Network Data

    Science.gov (United States)

    Hodgson, Jeffrey A.; Lee, Sang-Sung; Zhao, Guang-Yao; Algaba, Juan-Carlos; Yun, Youngjoo; Jung, Taehyun; Byun, Do-Young

    2016-08-01

    The calibration of Very Long Baseline Interferometry (VLBI) data has long been a time consuming process. The Korean VLBI Network (KVN) is a simple array consisting of three identical antennas. Because four frequencies are observed simultaneously, phase solutions can be transferred from lower frequencies to higher frequencies in order to improve phase coherence and hence sensitivity at higher frequencies. Due to the homogeneous nature of the array, the KVN is also well suited for automatic calibration. In this paper we describe the automatic calibration of single-polarisation KVN data using the KVN Pipeline and compare the results against VLBI data that have been manually reduced. We find that the pipelined data using the phase transfer produce better results than a manually reduced dataset not using the phase transfer. Additionally, we compared the pipeline results with a manually reduced phase-transferred dataset and found the results to be identical.

  16. SABER-School Finance : Data Collection Instrument

    OpenAIRE

    World Bank

    2015-01-01

    The aim of the SABER-school finance initiative is to collect, analyze and disseminate comparable data about education finance systems across countries. SABER-school finance assesses education finance systems along six policy goals: (i) ensuring basic conditions for learning; (ii) monitoring learning conditions and outcomes; (iii) overseeing service delivery; (iv) budgeting with adequate an...

  17. SABER-School Finance: Data Collection Instrument

    Science.gov (United States)

    King, Elizabeth; Patrinos, Harry; Rogers, Halsey

    2015-01-01

    The aim of the SABER-school finance initiative is to collect, analyze and disseminate comparable data about education finance systems across countries. SABER-school finance assesses education finance systems along six policy goals: (i) ensuring basic conditions for learning; (ii) monitoring learning conditions and outcomes; (iii) overseeing…

  18. Simple Approaches to Improve the Automatic Inventory of ZEBRA Crossing from Mls Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    City management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling becomes highly relevant when the city has to be managed making use of geospatial databases or Geographic Information Systems. On the other hand, laser scanning technology has experienced significant growth in recent years; in particular, terrestrial mobile laser scanning platforms are being used more and more for inventory purposes in both cities and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, these data are not directly exploitable by management systems, which constrains the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossings. The algorithm is divided into three main steps: road segmentation (based on a PCA analysis of the points contained in each cycle collected by a mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough Transform and logical constraints for line classification). After evaluating different datasets collected in three cities located in Northwest Spain (comprising 25 strips with 30 visible zebra crossings), a completeness of 83% was achieved.
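    The third step, zebra crossing detection with the Hough Transform, can be illustrated with a minimal accumulator-voting implementation on a binary raster; the paper's intensity colouring and logical constraints are omitted here:

```python
import numpy as np

def hough_lines(binary, n_theta=180, top_k=2):
    """Minimal Hough transform for straight lines in a binary raster: every
    'on' pixel votes for all (theta, rho) cells it could lie on, and the
    strongest accumulator peaks give candidate stripe borders."""
    ys, xs = np.nonzero(binary)
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(*binary.shape)))
    acc = np.zeros((n_theta, 2 * diag + 1), dtype=int)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices stay positive.
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[np.arange(n_theta), rhos] += 1
    flat_peaks = np.argsort(acc.ravel())[::-1][:top_k]
    t_idx, r_idx = np.unravel_index(flat_peaks, acc.shape)
    return [(float(np.rad2deg(thetas[t])), int(r) - diag) for t, r in zip(t_idx, r_idx)]
```

The regularly spaced, parallel peak lines returned by the accumulator are what the logical constraints would then test for zebra-stripe geometry.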

  19. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    OpenAIRE

    E. G. Parmehr; Zhang, C.; C. S. Fraser

    2012-01-01

    Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imager...
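    The MI score that such registration methods maximise can be computed from the joint grey-level histogram of the two images. A plain-MI sketch (the paper's General Weighted MI adds a weighting not reproduced here):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two equally sized images, estimated from
    their joint grey-level histogram; registration search maximises this
    score over candidate alignments."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # zero-probability cells contribute nothing
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

An image is maximally informative about itself, so MI peaks when the two inputs are aligned; local maxima elsewhere are exactly what the GWMI weighting is designed to suppress.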

  20. Measuring Service Reliability Using Automatic Vehicle Location Data

    Directory of Open Access Journals (Sweden)

    Zhenliang Ma

    2014-01-01

    Full Text Available Bus service reliability has become a major concern for both operators and passengers. Buffer time measures are believed to be appropriate for approximating passengers' experienced reliability in the context of departure planning. Two issues with regard to buffer time estimation are addressed, namely, performance disaggregation and capturing passengers' perspectives on reliability. A Gaussian mixture model based method is applied to disaggregate the performance data. Based on the mixture distribution, a reliability buffer time (RBT) measure is proposed from the passengers' perspective. A set of expected reliability buffer time measures is developed for operators by using combinations of RBTs at different spatial-temporal levels. The average and the latest trip duration measures are proposed for passengers, which can be used to choose a service mode and determine the departure time. Using empirical data from the automatic vehicle location system in Brisbane, Australia, the existence of mixture service states is verified and the advantage of the mixture distribution model in fitting travel time profiles is demonstrated. Numerical experiments validate that the proposed reliability measure is capable of quantifying service reliability consistently, while the conventional ones may provide inconsistent results. Potential applications for operators and passengers are also illustrated, including reliability improvement and trip planning.
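    An empirical-percentile version of the buffer time conveys the idea; note the paper defines its RBT on a fitted Gaussian mixture distribution, whereas this sketch uses raw percentiles of AVL-derived travel times:

```python
import numpy as np

def reliability_buffer_time(travel_times, percentile=95):
    """Buffer-time style measure: the extra time a passenger must budget over
    the median trip to arrive on time `percentile`% of the time, computed
    directly from observed travel times (an empirical simplification of the
    paper's mixture-distribution RBT)."""
    t = np.asarray(travel_times, float)
    return float(np.percentile(t, percentile) - np.percentile(t, 50))
```

A perfectly regular service has zero buffer time; the wider the upper tail of the travel time distribution, the more slack the measure reports.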

  1. A General Method for Module Automatic Testing in Avionics Systems

    Directory of Open Access Journals (Sweden)

    Li Ma

    2013-05-01

    Full Text Available Traditional Automatic Test Equipment (ATE) systems are insufficient to cope with the challenges of testing more and more complex avionics systems. In this study, we propose a general method for module automatic testing in an avionics test platform based on the PXI bus. We apply virtual instrument technology to realize automatic testing and fault reporting of signal performance. Taking the avionics bus ARINC429 as an example, we introduce the architecture of the automatic test system as well as the implementation of the algorithms in LabVIEW. Comprehensive experiments show that the proposed method can effectively accomplish automatic testing and fault reporting of signal performance, greatly improving the generality and reliability of ATE in avionics systems.

  2. Monitoring, analysis and classification of vegetation and soil data collected by a small and lightweight hyperspectral imaging system

    Science.gov (United States)

    Mönnig, Carsten

    2014-05-01

    The increasing precision of modern farming systems requires near-real-time monitoring of agricultural crops in order to estimate soil condition, plant health and potential crop yield. For large agricultural plots, satellite imagery or aerial surveys can be used, at considerable cost and with possible time delays of days or even weeks. For small to medium sized plots, however, these monitoring approaches are cost-prohibitive and difficult to assess. Therefore, within the INTERREG IV A-Project SMART INSPECTORS (Smart Aerial Test Rigs with Infrared Spectrometers and Radar), we propose a cost-effective, comparatively simple approach to support farmers with a small and lightweight hyperspectral imaging system that collects remotely sensed data in spectral bands between 400 and 1700 nm. SMART INSPECTORS covers the whole small-scale remote sensing processing chain, from sensor construction through data processing to ground truthing for analysis of the results. The sensors are mounted on a remotely controlled (RC) Octocopter, a fixed-wing RC airplane, as well as on a two-seated Autogyro for larger plots. The high resolution images, with up to 5 cm resolution on the ground, include spectra of visible light, near and thermal infrared, as well as hyperspectral imagery. The data will be analyzed using remote sensing software and a Geographic Information System (GIS). The soil condition analysis includes soil humidity, temperature and roughness. Furthermore, a radar sensor is envisaged for geomorphological, drainage and soil-plant roughness investigations. Plant health control includes drought stress, vegetation health, pest control, growth condition and canopy temperature. Different vegetation and soil indices will help to determine and understand soil conditions and plant traits. Additional investigations might include crop yield estimation of certain crops like apples, strawberries, pasture land, etc. The quality of remotely sensed vegetation data will be tested with
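    As an example of the vegetation indices mentioned, NDVI can be derived from a near-infrared and a red band; whether SMART INSPECTORS uses exactly this index is not stated, so this is illustrative:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from a near-infrared and a red
    reflectance band, a typical index derivable from 400-1700 nm data to
    gauge vegetation health. Healthy vegetation reflects strongly in NIR and
    absorbs red, pushing the index towards 1."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / np.maximum(nir + red, 1e-12)
```

The same two-band pattern extends to the other soil and vegetation indices the abstract alludes to.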

  3. Review of Developments in Electronic, Clinical Data Collection, and Documentation Systems over the Last Decade - Are We Ready for Big Data in Routine Health Care?

    Science.gov (United States)

    Kessel, Kerstin A; Combs, Stephanie E

    2016-01-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors, including imaging, molecular or pathological markers, surgical results, and patient's preference. In this context, the term "Big Data" evolved also in health care. The "hype" is heavily discussed in the literature. In interdisciplinary medical specialties, such as radiation oncology, a heterogeneous and voluminous amount of data must not only be evaluated but is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the "three V's": volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or postprocessing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important, and economically viable field of application. PMID:27066456

  4. Review of developments in electronic, clinical data collection and documentation systems over the last decade – Are we ready for Big Data in routine health care?

    Directory of Open Access Journals (Sweden)

    Kerstin Anne Kessel

    2016-03-01

    Full Text Available Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors including imaging, molecular or pathological markers, surgical results and the patient's preference. In this context the term Big Data evolved also in health care. The hype is heavily discussed in the literature. In interdisciplinary medical specialties, such as radiation oncology, a heterogeneous and voluminous amount of data must not only be evaluated but is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data - the three V's: volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or post-processing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important and economically viable field of application.

  5. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common in laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0-2.5 micron or 5000-4000 cm-1) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.
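
    The band-discovery step can be illustrated with a minimal sketch (not the paper's algorithm): assume a continuum-removed spectrum, scan a grid of candidate band centres and widths, and at each candidate solve for the depth of a symmetric Gaussian band by linear least squares.

```python
import numpy as np

# Illustrative sketch, not the published method: grid-search the centre and
# width of a symmetric Gaussian absorption band in a continuum-removed
# spectrum; for a fixed centre/width the band depth follows by linear
# least squares, so only the nonlinear parameters are searched.
def fit_band(wavelengths, spectrum, centers, widths):
    best = None
    for c in centers:
        for w in widths:
            basis = np.exp(-0.5 * ((wavelengths - c) / w) ** 2)
            depth = np.dot(basis, 1.0 - spectrum) / np.dot(basis, basis)
            resid = np.sum((1.0 - spectrum - depth * basis) ** 2)
            if best is None or resid < best[0]:
                best = (resid, c, w, depth)
    return best[1], best[2], best[3]    # centre, width, depth

wl = np.linspace(2.0, 2.5, 200)         # SWIR range in microns
spec = 1.0 - 0.3 * np.exp(-0.5 * ((wl - 2.20) / 0.02) ** 2)
c, w, d = fit_band(wl, spec, np.linspace(2.1, 2.3, 41), [0.01, 0.02, 0.04])
print(round(c, 2), round(d, 2))  # recovers the band near 2.2 microns, depth 0.3
```

    With a finer grid or several bands fitted jointly, the same idea extends to separating overlapping absorptions; the 2.20 micron centre here merely stands in for the Al-OH white mica feature mentioned in the abstract.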

  6. Automatic sample changer and microprocessor controlled data router for a small bulk-sample counter

    International Nuclear Information System (INIS)

    We have designed a gamma-ray counting system for small bulk-samples that incorporates an automatic sample-changer and multiple data-output device. The system includes an inexpensive microprocessor and is constructed mainly of materials and equipment commonly available at most institutions engaged in nuclear research

  7. Automatic neutron PSD transmission from a process computer to a timeshare system

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, J.B.; Sides, W.H. Jr.

    1977-04-01

    A method for automatically telephoning, connecting, and transmitting neutron power-spectral density data from a CDC-1700 process control computer to a PDP-10 time-share system is described. Detailed program listings and block diagrams are included.

  8. Automatic neutron PSD transmission from a process computer to a timeshare system

    International Nuclear Information System (INIS)

    A method for automatically telephoning, connecting, and transmitting neutron power-spectral density data from a CDC-1700 process control computer to a PDP-10 time-share system is described. Detailed program listings and block diagrams are included

  9. Automatic control system design of laser interferometer

    Science.gov (United States)

    Lu, Qingjie; Li, Chunjie; Sun, Hao; Ren, Shaohua; Han, Sen

    2015-10-01

    Traditional optical adjustment in interferometry has many shortcomings, such as low accuracy, time-consuming and labor-intensive operation, poor controllability, and bad repeatability, so we address the problem with a wireless remote control system. Compared with the traditional method, the effects of vibration and air turbulence are avoided. In addition, the system has the advantages of low cost, high reliability, easy operation, etc. Furthermore, switching between two charge-coupled devices (CCDs), which are used to collect different images, is easily achieved with this wireless remote control system. The wireless transmission is achieved by using a Radio Frequency (RF) module and programming the controller; pulse width modulation (PWM) of the direct current (DC) motor, real-time switching of the relay, and high-accuracy displacement control of the FAULHABER motor are available. The results of verification tests show that the control system has good stability with a packet loss rate below 5%, high control accuracy and millisecond response speed.

  10. Automatic diagnostic methods of nuclear reactor collected signals

    International Nuclear Information System (INIS)

    This work is the first phase of an overall study of diagnosis, limited here to problems of monitoring the operating state; this shows what pattern recognition methods bring at the processing level. The present problem is the verification of control operations. The analysis of the state of the reactor yields a decision that is compared with the history of the control operations; if there is no correspondence, the state under analysis is declared 'abnormal'. The system under analysis is described and the problem to be solved is defined. The Gaussian parametric approach and methods to evaluate the error probability are treated first. Non-parametric methods are then addressed, and an on-line detection scheme has been tested experimentally. Finally, a non-linear transformation has been studied to reduce the error probability obtained previously. All the methods presented have been tested and compared using a quality index: the error probability

  11. Review of Developments in Electronic, Clinical Data Collection, and Documentation Systems over the Last Decade – Are We Ready for Big Data in Routine Health Care?

    Science.gov (United States)

    Kessel, Kerstin A.; Combs, Stephanie E.

    2016-01-01

    Recently, information availability has become more elaborate and widespread, and treatment decisions are based on a multitude of factors, including imaging, molecular or pathological markers, surgical results, and patient’s preference. In this context, the term “Big Data” evolved also in health care. The “hype” is heavily discussed in the literature. In interdisciplinary medical specialties, such as radiation oncology, not only must a heterogeneous and voluminous amount of data be evaluated, but it is also spread in different styles across various information systems. Exactly this problem is also referred to in many ongoing discussions about Big Data – the “three V’s”: volume, velocity, and variety. We reviewed 895 articles extracted from the NCBI databases about current developments in electronic clinical data management systems and their further analysis or postprocessing procedures. Few articles show first ideas and ways to immediately make use of collected data, particularly imaging data. Many developments can be noticed in the field of clinical trial or analysis documentation, mobile devices for documentation, and genomics research. Using Big Data to advance medical research is definitely on the rise. Health care is perhaps the most comprehensive, important, and economically viable field of application. PMID:27066456

  12. Automatic early warning systems for the environment

    International Nuclear Information System (INIS)

    Computerized, continuously monitoring environmental early warning systems are complex networks that merge measurements with information technology. Accuracy, consistency, reliability and data quality are their most important features. Several effects may degrade their performance: a hostile environment, unreliable communications, poor quality of equipment, and unqualified users or service personnel. According to our experience, a number of measures should be taken to enhance system performance and to maintain it at the desired level. In this paper, we present an analysis of system requirements, possible disturbances and corrective measures that give the main directives for the design, construction and operation of environmental early warning systems. Procedures which ensure data integrity and quality are mentioned. Finally, a contemporary system approach based on a LAN/WAN network topology with Intranet/Internet software is proposed, together with case descriptions of two already operating systems based on this networked principle. (author)

  13. Automatic removal of outliers in hydrologic time series and quality control of rainfall data: processing a real-time database of the Local System for Flood Monitoring in Klodzko County, Poland

    Science.gov (United States)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kryza, Maciej; Szymanowski, Mariusz

    2013-04-01

    Real-time hydrological forecasting requires the highest quality of both hydrologic and meteorological data collected in a given river basin. Large outliers may lead to inaccurate predictions, with substantial departures between observations and prognoses appearing even in the short term. Although we need the correctness of both riverflow and rainfall data, they cannot be processed in the same way to produce a filtered output. Indeed, a hydrologic time series at a given gauge can be interpolated in the time domain after suspicious values have been detected, provided that no outlier has been detected at the upstream sites. In the case of rainfall data, interpolation is not suitable, as we cannot verify potential outliers at a given site against data from other sites, especially in complex terrain. This is because very local convective events may occur, leading to large rainfall peaks over a limited area. Hence, instead of interpolating the data, we rather perform a flagging procedure that only ranks outliers according to their likelihood of occurrence. Following the aforementioned assumptions, we have developed a few modules that serve the purpose of fully automated correction of a database that is updated in real time every 15 minutes, and the main objective of the work was to produce a high-quality database for the purposes of hydrologic rainfall-runoff modeling and ensemble prediction. The database in question is available courtesy of the County Office in Kłodzko (SW Poland), the institution which owns and maintains the Local System for Flood Monitoring in Kłodzko County. The dedicated prediction system, known as HydroProg, is now being built at the University of Wrocław (Poland). Like the entire prediction system, the correction modules work automatically in real time and are developed in the R language. They are plugged in to a larger IT infrastructure. Hydrologic time series, which are water levels recorded every 15 minutes at 22 gauges located in Kłodzko County, are
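
    A flagging step of the kind described, which ranks rather than removes suspicious rainfall values, can be sketched as follows. This is an illustrative Hampel-style filter in Python, not the HydroProg R module; the window size and the threshold of 5 robust units are assumptions.

```python
import numpy as np

# Score each 15-min rainfall value by its deviation from a rolling median,
# expressed in robust (MAD) units; flag high scores instead of interpolating.
def hampel_scores(x, half_window=3):
    x = np.asarray(x, dtype=float)
    scores = np.zeros_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - half_window), min(len(x), i + half_window + 1)
        window = x[lo:hi]
        med = np.median(window)
        mad = np.median(np.abs(window - med)) or 1e-9  # guard against zero MAD
        scores[i] = abs(x[i] - med) / (1.4826 * mad)
    return scores

rain = [0.0, 0.2, 0.1, 0.0, 55.0, 0.1, 0.3, 0.0]  # a 55 mm spike is suspicious
scores = hampel_scores(rain)
flagged = [i for i, s in enumerate(scores) if s > 5]
print(flagged)  # → [4]
```

    Ranking by score, rather than thresholding and deleting, preserves genuine convective peaks for later expert review.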

  14. MAD data collection - current trends

    International Nuclear Information System (INIS)

    The multi-wavelength anomalous diffraction, or MAD, method of determining protein structure is becoming routine in protein crystallography. An increase in the number of tunable synchrotron beamlines, coupled with the widespread availability of fast-readout position-sensitive X-ray detectors based on charge-coupled devices, has raised MAD structure determination to a new and exciting level. Ultra-fast MAD data collection is now possible. Recognition of the value of selenium for phasing protein structures and improvement of methods for incorporating selenium into proteins in the form of selenomethionine have attracted greater interest in the MAD method. Recent developments in crystallographic software are complementing the above advances, paving the way for rapid protein structure determination. An overview of a typical MAD experiment is described here, with emphasis on the rates and quality of data acquisition now achievable at beamlines developed at third-generation synchrotron sources

  15. Reconstruction of the sea surface elevation from the analysis of the data collected by a wave radar system

    Science.gov (United States)

    Ludeno, Giovanni; Soldovieri, Francesco; Serafino, Francesco; Lugni, Claudio; Fucile, Fabio; Bulian, Gabriele

    2016-04-01

    X-band radar systems are able to provide information about the direction and intensity of sea surface currents and dominant waves within a range of a few kilometers from the observation point (up to 3 nautical miles). This capability, together with their flexibility and low cost, makes these devices useful tools for monitoring coastal and off-shore areas. The data collected by a wave radar system can be analyzed using the inversion strategy presented in [1,2] to estimate the following sea parameters: peak wave direction; peak period; peak wavelength; significant wave height; sea surface current and bathymetry. The estimation of the significant wave height is a limitation of the wave radar system, because the radar backscatter is not directly related to the sea surface elevation. Consequently, substantial research has recently been carried out to estimate significant wave height from radar images, either with or without calibration against in-situ measurements. In this work, we present two alternative approaches for the reconstruction of the sea surface elevation from wave radar images. The first approach is based on an approximated version of the modulation transfer function (MTF), tuned from a series of numerical simulations along the lines of [3]. The second approach is based on the inversion of radar images using a direct regularised least squares technique: assuming a linearised model for the tilt modulation, the sea elevation is reconstructed as a least squares fit of the radar imaging data [4]. References [1] F. Serafino, C. Lugni, and F. Soldovieri, "A novel strategy for the surface current determination from marine X-band radar data," IEEE Geosci. Remote Sens. Lett., vol. 7, no. 2, pp. 231-235, Apr. 2010. [2] Ludeno, G., Brandini, C., Lugni, C., Arturi, D., Natale, A., Soldovieri, F., Serafino, F. (2014). Remocean System for the Detection of the Reflected Waves from the Costa
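
    The regularised least squares idea can be caricatured in a few lines: with a linear observation model y = A x for the tilt modulation, the elevation estimate solves (A^T A + lam*I) x = A^T y. The operator A, the noise level and the regularisation weight below are all illustrative assumptions, not the authors' MTF or radar data.

```python
import numpy as np

# Sketch of a direct Tikhonov-regularised least squares inversion on a
# synthetic 1-D elevation profile (all quantities illustrative).
rng = np.random.default_rng(0)
n = 50
x_true = np.sin(np.linspace(0, 4 * np.pi, n))           # "true" surface elevation
A = 0.8 * np.eye(n) + np.diag(np.full(n - 1, 0.2), 1)   # toy tilt-modulation operator
y = A @ x_true + 0.01 * rng.standard_normal(n)          # simulated radar observation

lam = 1e-3                                              # regularisation weight
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
corr = float(np.corrcoef(x_true, x_hat)[0, 1])
print(corr > 0.95)  # the regularised fit tracks the true elevation closely
```

    The regularisation term keeps the inversion stable when A is poorly conditioned or the image is noisy, which is the practical motivation for preferring it over a plain pseudoinverse.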

  16. Testing of the AP600 automatic depressurization system

    International Nuclear Information System (INIS)

    The Automatic Depressurization System (ADS) of the Westinghouse AP600 reactor will be used to provide controlled depressurization of the reactor coolant system (RCS). This will, in turn, allow the initiation and long-term operation of gravity-driven cooling flow in the RCS. ADS tests were conducted at the VAPORE test facility in Casaccia, Italy through a Technical Cooperation Agreement between Westinghouse, ENEA, SOPREN/ANSALDO, and ENEL to produce data for the development and verification of computer codes to simulate the system. The test program also provided insights about the operation of valves supplied from various vendors that could be used in the AP600 ADS. The data gathered from the tests showed the ability of the ADS design to fulfill its function over the range of conditions expected in the AP600. The tests also demonstrated the abilities of gate and globe valves from several vendors to initiate and terminate an ADS blowdown as could be required in the AP600

  17. Automatic Discovery of Non-Compositional Compounds in Parallel Data

    CERN Document Server

    Melamed, I D

    1997-01-01

    Automatic segmentation of text into minimal content-bearing units is an unsolved problem even for languages like English. Spaces between words offer an easy first approximation, but this approximation is not good enough for machine translation (MT), where many word sequences are not translated word-for-word. This paper presents an efficient automatic method for discovering sequences of words that are translated as a unit. The method proceeds by comparing pairs of statistical translation models induced from parallel texts in two languages. It can discover hundreds of non-compositional compounds on each iteration, and constructs longer compounds out of shorter ones. Objective evaluation on a simple machine translation task has shown the method's potential to improve the quality of MT output. The method makes few assumptions about the data, so it can be applied to parallel data other than parallel texts, such as word spellings and pronunciations.

  18. Data collection architecture for big data - A framework for a research agenda

    NARCIS (Netherlands)

    Hofman, W.J.

    2015-01-01

    As big data is expected to contribute largely to economic growth, scalability of solutions becomes apparent for deployment by organisations. It requires automatic collection and processing of large, heterogeneous data sets of a variety of resources, dealing with various aspects like improving qualit

  19. Automatic Road Centerline Extraction from Imagery Using Road GPS Data

    OpenAIRE

    Chuqing Cao; Ying Sun

    2014-01-01

    Road centerline extraction from imagery constitutes a key element in numerous geospatial applications, which has been addressed through a variety of approaches. However, most of the existing methods are not capable of dealing with challenges such as different road shapes, complex scenes, and variable resolutions. This paper presents a novel method for road centerline extraction from imagery in a fully automatic approach that addresses the aforementioned challenges by exploiting road GPS data....

  20. Automatic Classification of Seafloor Image Data by Geospatial Texture Descriptors

    OpenAIRE

    Lüdtke, Andree

    2014-01-01

    A novel approach for automatic context-sensitive classification of spatially distributed image data is introduced. The proposed method targets applications of seafloor habitat mapping but is generally not limited to this domain or use case. Spatial context information is incorporated in a two-stage classification process, where in the second step a new descriptor for patterns of feature class occurrence according to a generically defined classification scheme is applied. The method is based o...

  1. Automatic Generation of Thematically Focused Information Portals from Web Data

    OpenAIRE

    Sizov, Sergej

    2005-01-01

    Finding the desired information on the Web is often a hard and time-consuming task. This thesis presents the methodology of automatic generation of thematically focused portals from Web data. The key component of the proposed Web retrieval framework is the thematically focused Web crawler that is interested only in a specific, typically small, set of topics. The focused crawler uses classification methods for filtering of fetched documents and identifying most likely relevant Web source...

  2. Design of Combat System Calibration Data Collection and Analysis System%作战系统标校数据采集与分析系统设计

    Institute of Scientific and Technical Information of China (English)

    许晓华; 李鹏

    2015-01-01

    The calibration data of a naval vessel combat system reflect the combat system's function and the technology state of each system equipment. Given that combat system calibration data are currently not effectively collected and analysed, a system for calibration data collection and analysis is designed. This paper establishes a calibration data entry specification, designs data analysis applications, and analyses and processes the data. Analysis results show that this system provides strong support for the management and application of the combat system's calibration data, and provides references for the relevant decision makers concerned with equipment maintenance and development.

  3. Automatic digital photo-book making system

    Science.gov (United States)

    Wang, Wiley; Teo, Patrick; Muzzolini, Russ

    2010-02-01

    The diversity of photo products has grown more than ever before. A group of photos is not only printed individually, but can also be arranged in a specific order to tell a story, such as in a photo book, a calendar or a poster collage. Similar to making a traditional scrapbook, digital photo book tools allow the user to choose a book style/theme, page layouts, backgrounds and the way the pictures are arranged. This process is often time-consuming for users, given the number of images and the choices of layout/background combinations. In this paper, we developed a system to automatically generate photo books with only a few initial selections required. The system utilizes time stamps, color indices, orientations and other image properties to best fit pictures into a final photo book. The common way of telling a story is to lay the pictures out in chronological order. Pictures that are proximate in time are often logically related, and so they naturally cluster along a time line. Breaks between clusters can be used as a guide to separate pages or spreads; thus, pictures that are logically related can stay close on the same page or spread. When people make a photo book, it is helpful to start with chronologically grouped images, but time alone won't be enough to complete the process. Each page is limited by the number of layouts available. Many aesthetic rules also apply, such as emphasis of preferred pictures, consistency of local image density throughout the whole book, matching a background to the content of the images, and variety of adjacent page layouts. We developed an algorithm to group images onto pages under the constraints of aesthetic rules. We also apply content analysis based on the color and blurriness of each picture to match backgrounds and to adjust page layouts. Some of our aesthetic rules are fixed and given by designers. Other aesthetic rules are statistical models trained by using
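
    The chronological grouping step described above can be sketched as follows; the two-hour gap threshold is an illustrative assumption, not a value from the paper.

```python
from datetime import datetime, timedelta

# Illustrative sketch (not the production system): cluster photos along the
# time line and start a new page group wherever the gap between consecutive
# shots exceeds a threshold.
def cluster_by_time(timestamps, gap=timedelta(hours=2)):
    groups, current = [], [timestamps[0]]
    for prev, ts in zip(timestamps, timestamps[1:]):
        if ts - prev > gap:
            groups.append(current)
            current = []
        current.append(ts)
    groups.append(current)
    return groups

shots = [datetime(2010, 2, 1, h, m) for h, m in
         [(9, 0), (9, 5), (9, 40), (14, 0), (14, 10), (20, 30)]]
pages = cluster_by_time(shots)
print([len(p) for p in pages])  # → [3, 2, 1]
```

    In the full system each such group would then be packed onto a page or spread subject to the layout and aesthetic constraints the abstract lists.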

  4. 15 CFR 990.43 - Data collection.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Data collection. 990.43 Section 990.43... DAMAGE ASSESSMENTS Preassessment Phase § 990.43 Data collection. Trustees may conduct data collection and analyses that are reasonably related to Preassessment Phase activities. Data collection and analysis...

  5. Automatic lesion tracking for a PET/CT based computer aided cancer therapy monitoring system

    Science.gov (United States)

    Opfer, Roland; Brenner, Winfried; Carlsen, Ingwer; Renisch, Steffen; Sabczynski, Jörg; Wiemker, Rafael

    2008-03-01

    Response assessment of cancer therapy is a crucial component of a more effective and patient-individualized cancer therapy. Integrated PET/CT systems provide the opportunity to combine morphologic with functional information. However, dealing simultaneously with several PET/CT scans poses a serious workflow problem. It can be a difficult and tedious task to extract response criteria based upon an integrated analysis of PET and CT images and to track these criteria over time. In order to improve the workflow for serial analysis of PET/CT scans, we introduce in this paper a fast lesion tracking algorithm. We combine a global multi-resolution rigid registration algorithm with a local block matching and a local region growing algorithm. Whenever the user clicks on a lesion in the baseline PET scan, the course of standardized uptake values (SUV) is automatically identified and shown to the user as a graph plot. We have validated our method on data collected from 7 patients. Each patient underwent two or three PET/CT scans during the course of a cancer therapy. An experienced nuclear medicine physician manually measured the courses of the maximum SUVs for altogether 18 lesions. The automatic detection of the corresponding lesions resulted in SUV measurements that are nearly identical to the manually measured SUVs: between the 38 maximum SUVs derived from manually and automatically detected lesions, we observed a correlation of 0.9994 and an average error of 0.4 SUV units.
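
    A toy version of the tracking idea, reduced to its last step: once registration has brought the scans into a common frame, the maximum SUV is read from a small block around the clicked voxel in each follow-up volume. Volume sizes, the block radius and the synthetic "lesion" below are all illustrative assumptions.

```python
import numpy as np

# Toy sketch: report SUVmax over time from a small block around the clicked
# voxel in each (already registered) follow-up PET volume.
def track_suv_max(volumes, seed, half=2):
    z, y, x = seed
    return [float(v[z - half:z + half + 1,
                    y - half:y + half + 1,
                    x - half:x + half + 1].max()) for v in volumes]

rng = np.random.default_rng(1)
scans = [rng.random((16, 16, 16)) for _ in range(3)]   # background uptake < 1
for i, s in enumerate(scans):
    s[8, 8, 8] = 5.0 + i                               # lesion uptake rising per scan
course = track_suv_max(scans, seed=(8, 8, 8))
print(course)  # → [5.0, 6.0, 7.0]
```

    The published method refines the block position with local block matching and region growing before taking the maximum; the sketch omits those steps.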

  6. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same

  7. Research on Automatic Target Tracking Based on PTZ System

    Directory of Open Access Journals (Sweden)

    Ni Zhang

    2012-11-01

    Full Text Available This paper studies an algorithm for automatic target tracking based on a PTZ system. The tracking target is selected and its motion trajectory set up in the video screen. Along the motion trajectory, the system controls the PTZ rotation automatically to track the target in real time. At the same time, it adjusts the zoom, enlarging or reducing, to make sure the target is displayed clearly at the center of the video screen and at a suitable size. Tests on groups of videos verify the effectiveness of the automatic target tracking algorithm.
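
    One iteration of such a tracking loop can be sketched as a proportional controller that pans and tilts the camera toward the image centre and zooms toward a desired target size. The frame size, gains and sign conventions below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of one PTZ control step (all names and gains illustrative).
def ptz_step(box_center, box_size, frame=(640, 480),
             target_size=120, k_pan=0.01, k_zoom=0.005):
    dx = box_center[0] - frame[0] / 2   # +: target right of centre -> pan right
    dy = box_center[1] - frame[1] / 2   # +: target below centre  -> tilt down
    dz = target_size - box_size         # +: target too small     -> zoom in
    return k_pan * dx, k_pan * dy, k_zoom * dz

pan, tilt, zoom = ptz_step(box_center=(420, 180), box_size=80)
print(round(pan, 2), round(tilt, 2), round(zoom, 2))  # → 1.0 -0.6 0.2
```

    Repeating this step on every frame drives the detected bounding box toward the screen centre at the desired size, which is exactly the behaviour the abstract describes.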

  8. The Lick-Gaertner automatic measuring system

    Science.gov (United States)

    Vasilevskis, S.; Popov, W. A.

    1971-01-01

    The Lick-Gaertner automatic equipment has been designed mainly for the measurement of stellar proper motions with reference to galaxies, and consists of two main components: the survey machine and the automatic measuring engine. The survey machine is used for initial inspection and selection of objects for subsequent measurement. Two plates, up to 17 x 17 inches each, are surveyed simultaneously by means of projection on a screen. The approximate positions of selected objects are measured by two optical screws: helical lines cut through an aluminum coating on glass cylinders. These approximate coordinates, to a precision of the order of 0.03 mm, are transmitted to a card punch by encoders connected to the cylinders.

  9. Collecting data in real time with postcards

    DEFF Research Database (Denmark)

    Yee, Kwang Chien; Kanstrup, Anne Marie; Bertelsen, Pernille;

    2013-01-01

    The success of information technology (IT) in transforming healthcare is often limited by the lack of a clear understanding of the context in which the technology is used. Various methods have been proposed to better understand the healthcare context when designing and implementing Health Information Systems. These methods often involve cross-sectional, retrospective data collection. This paper describes the postcard method for prospective real-time data collection, both in paper format and electronic format. This paper then describes the results obtained using postcard techniques in Denmark and…

  10. The Diagnostic System of A – 604 Automatic Transmission

    Directory of Open Access Journals (Sweden)

    Czaban Jaroslaw

    2014-09-01

    Full Text Available Automatic gearboxes are gaining popularity in Europe. The little interest in diagnosing this type of transmission in Poland results from its small share of the market of operated cars, so special diagnostic devices are not readily available. These factors lead to expensive repairs, often involving replacement of a subassembly with a new or aftermarket one. Only to a small extent are prophylactic diagnostic tests conducted, which could eliminate future gearbox failures. In this paper, a proposed diagnostic system for the popular A-604 gearbox is presented. The authors are exploring the possibility of using this type of device for the functional evaluation of gearboxes after renovation. The built system drives the object under test, connected to a simulated load, where a special controller, replacing the original one, is responsible for controlling gearbox operation. In this way the state of the mechanical and hydraulic parts is evaluated. Analysis of the signal runs registered during measurements allows conclusions about the correctness of operation, whereas comparison with stock data verifies the technical state of the automatic gearbox.
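
    The comparison with stock data can be reduced to a sketch like the following: check whether a registered signal run stays within a relative tolerance band around the reference trace. The tolerance and the pressure values are illustrative, not the authors' data.

```python
# Illustrative check (not the authors' software): compare a measured signal
# run against stock reference data, point by point, within a relative band.
def within_tolerance(measured, reference, tol=0.1):
    return all(abs(m - r) <= tol * abs(r) for m, r in zip(measured, reference))

reference = [2.0, 4.0, 6.0, 6.0, 3.0]   # hypothetical stock line-pressure trace (bar)
measured  = [2.1, 3.9, 6.2, 5.9, 3.1]   # values registered during the test run
print(within_tolerance(measured, reference))  # → True
```

    A production system would compare whole time-aligned traces per gear-shift event and report which channel left its band, but the pass/fail core is this comparison.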

  11. ATLASWatchMan, a tool for automatized data analysis

    International Nuclear Information System (INIS)

    The ATLAS detector will soon start to take data, and many New Physics phenomena are expected. The ATLASWatchMan package has been developed following the principles of CASE (Computer Aided Software Engineering); it helps the user set up any analysis by automatically generating the actual analysis code and data files from user settings. ATLASWatchMan provides a light and transparent framework to plug in user-defined cuts and algorithms and to look at as many channels as the user wants, running the analysis both locally and on the Grid. Examples of analyses run with the package using the latest release of the ATLAS software are shown

  12. Guidelines for Automatic Data Processing Physical Security and Risk Management. Federal Information Processing Standards Publication 31.

    Science.gov (United States)

    National Bureau of Standards (DOC), Washington, DC.

    These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…

  13. An automatic system for acidity determination based on sequential injection titration and the monosegmented flow approach.

    Science.gov (United States)

    Kozak, Joanna; Wójtowicz, Marzena; Gawenda, Nadzieja; Kościelniak, Paweł

    2011-06-15

    An automatic sequential injection system, combining monosegmented flow analysis, sequential injection analysis and sequential injection titration, is proposed for acidity determination. The system enables controllable sample dilution and generation of standards of required concentration in a monosegmented sequential injection manner, sequential injection titration of the prepared solutions, and data collection and handling. It has been tested on spectrophotometric determination of acetic, citric and phosphoric acids with sodium hydroxide used as a titrant and phenolphthalein or thymolphthalein (in the case of phosphoric acid determination) as indicators. Accuracy better than |4.4|% (RE) and repeatability better than 2.9% (RSD) have been obtained. It has been applied to the determination of total acidity in vinegars and various soft drinks. The system provides low sample (less than 0.3 mL) consumption. On average, analysis of a sample takes several minutes. PMID:21641455
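
    The arithmetic behind the titration step the system automates is simple stoichiometry: at the indicator end point, moles of NaOH equal the titratable protons, so c_acid = c_NaOH * V_NaOH / (n_protons * V_sample). A sketch with illustrative numbers (not values from the paper):

```python
# Back-of-the-envelope titration arithmetic; all volumes/concentrations are
# hypothetical examples, n_protons is 1 for acetic, 3 for phosphoric acid.
def acid_concentration(c_naoh, v_naoh_ml, v_sample_ml, n_protons=1):
    return c_naoh * v_naoh_ml / (n_protons * v_sample_ml)

# e.g. 0.25 mL of vinegar sample neutralised by 2.0 mL of 0.1 M NaOH
c = acid_concentration(c_naoh=0.1, v_naoh_ml=2.0, v_sample_ml=0.25)
print(c)  # → 0.8 (mol/L acetic acid)
```

    The monosegmented flow approach contributes the controlled dilution that keeps V_sample small, which is how the sub-0.3 mL consumption quoted in the abstract is achieved.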

  14. Design for Automatic Fire Alarm and Linkage Control System in A Data Center%某数据中心火灾自动报警及联动控制系统设计

    Institute of Scientific and Technical Information of China (English)

    王绍红; 陶悦

    2015-01-01

    A data center is taken as an example to introduce the design of an automatic fire alarm system for a data center, focusing on the arrangement of detectors and linked fire-suppression systems in the building: the data room uses a gas extinguishing system linked to aspirating smoke detectors combined with point smoke detectors, and the shared atrium uses a water cannon extinguishing system linked to dual-band infrared flame detectors. The linkage control procedures for the gas extinguishing system, the high-pressure water mist extinguishing system, and the intelligent large-space extinguishing system are then introduced.

  15. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  16. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with ²³⁸Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  17. Automatic radiation measuring system connected with GPS

    International Nuclear Information System (INIS)

    The most serious nuclear disaster in Japan has broken out at Fukushima Daiichi Nuclear Power Plant due to Great East Japan Earthquake. Prompt and exact mapping of the contamination is of great importance for radiation protection and for the environment restoration. We have developed radiation survey systems KURAMA and KURAMA-2 for rapid and exact measurement of radiation dose distribution. The system is composed of a mobile radiation monitor and the computer in office which is for the storage and visualization of the data. They are connected with internet and are operated for continuous radiation measurement while the monitor is moving. The mobile part consists of a survey meter, an interface to transform the output of the survey meter for the computer, a global positioning system, a computer to process the data for connecting to the network, and a mobile router. Thus they are effective for rapid mapping of the surface contamination. The operation and the performance of the equipment at the site are presented. (J.P.N.)
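The core data-handling step in such a mobile survey system, attaching each dose-rate reading to the nearest-in-time GPS fix before mapping, can be sketched as follows. The function name and data layout are assumptions for illustration, not the KURAMA implementation:

```python
import bisect

def tag_doses_with_position(gps_fixes, dose_readings):
    """Attach the nearest-in-time GPS fix to each dose-rate reading.

    gps_fixes     -- time-sorted list of (unix_time, lat, lon)
    dose_readings -- list of (unix_time, dose_rate)
    Returns (lat, lon, dose_rate) tuples ready for map plotting.
    """
    times = [t for t, _, _ in gps_fixes]
    tagged = []
    for t, dose in dose_readings:
        i = bisect.bisect_left(times, t)
        # pick whichever neighbouring fix is closer in time
        if i == len(times) or (i > 0 and t - times[i - 1] <= times[i] - t):
            i -= 1
        _, lat, lon = gps_fixes[i]
        tagged.append((lat, lon, dose))
    return tagged
```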

  18. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    complicates the analysis instead and contributes to model inadequacy. As such, scatter can be considered as an example of element-wise outliers. However, no straightforward method for identifying the scatter region can be found in the literature. In this paper an automatic scatter identification method is...... input data for three different PARAFAC methods. Firstly inserting missing values in the scatter regions are tested, secondly an interpolation of the scatter regions is performed and finally the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter...

  19. Longline Observer Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — LODS, the Hawaii Longline Observer Data System, is a complete suite of tools designed to collect, process, and manage quality fisheries data and information. Guided...

  20. Development of automatic cross section compilation system for MCNP

    International Nuclear Information System (INIS)

    Development of a code system to automatically convert cross-sections for MCNP is in progress. The NJOY code is generally used to convert data compiled in the ENDF format (Evaluated Nuclear Data Files, BNL) into the cross-section libraries required by various reactor physics codes. Although the cross-section library FSXLIB-J3R2 was already converted from the JENDL-3.2 version of the Japanese Evaluated Nuclear Data Library for the continuous-energy Monte Carlo code MCNP, that library contains only cross-sections at room temperature (300 K). In response to user requests for cross-sections at higher temperatures, say 600 K or 900 K, a code system named 'autonj' is under development to provide MCNP cross-section libraries at arbitrary temperatures. The system can accept JENDL data formats that the NJOY code cannot treat. The input preparation that NJOY requires for every nuclide is greatly reduced by allowing as many nuclides as the user wants to be converted in one execution. A few MCNP runs were performed for verification using the FSXLIB-J3R2 library and the 'autonj' output; nearly identical MCNP results within the statistical errors show that the 'autonj' output library is correct. In FY 1998 the system will be completed, and in FY 1999 the user's manual will be published. (K. Tsuchihashi)

  1. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces, in connection with programming of a CNC lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty, which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...

  2. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas

    OpenAIRE

    Hsien-Tsung Chang; Yi-Ming Chang; Meng-Tze Tsai

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intention...

  3. Automatic graphene transfer system for improved material quality and efficiency

    OpenAIRE

    Alberto Boscá; Jorge Pedrós; Javier Martínez; Tomás Palacios; Fernando Calle

    2015-01-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The proce...

  4. Towards Automatic Music Transcription: Extraction of MIDI-Data out of Polyphonic Piano Music

    Directory of Open Access Journals (Sweden)

    Jens Wellhausen

    2005-06-01

    Full Text Available Driven by the increasing amount of music available electronically, the need for automatic search and retrieval systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications and music analysis. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined to extract the notes played. An algorithm for chord separation based on Independent Subspace Analysis is presented. Finally, the results are used to build a MIDI file.
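A minimal sketch of the first stage, temporal segmentation: flag frames whose short-time energy jumps sharply, which marks likely note onsets. This is a generic energy-based detector under assumed names and thresholds, not the paper's algorithm:

```python
def detect_onsets(samples, frame, threshold_ratio=2.0):
    """Crude note-onset detector.

    Splits `samples` into non-overlapping frames of length `frame` and
    flags the sample index where short-time energy grows by at least
    `threshold_ratio` over the previous frame.
    """
    energies = [sum(s * s for s in samples[i:i + frame])
                for i in range(0, len(samples) - frame + 1, frame)]
    onsets = []
    for k in range(1, len(energies)):
        prev = max(energies[k - 1], 1e-12)  # avoid division by zero in silence
        if energies[k] / prev >= threshold_ratio:
            onsets.append(k * frame)  # sample index where the new note starts
    return onsets

# silence followed by a sustained tone: one onset at sample 8
print(detect_onsets([0.0] * 8 + [1.0] * 8, frame=4))  # [8]
```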

  5. A Wireless Framework for Lecturers' Attendance System with Automatic Vehicle Identification (AVI) Technology

    Directory of Open Access Journals (Sweden)

    Emammer Khamis Shafter

    2015-10-01

    Full Text Available Automatic Vehicle Identification (AVI) technology is a type of Radio Frequency Identification (RFID) method that can significantly improve the efficiency of a lecturers' attendance system. It provides automatic data capture for attendance records using a mobile device equipped in the user's vehicle. The intent of this article is to propose a framework for an automatic lecturers' attendance system using AVI technology. The first objective of this work is to gather the requirements for the Automatic Lecturers' Attendance System and to represent them using UML diagrams. The second objective is to put forward a framework that provides guidelines for developing the system. A prototype has also been created as a pilot project.

  6. Possibility of Using non-standard DICOM Objects Stored in the Systems for Collecting and Storing Image Data (PACS)

    OpenAIRE

    Kudelka, Kamil

    2013-01-01

    This thesis is concerned with systems dedicated to picture archiving and communication (PACS), their architecture and workflow, the possibilities of DICOM objects, and the characteristics of the industry standard DICOM on which PACS is based. It works with the idea that these systems have already moved beyond the border of image data and increasingly serve as storage space for generic data saved in DICOM. Using DICOM RT objects as sources of billing information, which can be done fully automa...

  7. Real-time directional wave data collection

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G; Pednekar, P.S.

    which gives a success rate of 93.7%. At 4 locations 100% data had been collected. In 13 locations considered, 26 deployments and 5 adrift were encountered during the data collection period. The problems associated with wave measurements...

  8. A Diffractometer Control System with Automatic UB-matrix Refinement

    International Nuclear Information System (INIS)

    A four-axis diffractometer control system with automatic UB-matrix refinement has been developed. The system automatically scans and finds peak positions after a set of indices is specified, then calculates the UB-matrix. Combined with continuous scanning, the control system reduces the UB-matrix determination time. The system was developed on a message-exchanging control framework. Users can control not only the diffractometer but also other components, such as a monochromator and an insertion device, using human-readable abstract messages; therefore users can easily develop programs for their experiments. Results of test measurements at beamline BL46XU of SPring-8 indicate that the automatic UB-matrix refinement works well and reduces the UB-matrix refinement time to about one fourth.

  9. Automatic Road Centerline Extraction from Imagery Using Road GPS Data

    Directory of Open Access Journals (Sweden)

    Chuqing Cao

    2014-09-01

    Full Text Available Road centerline extraction from imagery constitutes a key element in numerous geospatial applications, which has been addressed through a variety of approaches. However, most of the existing methods are not capable of dealing with challenges such as different road shapes, complex scenes, and variable resolutions. This paper presents a novel method for road centerline extraction from imagery in a fully automatic approach that addresses the aforementioned challenges by exploiting road GPS data. The proposed method combines road color feature with road GPS data to detect road centerline seed points. After global alignment of road GPS data, a novel road centerline extraction algorithm is developed to extract each individual road centerline in local regions. Through road connection, road centerline network is generated as the final output. Extensive experiments demonstrate that our proposed method can rapidly and accurately extract road centerline from remotely sensed imagery.
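The seed-point detection step described above, keeping GPS-projected points whose pixel colour is close to an expected road colour, might be sketched like this. The function name, image layout and tolerance test are illustrative assumptions, not the authors' code:

```python
def seed_points(gps_pixels, image, road_rgb, tol=30):
    """Filter GPS-projected pixel positions by road colour similarity.

    gps_pixels -- (x, y) pixel coordinates of projected road GPS points
    image      -- 2D list of (r, g, b) tuples, indexed image[y][x]
    road_rgb   -- expected road colour, e.g. a grey tone
    tol        -- maximum per-channel deviation to accept a seed
    """
    seeds = []
    for x, y in gps_pixels:
        pixel = image[y][x]
        if all(abs(c - e) <= tol for c, e in zip(pixel, road_rgb)):
            seeds.append((x, y))
    return seeds

# toy image: one grey (road-like) pixel and one red (non-road) pixel
image = [[(120, 120, 120), (255, 0, 0)]]
print(seed_points([(0, 0), (1, 0)], image, (128, 128, 128)))  # [(0, 0)]
```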

  10. Automatic Generation of OWL Ontology from XML Data Source

    CERN Document Server

    Yahia, Nora; Ahmed, AbdelWahab

    2012-01-01

    The eXtensible Markup Language (XML) can be used as data exchange format in different domains. It allows different parties to exchange data by providing common understanding of the basic concepts in the domain. XML covers the syntactic level, but lacks support for reasoning. Ontology can provide a semantic representation of domain knowledge which supports efficient reasoning and expressive power. One of the most popular ontology languages is the Web Ontology Language (OWL). It can represent domain knowledge using classes, properties, axioms and instances for the use in a distributed environment such as the World Wide Web. This paper presents a new method for automatic generation of OWL ontology from XML data sources.
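A minimal sketch of one possible XML-to-OWL mapping: each distinct element name becomes an OWL class, and element nesting is (simplistically) read as a subclass hint. This mapping is an illustrative assumption, not the paper's method; in general, containment in XML does not imply subsumption in an ontology:

```python
import xml.etree.ElementTree as ET

def xml_to_owl_classes(xml_text):
    """Emit skeletal OWL class declarations (Turtle-like syntax) for each
    distinct element name in an XML document; nesting becomes a crude
    rdfs:subClassOf hint."""
    root = ET.fromstring(xml_text)
    lines = []
    seen = set()

    def walk(elem, parent_tag):
        if elem.tag not in seen:
            seen.add(elem.tag)
            lines.append(f":{elem.tag} a owl:Class .")
            if parent_tag:
                lines.append(f":{elem.tag} rdfs:subClassOf :{parent_tag} .")
        for child in elem:
            walk(child, elem.tag)

    walk(root, None)
    return "\n".join(lines)

print(xml_to_owl_classes("<library><book/><book/></library>"))
```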

  11. Remanufacturing system based on totally automatic MIG surfacing via robot

    Institute of Scientific and Technical Information of China (English)

    ZHU Sheng; GUO Ying-chun; YANG Pei

    2005-01-01

    Remanufacturing is a green systems engineering endeavour that conforms to the national sustainable development strategy. The system must adapt to a wide variety of waste machine parts while offering a short product cycle, low machining cost and high product quality. Each step of the remanufacturing system, from the initial scanning to the completion of the welding, was investigated. Aiming at a remanufacturing system based on fully automatic MIG surfacing via robot, advanced information technology, remanufacturing technology and management were applied, with control of the pretreatment and optimization to minimize the remanufacturing time and to remanufacture a variety of end products. The main steps are: 1) using the visual sensor installed at the end of the robot to rapidly acquire the outline data of the machined part, and pretreating the data; 2) rebuilding the curved surface based on the outline data and the integrated CAD material object model; 3) building the remanufacturing model based on the CAD material object model and planning the remanufacturing process; and 4) accomplishing the remanufacture of the machined part by MIG surfacing.

  12. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  13. Automatic control system of the radiometric system for inspection of large-scale vehicles and cargoes

    International Nuclear Information System (INIS)

    The automatic control system (ACS) is intended to control the equipment of the radiometric inspection system in the normal operating modes as well as during the preventive maintenance, maintenance/repair and adjustment works; for acquisition of the data on the status of the equipment, reliable protection of the personnel and equipment, acquisition, storage and processing of the results of operation and to ensure service maintenance.

  14. 40 CFR 51.365 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Data collection. 51.365 Section 51.365 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR....365 Data collection. Accurate data collection is essential to the management, evaluation,...

  15. Innovative Data Collection Strategies in Qualitative Research

    Science.gov (United States)

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Collins, Kathleen M. T.

    2010-01-01

    This article provides an innovative meta-framework comprising strategies designed to guide qualitative data collection in the 21st century. We present a meta-framework comprising strategies for collecting data from interviews, focus groups, observations, and documents/material culture. We present a template for collecting nonverbal data during…

  16. Automatic classification of oranges using image processing and data mining techniques

    OpenAIRE

    Mercol, Juan Pablo; Gambini, María Juliana; Santos, Juan Miguel

    2008-01-01

    Data mining is the discovery of patterns and regularities from large amounts of data using machine learning algorithms. This can be applied to object recognition using image processing techniques. In fruit and vegetable production lines, quality assurance is done by trained people who inspect the fruits while they move on a conveyor belt, and classify them into several categories based on visual features. In this paper we present an automatic orange classification system, which us...

  17. Truck Roll Stability Data Collection and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, SS

    2001-07-02

    The principal objective of this project was to collect and analyze vehicle and highway data that are relevant to the problem of truck rollover crashes, and in particular to the subset of rollover crashes that are caused by the driver error of entering a curve at a speed too great to allow safe completion of the turn. The data are of two sorts: vehicle dynamic performance data, and highway geometry data as revealed by vehicle behavior in normal driving. Vehicle dynamic performance data are relevant because the roll stability of a tractor trailer depends both on inherent physical characteristics of the vehicle and on the weight and distribution of the particular cargo that is being carried. Highway geometric data are relevant because the set of crashes of primary interest to this study are caused by lateral acceleration demand in a curve that exceeds the instantaneous roll stability of the vehicle. An analysis of data quality requires an evaluation of the equipment used to collect the data because the reliability and accuracy of both the equipment and the data could profoundly affect the safety of the driver and other highway users. Therefore, a concomitant objective was an evaluation of the performance of the set of data-collection equipment on the truck and trailer. The objective concerning evaluation of the equipment was accomplished, but the results were not entirely positive. Significant engineering apparently remains to be done before a reliable system can be fielded. Problems were identified with the trailer-to-tractor fiber optic connector used for this test. In an over-the-road environment, the communication between the trailer instrumentation and the tractor must be dependable. In addition, the computer in the truck must be able to withstand the rigors of the road. The major objective, data collection and analysis, was also accomplished. Using data collected by instruments on the truck, a "bad-curve" database can be generated. Using
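The physics behind the crash subset of interest, lateral acceleration demand v²/r in a curve versus the vehicle's roll stability, can be sketched as a safe-speed bound. The static stability threshold value used in the example is an illustrative assumption, not a figure from the report:

```python
G = 9.81  # gravitational acceleration, m/s^2

def max_safe_speed(radius_m, sst_g):
    """Highest speed (m/s) at which lateral acceleration v^2/r stays
    below the vehicle's static stability threshold `sst_g` (in units of g).

    Solves v^2 / r = sst_g * G for v.
    """
    return (sst_g * G * radius_m) ** 0.5

# hypothetical loaded tractor-trailer, SST ~0.35 g, entering a 100 m radius curve
print(round(max_safe_speed(100, 0.35), 1))  # 18.5 (m/s, roughly 67 km/h)
```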

  18. Automatic combustion control system for coke oven battery

    Energy Technology Data Exchange (ETDEWEB)

    Kasaoka, S.; Terazono, K.; Hashimoto, K.; Matsuda, H.

    1984-01-01

    This paper outlines an automatic coke battery temperature control system. The temperature sensors used, their number and location are described. There are three control systems: the combustion control system, temperature detection and heat control system, and the air volume and excess air-ratio control. The system for setting the battery temperature is also described. The overall system has achieved substantial reduction in coking heat consumption. 3 references.

  19. Can an online clinical data management service help in improving data collection and data quality in a developing country setting?

    Directory of Open Access Journals (Sweden)

    Wildeman Maarten A

    2011-08-01

    Full Text Available Background Data collection by Electronic Medical Record (EMR) systems has been proven helpful in data collection for scientific research and in improving healthcare. For a multi-centre trial in Indonesia and the Netherlands, a web-based system was selected to enable all participating centres to easily access data. This study assesses whether the introduction of a Clinical Trial Data Management service (CTDMS) composed of electronic Case Report Forms (eCRFs) can result in effective data collection and treatment monitoring. Methods Data items entered were checked automatically for inconsistencies when submitted online. The data were divided into primary and secondary data items. We analysed both the total number of errors and the change in error rate, for both primary and secondary items, over the first five months of the trial. Results In the first five months, 51 patients were entered. The primary data error rate was 1.6%, whilst that for secondary data was 2.7%, against acceptable error rates for analysis of 1% and 2.5% respectively. Conclusion The presented analysis shows that five months after the introduction of the CTDMS, the primary and secondary data error rates reflect acceptable levels of data quality. Furthermore, these error rates were decreasing over time. The digital nature of the CTDMS, as well as the online availability of its data, gives fast and easy insight into adherence to treatment protocols. As such, the CTDMS can serve as a tool to train and educate medical doctors and can improve treatment protocols.
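The error-rate bookkeeping described in the Methods can be sketched as follows. The item counts are illustrative values chosen only to reproduce the reported 1.6% and 2.7% rates, not the trial's actual item totals:

```python
def error_rate(items):
    """Percentage of submitted data items flagged as inconsistent.

    `items` is a list of booleans: True for a clean item, False for an
    item that failed the automatic consistency check on submission.
    """
    errors = sum(1 for ok in items if not ok)
    return 100.0 * errors / len(items)

# hypothetical counts matching the reported rates: 16 and 27 errors per 1000 items
primary = [True] * 984 + [False] * 16
secondary = [True] * 973 + [False] * 27
print(error_rate(primary), error_rate(secondary))  # 1.6 2.7
```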

  20. Information Collection System of Crop Growth Environment Based on the Internet of Things

    Institute of Scientific and Technical Information of China (English)

    Hua; YU; Guangyu; ZHANG; Ningbo; LU

    2013-01-01

    Based on Internet of Things technology, and addressing the large data volumes and difficult real-time transmission involved in acquiring crop growth environment data, this paper designs an information collection system for the crop growth environment. Using a range-free location mechanism to define node positions and the GEAR routing mechanism, it solves the problems of node location, routing protocol application and so on. The system achieves accurate, automatic, real-time collection, aggregation and transmission of crop growth environment information, and can automate agricultural production to the maximum extent.
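The aggregation step, combining raw sensor-node samples before uplink to reduce transmitted volume, might look like this. The field names and the simple averaging policy are assumptions for illustration, not the paper's design:

```python
from statistics import mean

def aggregate(readings):
    """Per-field aggregation of raw node samples before transmission.

    `readings` maps a sensor field name to the list of raw samples
    collected since the last uplink; one averaged value is sent per field.
    """
    return {name: round(mean(values), 2) for name, values in readings.items()}

# hypothetical samples from one node's reporting interval
print(aggregate({"soil_temp_c": [18.2, 18.4, 18.3], "humidity_pct": [61, 63, 62]}))
```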

  1. Water quality, meteorological, and nutrient data collected by the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) from January 1, 1995 to August 1, 2011 (NCEI Accession 0052765)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 26...

  2. Water quality, meteorological, and nutrient data collected by the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) from January 1, 1995 to August 1, 2011 (NODC Accession 0052765)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 26...

  3. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed the PAM (PF Automated Mounting system) sample exchange robots at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) of the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.

  4. Tritium monitor and collection system

    Energy Technology Data Exchange (ETDEWEB)

    Baker, J.D.; Wickham, K.L.; Ely, W.E.; Tuggle, D.G.; Meikrantz, D.H.; Grafwaller, E.G.; Maltrud, H.R.; Bourne, G.L.

    1991-03-26

    This system measures tritium on-line and collects tritium from a flowing inert gas stream. It separates the tritium from other non-hydrogen isotope contaminating gases, whether radioactive or not. The collecting portion of the system is constructed of various zirconium alloys called getters. These alloys adsorb tritium in any of its forms at one temperature and at a higher temperature release it as a gas. The system consists of four on-line getters and heaters, two ion chamber detectors, two collection getters, and two guard getters. When the incoming gas stream is valved through the on-line getters, 99.9% of it is adsorbed and the remainder continues to the guard getter where traces of tritium not collected earlier are adsorbed. The inert gas stream then exits the system to the decay chamber. Once the on-line getter has collected tritium for a predetermined time, it is valved off and the next online getter is valved on. Simultaneously, the first getter is heated and a pure helium purge is employed to carry the tritium from the getter. The tritium loaded gas stream is then routed through an ion chamber which measures the tritium activity. The ion chamber effluent passes through a collection getter that readsorbs the tritium and is removable from the system once it is loaded and is then replaced with a clean getter. Prior to removal of the collection getter, the system switches to a parallel collection getter. The effluent from the collection getter passes through a guard getter to remove traces of tritium prior to exiting the system. The tritium loaded collection getter, once removed, is analyzed by liquid scintillation techniques. The entire sequence is under computer control except for the removal and analysis of the collection getter.
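The getter rotation described above (each on-line getter adsorbs for a predetermined time, then is valved off, heated and purged while the next one collects) can be sketched as a small state machine. The class, method names and timing interface are illustrative, not the actual control software:

```python
class GetterSequencer:
    """Rotate through on-line getters: each adsorbs for `dwell_s` seconds,
    then is valved off for regeneration while the next getter collects."""

    def __init__(self, n_getters, dwell_s):
        self.n = n_getters
        self.dwell = dwell_s
        self.active = 0    # index of the getter currently collecting
        self.t0 = 0.0      # time the active getter was valved on

    def tick(self, now):
        """Advance to the next getter once the dwell time has elapsed.

        Returns the index of the getter that must now be heated and purged,
        or None if the active getter keeps collecting.
        """
        if now - self.t0 >= self.dwell:
            done = self.active
            self.active = (self.active + 1) % self.n  # valve on the next getter
            self.t0 = now
            return done
        return None
```

With four getters and a 10 s dwell, the sequencer hands getter 0 off for regeneration at t = 10 s while getter 1 takes over, mirroring the alternation of collection and purge cycles described in the abstract.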

  5. Tritium monitor and collection system

    Science.gov (United States)

    Bourne, Gary L.; Meikrantz, David H.; Ely, Walter E.; Tuggle, Dale G.; Grafwallner, Ervin G.; Wickham, Keith L.; Maltrud, Herman R.; Baker, John D.

    1992-01-01

    This system measures tritium on-line and collects tritium from a flowing inert gas stream. It separates the tritium from other non-hydrogen isotope contaminating gases, whether radioactive or not. The collecting portion of the system is constructed of various zirconium alloys called getters. These alloys adsorb tritium in any of its forms at one temperature and at a higher temperature release it as a gas. The system consists of four on-line getters and heaters, two ion chamber detectors, two collection getters, and two guard getters. When the incoming gas stream is valved through the on-line getters, 99.9% of it is adsorbed and the remainder continues to the guard getter where traces of tritium not collected earlier are adsorbed. The inert gas stream then exits the system to the decay chamber. Once the on-line getter has collected tritium for a predetermined time, it is valved off and the next on-line getter is valved on. Simultaneously, the first getter is heated and a pure helium purge is employed to carry the tritium from the getter. The tritium loaded gas stream is then routed through an ion chamber which measures the tritium activity. The ion chamber effluent passes through a collection getter that readsorbs the tritium and is removable from the system once it is loaded and is then replaced with a clean getter. Prior to removal of the collection getter, the system switches to a parallel collection getter. The effluent from the collection getter passes through a guard getter to remove traces of tritium prior to exiting the system. The tritium loaded collection getter, once removed, is analyzed by liquid scintillation techniques. The entire sequence is under computer control except for the removal and analysis of the collection getter.

  6. CrespoDynCoopNet DATA Collections

    OpenAIRE

    Crespo Solana, Ana; Sánchez-Crespo Camacho, Juan Manuel; Maestre Martínez, Roberto

    2010-01-01

    The collected data are stored into a Microsoft Access® database that has been designed to be physically integrated into a GIS system. The main structure of this initial database is built around the main table, named ‘AGENTS’, in which all biographic data related to the individual agents are entered taking into account the various ‘worlds’ each agent belongs to – social, economic etc. An individual study and classification has been carried out for each agent; then an attempt has been made to u...

  7. Science data collection with polarimetric SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Woelders, Kim; Madsen, Søren Nørvang

    1996-01-01

    Discusses examples of the use of polarimetric SAR in a number of Earth science studies. The studies are presently being conducted by the Danish Center for Remote Sensing. A few studies of the European Space Agency's EMAC programme are also discussed. The Earth science objectives are presented, and the potential of polarimetric SAR is discussed and illustrated with data collected by the Danish airborne EMISAR system during a number of experiments in 1994 and 1995. The presentation will include samples of data acquired for the different studies.

  8. Automatic operation of radioactive solid waste interim storage system in nuclear power plants

    International Nuclear Information System (INIS)

    Mitsubishi Heavy Industries, Ltd. (MHI) has developed an automatic system to transport and store on site the low-level radioactive wastes generated in nuclear power plants. This system consists mainly of palletizing equipment, unmanned forklift trucks and a data control system. The system has realized unmanned, labour-saving operation in storage facilities with minimized radiation exposure and maximized operation and storage efficiencies. As for the unmanned forklift trucks, an inertial guidance system, which is superior in performance, maintainability and economy, has been newly developed and applied in addition to the previously implemented wire-guided forklift trucks. Moreover, an automatic drum data reader using an image processing technique and fully automatic inspection equipment for drummed wastes, which are in the final stages of their development, are introduced. (author)

  9. Automatic Voltage Control (AVC) System under Uncertainty from Wind Power

    DEFF Research Database (Denmark)

    Qin, Nan; Abildgaard, Hans; Flynn, Damian; Rather, Zakir Hussain; Bak, Claus Leth; Chen, Zhe

    2016-01-01

    An automatic voltage control (AVC) system maintains the voltage profile of a power system in an acceptable range and minimizes the operational cost by coordinating the regulation of controllable components. Typically, all of the parameters in the optimization problem are assumed to be certain and...

  10. Automatic calibration system for VENUS lead glass counters

    International Nuclear Information System (INIS)

    An automatic calibration system for the VENUS lead glass counters has been constructed. It consists of a moving table, position sensors, control electronics and a master minicomputer (a DEC micro-11). The system has operated well for six months, and one third of the VENUS lead glass counters have been calibrated. (author)

  11. HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation

    Science.gov (United States)

    Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.

    2006-03-01

    As a governmental regulation, Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals who are living or deceased. HIPAA requires security services supporting implementation features: Access control; Audit controls; Authorization control; Data authentication; and Entity authentication. These controls, which proposed in HIPAA Security Standards, are Audit trails here. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation. Or they can be used forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve the high and continuous availability, we design the HIPAA-Compliant Automatic Monitoring System for RIS-Integrated PACS operation. The system consists of two parts: monitoring agents running in each PACS component computer and a Monitor Server running in a remote computer. Monitoring agents are deployed on all computer nodes in RIS-Integrated PACS system to collect the Audit trail messages defined by the Supplement 95 of the DICOM standard: Audit Trail Messages. Then the Monitor Server gathers all audit messages and processes them to provide security information in three levels: system resources, PACS/RIS applications, and users/patients data accessing. Now the RIS-Integrated PACS managers can monitor and control the entire RIS-Integrated PACS operation through web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-Integrated PACS Operation, and gives the preliminary results performed by this monitoring system on a clinical RIS-integrated PACS.
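
    The agent-to-server flow described above can be sketched as follows. The message fields are illustrative; a real system would collect DICOM Supplement 95 audit trail messages in their XML form, not this simplified dictionary form.

```python
# Minimal sketch of the Monitor Server aggregation step: audit messages
# gathered from monitoring agents are grouped into the three reporting
# levels mentioned in the abstract (system resources, PACS/RIS
# applications, users/patients data accessing).
from collections import defaultdict

def gather(messages):
    levels = defaultdict(list)
    for msg in messages:
        levels[msg["level"]].append(msg["event"])
    return dict(levels)

report = gather([
    {"level": "system", "event": "disk usage 91%"},
    {"level": "application", "event": "DICOM store failed"},
    {"level": "access", "event": "user X viewed patient Y study"},
])
```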

  12. Automatic power distribution system for Okinawa Electric Power Co.; Okinawa Denryoku (kabu) nonyu haiden jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-02-29

    The open distributed automatic power distribution systems were delivered to the Naha and Gushikawa branches of Okinawa Electric Power Co. This system adopts such latest technologies as object-oriented design. Its features are as follows: (1) Possible parallel operation by every branch and business office by switching operation priority between the branch and business office in the case of multiple accidents, (2) Possible free console operation for any businesses regardless of the other console conditions, (3) Automatic decision of power supply by mobile power vehicle for precise power interruption control, (4) Immediate display of the work planning system including data maintenance and operation procedures until the prearranged working date, and (5) Possible manned backup operation at system down of the server. (translated by NEDO)

  13. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. The results showed that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  14. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. The results showed that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  15. Evolutionary synthesis of automatic classification on astroinformatic big data

    Science.gov (United States)

    Kojecky, Lumir; Zelinka, Ivan; Saloun, Petr

    2016-06-01

    This article describes initial experiments with a new approach to the automatic identification of Be and B[e] star spectra in large archives. With the enormous amount of such data, it is no longer feasible to analyze them using classical approaches. We introduce an evolutionary synthesis of the classification by means of analytic programming, a method of symbolic regression. With this method, we synthesize the mathematical formulas that best approximate chosen samples of the stellar spectra. Each spectrum is then assigned to the category whose formula differs least from that spectrum. The results show that classification of stellar spectra by means of analytic programming is able to identify different shapes of the spectra.
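
    The classification rule described above, assigning a spectrum to the category whose synthesized formula fits it best, can be sketched as follows. The formulas here are invented stand-ins; in the paper they would be synthesized by analytic programming.

```python
# Illustrative sketch: pick the class whose formula has the lowest
# sum-of-squares difference from the observed spectrum samples.
def classify(spectrum, formulas):
    def residual(f):
        return sum((f(x) - y) ** 2 for x, y in enumerate(spectrum))
    return min(formulas, key=lambda name: residual(formulas[name]))

# Hypothetical per-class formulas (made up for demonstration):
formulas = {"Be": lambda x: 2.0 * x, "B[e]": lambda x: x ** 2}
label = classify([0.0, 2.1, 3.9, 6.2], formulas)  # samples close to 2x
```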

  16. Modular system for automatic control of buildings

    OpenAIRE

    Tavčar, Marko

    2015-01-01

    In this thesis, we describe a modular system for the control of buildings with a wide variety of actuators. We primarily focus on the control of lamps and motorized shades, and add general support for all sorts of sensors as well. The human-machine interface consists of a mobile application for the Android operating system, which connects to the control system. We present and explain the system architecture, including a description of the hardware and software solutions. We designed the...

  17. Reliability of the TJ-II power supply system: Collection and analysis of the operational experience data

    International Nuclear Information System (INIS)

    During a TJ-II pulse, the provision of magnetic fields requires a total amount of power exceeding 80 MVA. This power is supplied by a 132 MVA flywheel generator (15 kV output voltage, 80-100 Hz output frequency) and the related motor, transformers, breakers, rectifiers, regulators, protections, busbars, connections, etc. Failure data of these main components have been collected, identified and processed, including information on failure modes and, where possible, causes of the failures. Main statistical values for failure rates for the period from May 1998 to December 2004 have been calculated and are ready to be compared with those of the International Fusion Component Failure Rate Database (FCFR-DB)

  18. Design of automatic leveling and centering system of theodolite

    Science.gov (United States)

    Liu, Chun-tong; He, Zhen-Xin; Huang, Xian-xiang; Zhan, Ying

    2012-09-01

    To realize theodolite automation and improve azimuth angle measurement, a theodolite automatic leveling and centering system with leveling error compensation is designed, covering the system solution, key component selection, the mechanical structure of leveling and centering, and the system software solution. The redesigned leveling feet are driven by DC servo motors, and an electronic control centering device is installed. Using high-precision tilt sensors as horizontal skew detection sensors ensures the effectiveness of the leveling error compensation. The aiming round mark center is located using digital image processing through an area-array CCD, and leveling measurement precision can reach the pixel level, which makes accurate centering of the theodolite possible. Finally, experiments are conducted using the automatic leveling and centering system of the theodolite. The results show the leveling and centering system can operate automatically with a high centering accuracy of 0.04 mm. The measurement precision of the orientation angle after leveling error compensation is improved compared with the traditional method. The automatic leveling and centering system of the theodolite can satisfy the requirements of measuring precision and automation.
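
    The tilt-sensor-driven leveling loop described above can be sketched as a simple closed-loop controller. Everything here is an assumption for illustration: the sensor interface, the unit-gain proportional correction, and the tolerance value are all invented.

```python
# Hypothetical closed-loop leveling: drive the leveling feet until the
# two-axis tilt sensor reading falls below a tolerance.
def level(read_tilt, move_foot, tol=0.001, max_steps=100):
    for _ in range(max_steps):
        tx, ty = read_tilt()
        if abs(tx) < tol and abs(ty) < tol:
            return True          # within tolerance: instrument is leveled
        move_foot("x", -tx)      # proportional correction on each axis
        move_foot("y", -ty)
    return False

# Simulated "hardware" standing in for the tilt sensor and servo motors:
state = {"tx": 0.5, "ty": -0.3}
ok = level(lambda: (state["tx"], state["ty"]),
           lambda axis, d: state.__setitem__("t" + axis, state["t" + axis] + d))
```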

  19. A Study of Applications of Multiagent System Specifications and the Key Techniques in Automatic Abstracts System

    Institute of Scientific and Technical Information of China (English)

    HU Shun-geng; ZHONG Yi-xin

    2001-01-01

    In this thesis, multiagent system specifications, multiagent system architectures, agent communication languages and agent communication protocols, and automatic abstracting based on multiagent technologies are studied. Some problems concerning the design and realization of automatic abstracting systems based on multiagent technologies are studied as well. Chapter 1 shows the significance and objectives of the thesis, summarizes its main contents, and presents its innovations. Some basic concepts of agents and multiagent systems are studied in Chapter 2. The definitions of agents and multiagent systems are given, and the theory, technologies and applications of multiagent systems are summarized; furthermore, some important research trends in multiagent systems are set forward. Multiagent system specifications are studied in Chapter 3. MAS/KIB, a multiagent system specification, is built using mental states such as K (Know), B (Belief) and I (Intention); its grammar and semantics are discussed, axioms and inference rules are given, and some properties are investigated. We also compare MAS/KIB with other existing specifications. MAS/KIB has the following characteristics: (1) each agent has its own world outlook; (2) there is no global data in the system; (3) processes of state changes are used as indexes to systems; (4) it has the characteristics of both temporal logic and dynamic logic; and (5) interactive actions are included. The architectures of multiagent systems are studied in Chapter 4. First, we review some typical multiagent system architectures: agent network architecture, agent federated architecture, agent blackboard architecture, and the Foundation for Intelligent Physical Agents (FIPA) architecture. For the first time, we set forward and study the layering and partitioning models of multiagent system architectures, organizing architecture models, and the interoperability architecture model of multiagent systems. Chapter 5 studies agent communication lan

  20. Development and application of an automatic system for measuring the laser camera

    International Nuclear Information System (INIS)

    Objective: To provide an automatic system for measuring the imaging quality of laser cameras, implemented as an automatic measurement and analysis system. Methods: On a dedicated imaging workstation (SGI 540), the procedure was written in the Matlab language. An automatic measurement and analysis system for laser camera imaging quality was developed according to the laser camera imaging quality measurement standard of the International Electrotechnical Commission (IEC). The measurement system used digital signal processing theory, was based on the characteristics of digital images, and performed the automatic measurement and analysis of the laser camera using the affiliated sample pictures of the laser camera. Results: All imaging quality parameters of a laser camera, including the H-D and MTF curves, optical density at low, middle and high resolution, various kinds of geometric distortion, maximum and minimum density, as well as the dynamic range of the gray scale, could be measured by this system. The system was applied to measure laser cameras in 20 hospitals in Beijing. The measuring results showed that the system could provide objective and quantitative data and accurately evaluate the imaging quality of laser cameras, as well as correct the results of manual measurement based on the affiliated sample pictures of the laser camera. Conclusion: The automatic measuring system is an effective and objective tool for testing the quality of laser cameras, and it lays a foundation for future research

  1. Midterm Report on Data Collection

    DEFF Research Database (Denmark)

    Gelsing, Lars; Linde, Lisbeth Tved

    In the MERIPA project this report concerns data availability in order to make future cluster and network analyses in the MERIPA regions. At the same time, discussions about methodology are being started.

  2. Shift Control System of Heavy-duty Vehicle Automatic Transmission

    OpenAIRE

    Yan Zhang; Wenxing Ma; Xuesong Li

    2013-01-01

    A shifting operation system for a heavy-duty vehicle hydrodynamic mechanical automatic transmission was designed, a mathematical model of its simplified hydraulic system was established, and a simulation model of the shifting operation system was built with AMESim. Simulation experiments were carried out, and oil pressure curves for each clutch hydraulic cylinder were obtained for forward and reverse gear signals. The simulation results show that the shifting operating system meets the design ...

  3. BladePro: 3D Automatic Grade Control System

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    Currently most of the methods to control grading equipment to achieve the required production accuracy are based on conventional surveying, such as grade stakes and stringlines. The BladePro System, developed by Spectra Precision in 1998, is a dual automatic blade control system that uses advanced computer technology and user-friendly operator controls. This system provides contractors a three-dimensional machine control system for roads, railway beds and airport runway construction.

  4. Microcontroller based automatic liquid poison addition control system

    International Nuclear Information System (INIS)

    Microcontrollers are finding increasing applications in instrumentation where complex digital circuits can be substituted by a compact and simple circuit, thus enhancing reliability. In addition, intelligence and flexibility can be incorporated. For applications not requiring a large amount of read/write memory (RAM), microcontrollers are ideally suited since they contain programmable memory (EPROM), parallel input/output lines, data memory, programmable timers and serial interface ports in one chip. This paper describes the design of an automatic liquid poison addition control system (ALPAS) using Intel's 8-bit microcontroller 8751, which is used to generate complex timing control sequence signals for liquid poison addition to the moderator in a nuclear reactor. ALPAS monitors digital inputs coming from the protection system and regulating system of a nuclear reactor and provides control signals for liquid poison addition for long-term safe shutdown of the reactor after a reactor trip, and helps the regulating system to reduce the power of the reactor during operation. Special hardware and software features have been incorporated to improve performance and fault detection. (author)
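
    A timed control sequence of the kind the abstract describes can be sketched as data: a trip input mapped to a list of delayed valve and pump commands. All step names and delays below are invented for illustration and are not from the ALPAS design.

```python
# Hypothetical timed control sequence triggered by a reactor-trip input.
# Each entry is (delay_seconds, command); a microcontroller would emit
# these commands from its programmable timers.
def poison_sequence(trip_signal):
    if not trip_signal:
        return []  # no trip: no poison addition commands
    return [
        (0, "open poison tank valve"),
        (5, "start addition pump"),
        (60, "verify flow established"),
    ]

steps = poison_sequence(trip_signal=True)
```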

  5. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA)

    OpenAIRE

    Veena G.S; Chandrika Prasad; Khaleel K

    2013-01-01

    The proposed work aims to create a smart application camera, with the intention of eliminating the need for a human presence to detect any unwanted sinister activities, such as theft in this case. Spread across the campus, are certain valuable biometric identification systems at arbitrary locations. The application monitors these systems (hereafter referred to as “object”) using our smart camera system based on an OpenCV platform. By using OpenCV Haar Training, employing the Vio...

  6. Automatic Extraction of Mangrove Vegetation from Optical Satellite Data

    Science.gov (United States)

    Agrawal, Mayank; Sushma Reddy, Devireddy; Prasad, Ram Chandra

    2016-06-01

    Mangroves, the intertidal halophytic vegetation, form one of the most significant and diverse ecosystems in the world. They protect the coast from sea erosion and other natural disasters such as tsunamis and cyclones. In view of their increased destruction and degradation in the current scenario, mapping of this vegetation is a priority. Globally, researchers have mapped mangrove vegetation using visual interpretation, digital classification, or a combination of both (hybrid approaches), with varied spatial and spectral data sets. In the recent past, techniques have been developed to extract this coastal vegetation automatically using various algorithms. In the current study we tried to delineate mangrove vegetation using LISS III and Landsat 8 data sets for selected locations of the Andaman and Nicobar islands. Towards this we attempted a segmentation method, which characterizes the mangrove vegetation based on tone and texture, and a pixel-based classification method, where mangroves are identified based on their pixel values. The results obtained from both approaches are validated using maps available for the selected regions, with good delineation accuracy. The main focus of this paper is the simplicity of the methods and the availability of the data on which they are applied, as these data (Landsat) are readily available for many regions. Our methods are very flexible and can be applied to any region.
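
    A minimal sketch of the pixel-based idea, labeling a pixel from its spectral values, could look like the following. The NDVI-style index and the threshold range are assumptions for illustration, not values from the study.

```python
# Illustrative pixel-based rule: flag a pixel as candidate mangrove when a
# vegetation index computed from its NIR and red reflectances falls inside
# an assumed class-specific range.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def is_mangrove(nir, red, lo=0.4, hi=0.9):
    return lo <= ndvi(nir, red) <= hi

flag = is_mangrove(nir=0.55, red=0.12)  # healthy dense canopy reflectances
```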

  7. Meeting Expanding Needs to Collect Food Intake Specificity: The Nutrition Data System for Research (NDS-R)

    Science.gov (United States)

    VanHeel, Nancy; Pettit, Janet; Rice, Barbara; Smith, Scott M.

    2003-01-01

    Food and nutrient databases are populated with data obtained from a variety of sources including USDA Reference Tables, scientific journals, food manufacturers and foreign food tables. The food and nutrient database maintained by the Nutrition Coordinating Center (NCC) at the University of Minnesota is continually updated with current nutrient data and continues to be expanded with additional nutrient fields to meet diverse research endeavors. Data are strictly evaluated for reliability and relevance before incorporation into the database; however, the values are obtained from various sources and food samples rather than from direct chemical analysis of specific foods. Precise nutrient values for specific foods are essential to the nutrition program at the National Aeronautics and Space Administration (NASA). Specific foods to be included in the menus of astronauts are chemically analyzed at the Johnson Space Center for selected nutrients. A request from NASA for a method to enter the chemically analyzed nutrient values for these space flight food items into the Nutrition Data System for Research (NDS-R) software resulted in modification of the database and interview system for use by NASA, with further modification to extend the method for related uses by more typical research studies.

  8. A ORACLE-based system for data collection, storage and analysis of main equipment load factors in NPPs and TPPs

    International Nuclear Information System (INIS)

    This data base is developed by the National Electricity Company, Sofia (BG) as an aid to supervision, analysis and administrative decision making in a variety of operational situations in NPPs and TPPs. As major indicators of the equipment condition the following primary data are stored: steam or electricity production per month; operation hours per month; equipment stand-by outages; planned outages; unplanned permitted maintenance outages; unplanned emergency maintenance outages; number of outages of the unit per month. These data cover the period from the putting of the corresponding equipment into operation till the present moment, i.e. about 32 years. The data up to 1990 are annual, and for the last three years monthly. Based on these primary data, the following quantities are calculated: average capacity; average load factors; operation time factors - total and accounting for the planned and the permitted unplanned outages; unpermitted outage factors - total and accounting for the planned and the permitted outages. All the factors are calculated at the user's request for a chosen time period, by summing up the major indicators (production, operation hours and various outages) for the given period. The system operates on an IBM 4341 under VM/SP and DB ORACLE V.5. The input is entered directly from the TPPs and NPPs by telex lines from PCs, operating also as telex machines, into the mainframe of Energokibernetika Ltd. The data are available to all authorised users from local terminals or PCs connected to the computer by synchronous or asynchronous lines. A system for data transmission to remote users along commutated telephone lines is also developed. (R. Ts.)
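
    The derived quantities described above follow directly from the stored primary data. A sketch of two of them, with illustrative field names and numbers:

```python
# Average load factor: actual production divided by the maximum possible
# production (rated capacity times hours in the period).
def load_factor(production_mwh, rated_mw, period_hours):
    return production_mwh / (rated_mw * period_hours)

# Operation time factor: hours actually operated over hours in the period.
def operation_time_factor(operation_hours, period_hours):
    return operation_hours / period_hours

lf = load_factor(production_mwh=360_000, rated_mw=1000, period_hours=720)
otf = operation_time_factor(650, 720)
```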

  9. Analysis on the Influence of Automatic Station Temperature Data on the Sequence Continuity of Historical Meteorological Data

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data which were measured by the automatic meteorological station and the corresponding artificial observation data during January-December in 2001, the monthly average, maximum and minimum temperatures in the automatic station were compared with the corresponding artificial observation temperature data in the parallel observation peri...

  10. Design and development of real time data push in web-based automatic irrigation control system%基于Web的自动灌溉控制系统数据实时推送设计与开发

    Institute of Scientific and Technical Information of China (English)

    李淑华; 郝星耀; 周清波; 潘瑜春

    2015-01-01

    The automatic irrigation control system based on the web is a main trend in current water-saving technology development. In order to provide personalized irrigation control schemes and precise water metering, the system needs high real-time data transmission performance. The real-time performance of web applications is currently poor and difficult to match to the needs of accurate irrigation control. Aiming at this problem, in this paper, the structure and bottleneck of real-time data transmission in a web-based automatic irrigation control system were analyzed, and a data push scheme to improve real-time performance was proposed. Based on the observer pattern, the data push modes between the data layer and logic layer, and between the logic layer and presentation layer, were specifically designed. In the former data transmission process, the observed object is a database table and the observer is a Web server monitoring program. After data is inserted into the data table, the database immediately triggers a stored procedure to notify the relevant Web service program and updates the data subject. In the latter data transmission process, the observed object is a program object running on the Web server, and the observer is a client program object running in the browser. Because the Web service program cannot directly initiate a data connection to the client program, in order to implement the observer pattern it is essential to establish a real-time two-way data connection while the client program loads. Then, by subscribing to a group of data subjects, the client program can receive real-time data pushes as soon as the data subjects are updated. The connection between client and Web server is established and maintained by a client connection request; a streaming connection is then initiated by the client, through which the Web server streams data down to the client with no polling overhead.
However, these client-to-server messages are not sent over the streaming connection; instead an
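
    The observer-pattern push between layers described above can be sketched in a few lines: when a row is inserted, registered observers are notified immediately instead of polling. The class and field names are illustrative.

```python
# Minimal observer-pattern sketch of the data-layer -> logic-layer push:
# inserting a row triggers immediate notification of all subscribers.
class DataTable:
    def __init__(self):
        self._rows, self._observers = [], []

    def subscribe(self, callback):
        self._observers.append(callback)

    def insert(self, row):
        self._rows.append(row)
        for notify in self._observers:  # push, no polling
            notify(row)

received = []
table = DataTable()
table.subscribe(received.append)
table.insert({"sensor": "soil_moisture", "value": 31.2})
```

    In the paper's design the same shape appears twice: a database trigger notifying the Web server, and the Web server notifying browser clients over a streaming connection.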

  11. New Approaches to Demographic Data Collection

    OpenAIRE

    Treiman, Donald J.; Lu, Yao; Qi, Yaqiang

    2012-01-01

    As population scientists have expanded the range of topics they study, increasingly considering the interrelationship between population phenomena and social, economic, and health conditions, they have expanded the kinds of data collected and have brought to bear new data collection techniques and procedures, often borrowed from other fields. These new approaches to demographic data collection are the concern of this essay. We consider three main topics: new developments in sampling procedure...

  12. Automatic blood pressure measuring system (M092)

    Science.gov (United States)

    Nolte, R. W.

    1977-01-01

    The Blood Pressure Measuring System is described. It measures blood pressure by the noninvasive Korotkoff sound technique on a continual basis as physical stress is imposed during experiment M092, Lower Body Negative Pressure, and experiment M171, Metabolic Activity.

  13. Automatic blood pressure measuring system (M091)

    Science.gov (United States)

    1977-01-01

    The Leg Volume Measuring System is used to measure leg calf girth changes that occur during exposure to lower body negative pressure as a result of pooling of blood and other fluids in the lower extremities.

  14. Automatic calorimetry system monitors RF power

    Science.gov (United States)

    Harness, B. W.; Heiberger, E. C.

    1969-01-01

    Calorimetry system monitors the average power dissipated in a high power RF transmitter. Sensors measure the change in temperature and the flow rate of the coolant, while a multiplier computes the power dissipated in the RF load.
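
    The calorimetric relation implied above is that dissipated power equals coolant flow rate times specific heat times the temperature rise. A sketch with illustrative values (water coolant assumed):

```python
# P = m_dot * c_p * dT: average power absorbed by the coolant, computed
# from the two measured quantities (flow rate and temperature change).
def rf_power_watts(flow_kg_per_s, delta_t_kelvin, specific_heat=4186.0):
    """specific_heat defaults to water, in J/(kg*K)."""
    return flow_kg_per_s * specific_heat * delta_t_kelvin

p = rf_power_watts(0.25, 4.0)  # 0.25 kg/s of coolant warming by 4 K
```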

  15. Automatic Tracking Evaluation and Development System (ATEDS)

    Data.gov (United States)

    Federal Laboratory Consortium — The heart of the ATEDS network consists of four SGI Octane computers running the IRIX operating system and equipped with V12 hardware graphics to support synthetic...

  16. 06091 Abstracts Collection -- Data Structures

    OpenAIRE

    Arge, Lars; Sedgewick, Robert; Wagner, Dorothea

    2006-01-01

    From 26.02.06 to 03.03.06, the Dagstuhl Seminar 06091 ``Data Structures'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goa...

  17. Collective Classification in Network Data

    OpenAIRE

    Sen, Prithviraj; Namata, Galileo; Bilgic, Mustafa; Getoor, Lise; University of Maryland; Galligher, Brian; Eliassi-Rad, Tina

    2008-01-01

    Many real-world applications produce networked data such as the world-wide web (hypertext documents connected via hyperlinks), social networks (for example, people connected by friendship links), communication networks (computers connected via communication links) and biological networks (for example, protein interaction networks). A recent focus in machine learning research has been to extend traditional machine learning classification techniques to classify nodes in such networks. In this a...

  18. Examination techniques of the automatics fire detection monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Yon Woo [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-04-01

    A variety of automatic fire detection monitoring systems has been developed as multistory buildings have been constructed using various structural materials. To stop the spread of a fire and minimize damage to human life and facility property, a fire must be reported precisely to all members of the facility. (author). 12 refs., 28 figs.

  19. Simulation of the TREAT-Upgrade Automatic Reactor Control System

    International Nuclear Information System (INIS)

    This paper describes the design of the Automatic Reactor Control System (ARCS) for the Transient Reactor Test Facility (TREAT) Upgrade. A simulation was used to facilitate the ARCS design and to completely test and verify its operation before installation at the TREAT facility

  20. Liquid scintillation counting system with automatic gain correction

    International Nuclear Information System (INIS)

    An automatic liquid scintillation counting apparatus is described, including a scintillating medium in the elevator ram of the sample-changing apparatus. An appropriate radiation source, which may be the external source used for standardizing samples, produces reference scintillations in the scintillating medium which may be used to correct the gain of the counting system
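
    One common way such a reference scintillation is used, sketched here as an assumption rather than the patent's specific method, is to rescale the system gain so the reference peak stays at its nominal position:

```python
# Hypothetical gain correction: the reference source's peak is measured
# between samples, and a multiplicative correction restores it to the
# nominal channel, compensating for drift in the counting chain.
def gain_correction(nominal_peak_channel, measured_peak_channel):
    return nominal_peak_channel / measured_peak_channel

def corrected_channel(channel, factor):
    return channel * factor

factor = gain_correction(500, 480)  # gain has drifted low by ~4%
```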

  1. Automatic surveillance system using fish-eye lens camera

    Institute of Scientific and Technical Information of China (English)

    Xue Yuan; Yongduan Song; Xueye Wei

    2011-01-01

    This letter presents an automatic surveillance system using a fish-eye lens camera. Our system achieves wide-area automatic surveillance without a dead angle using only one camera. We propose a new human detection method that selects the most adaptive classifier based on the locations of the human candidates. Human regions are detected effectively from the fish-eye image and are corrected for perspective distortion. An experiment is performed on indoor video sequences under different illumination and crowding conditions, with results demonstrating the efficiency of our algorithm.

  2. Building an Image-Based System to automatically Score psoriasis

    DEFF Research Database (Denmark)

    Gómez, D. Delgado; Carstensen, Jens Michael; Ersbøll, Bjarne Kjær

    2003-01-01

    images. The system is tested on patients with the dermatological disease psoriasis. Temporal series of images are taken for each patient and the lesions are automatically extracted. Results indicate that the images obtained are a good source of derived variables for tracking the lesion....

  3. Evaluation of automatic exposure control systems in computed tomography

    International Nuclear Information System (INIS)

    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method for seeing the human body in detail. As CT application increases, so does concern about patient dose, because of the higher dose levels imparted compared to other diagnostic imaging modalities. The radiology community (radiologists, medical physicists and manufacturers) is working to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance for lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to modulate the dose distribution along the patient scan and between patients, taking into account patient size and irradiated tissue densities. Because of the CT scanning geometry, AEC-systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC-systems and their susceptibilities, to assist in possible patient dose optimization. The approach used to evaluate the AEC-systems of three of the leading CT manufacturers in Brazil (General Electric, Philips and Toshiba) was the extraction of tube current modulation data from DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, characterizes the performance of the AEC-system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC-system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations of AEC-system applications and, in consequence, decrease the patient dose without
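    As a hedged illustration of the tube-current-extraction step described above: per-slice tube current is carried in the DICOM X-Ray Tube Current attribute (tag (0018,1151)) and can be read with a library such as pydicom; here the values are simulated and only summarized.

```python
import statistics

def modulation_profile(tube_currents_ma):
    """Summarize the longitudinal tube-current modulation of a CT
    series (one mA value per reconstructed slice)."""
    mean_ma = statistics.mean(tube_currents_ma)
    lo, hi = min(tube_currents_ma), max(tube_currents_ma)
    return {
        "mean_ma": mean_ma,
        "min_ma": lo,
        "max_ma": hi,
        # swing relative to the mean: how strongly the AEC modulates
        "modulation_index": (hi - lo) / mean_ma,
    }

# Simulated per-slice mA: an AEC lowers the current through the lungs
# and raises it through the shoulders and pelvis.
currents = [220, 180, 120, 100, 110, 160, 240]
profile = modulation_profile(currents)
```

    Comparing such profiles with AEC ON versus OFF, alongside measured image noise, is the kind of analysis the work describes.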

  4. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and based on the information derived perform analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
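    The two-stage pattern described, per-timestep feature detection followed by integration across timesteps, can be sketched as follows (toy particle energies; not the authors' actual detection criteria):

```python
def detect_high_energy(particles, threshold):
    """Per-timestep feature detection: ids of particles whose energy
    meets the threshold."""
    return {pid for pid, energy in particles.items() if energy >= threshold}

def trace_beam(timesteps, threshold, min_steps=2):
    """Cross-timestep integration: count a particle as part of a beam
    if it stays above threshold in at least `min_steps` snapshots."""
    counts = {}
    for snapshot in timesteps:
        for pid in detect_high_energy(snapshot, threshold):
            counts[pid] = counts.get(pid, 0) + 1
    return {pid for pid, n in counts.items() if n >= min_steps}

# Toy snapshots keyed by particle id: particle 1 stays energetic,
# particle 3 is energetic intermittently, particle 2 never is.
steps = [{1: 5.0, 2: 0.5, 3: 4.0},
         {1: 6.0, 2: 0.7, 3: 1.0},
         {1: 7.0, 2: 0.6, 3: 5.0}]
beam = trace_beam(steps, threshold=3.0)
```

    The real pipeline additionally analyzes the temporal dynamics of the detected features, but the detect-then-integrate structure is the same.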

  5. Automatic Registration of Multi-Source Data Using Mutual Information

    Science.gov (United States)

    Parmehr, E. G.; Zhang, C.; Fraser, C. S.

    2012-07-01

    Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods including a Gaussian Mixture Model (GMM) and Kernel Density Estimation have been used to define the weight function of joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate parameters of GMM, and in order to reduce the cost of computation, a multi-resolution strategy has been used. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated and the results obtained are presented.
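    Plain mutual information (before the weighting the paper proposes) is computed from the joint intensity histogram; a minimal sketch on 1-D intensity lists (a registration search would evaluate this score over candidate transforms and keep the maximum):

```python
import math
from collections import Counter

def mutual_information(img_a, img_b):
    """MI between two equally sized (quantized) intensity arrays,
    estimated from the joint histogram of intensity pairs."""
    assert len(img_a) == len(img_b)
    n = len(img_a)
    joint = Counter(zip(img_a, img_b))  # joint histogram
    pa, pb = Counter(img_a), Counter(img_b)  # marginal histograms
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # p_ab / (p_a * p_b) simplifies to c * n / (pa[a] * pb[b])
        mi += p_ab * math.log(p_ab * n * n / (pa[a] * pb[b]))
    return mi

mi_same = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])   # = ln 2
mi_indep = mutual_information([0, 0, 1, 1], [0, 1, 0, 1])  # = 0
```

    Perfectly aligned identical images give the full marginal entropy (ln 2 here); statistically independent intensities give zero, which is why MI peaks at correct alignment.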

  6. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2012-07-01

    Full Text Available Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods including a Gaussian Mixture Model (GMM) and Kernel Density Estimation have been used to define the weight function of joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate parameters of GMM, and in order to reduce the cost of computation, a multi-resolution strategy has been used. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated and the results obtained are presented.

  7. Requirements to a Norwegian National Automatic Gamma Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Lauritzen, B.; Hedemann Jensen, P.; Nielsen, F

    2005-04-01

    An assessment of the overall requirements to a Norwegian gamma-monitoring network is undertaken with special emphasis on the geographical distribution of automatic gamma monitoring stations, type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate increments above the natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume may not slip unnoticed in between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require protective intervention measures should be detected by the system. Action levels for possible introduction of sheltering and precautionary foodstuff restrictions are derived in terms of ambient dose equivalent rate. For emergency situations where particulates contribute with only a small fraction of the total ambient dose equivalent rate from the plume, it is concluded that measurements of dose rate are sufficient to determine the need for sheltering; simple dose rate measurements however, are inadequate to determine the need for foodstuff restrictions and spectral measurements are required. (au)
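    The detection condition described, an ambient dose equivalent rate increment above natural background exceeding a derived action level, can be sketched as follows; the numeric levels are purely illustrative, not the report's derived values:

```python
def stations_needing_intervention(dose_rates_usv_h, background_usv_h=0.1,
                                  action_level_usv_h=100.0):
    """Indices of monitoring stations whose ambient dose equivalent
    rate exceeds the action level above natural background."""
    return [i for i, rate in enumerate(dose_rates_usv_h)
            if rate - background_usv_h >= action_level_usv_h]

# Four stations; only station 2 exceeds the (illustrative) sheltering level.
readings = [0.11, 0.12, 150.0, 80.0]
alarms = stations_needing_intervention(readings)
```

    As the abstract notes, such a dose-rate test can trigger sheltering decisions, but foodstuff restrictions require spectral measurements rather than a simple rate threshold.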

  8. Requirements to a Norwegian National Automatic Gamma Monitoring System

    International Nuclear Information System (INIS)

    An assessment of the overall requirements to a Norwegian gamma-monitoring network is undertaken with special emphasis on the geographical distribution of automatic gamma monitoring stations, type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate increments above the natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume may not slip unnoticed in between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require protective intervention measures should be detected by the system. Action levels for possible introduction of sheltering and precautionary foodstuff restrictions are derived in terms of ambient dose equivalent rate. For emergency situations where particulates contribute with only a small fraction of the total ambient dose equivalent rate from the plume, it is concluded that measurements of dose rate are sufficient to determine the need for sheltering; simple dose rate measurements however, are inadequate to determine the need for foodstuff restrictions and spectral measurements are required. (au)

  9. Automatic Discharge system from dewatering bin; Sekitantaki boira niokeru dassuiso karano kurinka haraidashi shisutemu no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Iwasaki, Atsushi; Kawakami, Masamichi; Ito, Takayoshi [Chugoku Electric Powers, Co., Inc., Hiroshima (Japan); Kinoshita, Tetsuhiro; Enomoto, Masayuki [Kawasaki Heavy Industries, Ltd., Hyogo (Japan)

    1999-03-15

    At present, discharge of clinker ash from dewatering bins is performed manually: an operator near the mesh conveyor, relying on past experience, observes the condition of the discharged ash on the conveyor. We have valuable data relevant to automatic operation from sensor signals (current of the mesh conveyor motor, moisture of the clinker ash, image processing data, open ratio of the ash discharge gate, etc.). We studied the relation between the clinker ash condition and actual operation. Using these data, we were able to construct the [Automatic discharge system from dewatering bin]. (author)

  10. Intelligent automatic overtaking system using vision for vehicle detection

    OpenAIRE

    Milanés Montero, Vicente; Fernández Llorca, David; Villagra Serrano, Jorge; Pérez, Joshué; Fernández López, Carlos; Parra Alonso, Ignacio; González Fernández-Vallejo, Carlos; Sotelo, Miguel Ángel

    2012-01-01

    There is clear evidence that investment in intelligent transportation system technologies brings major social and economic benefits. Technological advances in the area of automatic systems in particular are becoming vital for the reduction of road deaths. We here describe our approach to automation of one of the riskiest autonomous manœuvres involving vehicles: overtaking. The approach is based on a stereo vision system responsible for detecting any preceding vehicle and triggering the autonomo...

  11. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA

    Directory of Open Access Journals (Sweden)

    Veena G.S

    2013-12-01

    Full Text Available The proposed work aims to create a smart application camera, with the intention of eliminating the need for a human presence to detect any unwanted sinister activities, such as theft in this case. Spread across the campus, at arbitrary locations, are certain valuable biometric identification systems. The application monitors these systems (hereafter referred to as “objects”) using our smart camera system based on the OpenCV platform. Using OpenCV Haar training, employing the Viola-Jones algorithm implementation in OpenCV, we teach the machine to identify the object under varying environmental conditions. An added feature of face recognition is based on Principal Component Analysis (PCA) to generate eigenfaces; test images are verified against the eigenfaces using a distance-based algorithm, such as the Euclidean or Mahalanobis distance. If the object is misplaced, or an unauthorized user is in the immediate vicinity of the object, an alarm signal is raised.
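    The distance-based verification step can be sketched as follows; the 2-component "eigenface projections" are invented toy vectors and the threshold is arbitrary (a real system would project each probe image onto the learned eigenfaces first):

```python
import math

def euclidean(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify(probe, gallery, threshold):
    """Return the enrolled identity whose eigenface projection is
    closest to the probe, or None (raise the alarm) if even the best
    match is farther than the threshold."""
    best_id, best_d = None, float("inf")
    for person, vec in gallery.items():
        d = euclidean(probe, vec)
        if d < best_d:
            best_id, best_d = person, d
    return best_id if best_d <= threshold else None

# Invented 2-component projections of two enrolled users.
gallery = {"alice": [0.9, 0.1], "bob": [0.2, 0.8]}
match = identify([0.85, 0.15], gallery, threshold=0.5)   # "alice"
intruder = identify([5.0, 5.0], gallery, threshold=0.5)  # None
```

    Swapping `euclidean` for a Mahalanobis distance only changes the metric; the nearest-neighbor-with-rejection structure stays the same.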

  12. Study on traffic accidents mechanism with automatic recording systems. Part 2. Application of data from ADR and DMR for practical driver education; Jidosha kiroku sochi ni yoru kotsu jiko hassei mechanism no kenkyu. 2. Jiko data kirokukei (ADR) to unko kirokukei (DMR) no untensha kyoiku eno katsuyo

    Energy Technology Data Exchange (ETDEWEB)

    Ueyama, M.; Ogawa, S. [National Research Inst. of Police Science, Tokyo (Japan); Chikasue, H.; Muramatsu, K. [Yazaki Meter Co. Ltd., Tokyo (Japan)

    1997-10-01

    A field trial was carried out using automatic recording systems, an ADR (Accident Data Recorder) and a DMR (Driving Monitoring Recorder), installed on 20 commercial vehicles, in order to assess the implications for driver behavior and accidents. The data suggest that the accident mechanism can be explained in terms of situation-specific factors and the behavior of drivers just before the accident, that is, their attitude to the handling and control of their vehicles. The data might offer new information for practical driver education. 3 refs., 9 figs., 1 tab.

  13. Towards the development of Hyperspectral Images of trench walls. Robotrench: Automatic Data acquisition

    Science.gov (United States)

    Ragona, D. E.; Minster, B.; Rockwell, T. K.; Fialko, Y.; Bloom, R. G.; Hemlinger, M.

    2004-12-01

    Previous studies on imaging spectrometry of paleoseismological excavations (Ragona et al., 2003, 2004) showed that low-resolution hyperspectral imagery of a trench wall, processed with a supervised classification algorithm, provided more stratigraphic information than a high-resolution digital photograph of the same exposure. Although the low-resolution images depicted the most important variations, a higher resolution hyperspectral image is necessary to assist in the recognition and documentation of paleoseismic events. Because our spectroradiometer can only acquire one pixel at a time, creating a 25 psi image of a 1 x 1 m area of a trench wall requires 40,000 individual measurements. To ease this extensive task, we designed and built a device that can automatically position the spectroradiometer probe along the x-z plane of a trench wall. This device, informally named Robotrench, has two 7-foot-long axes of motion (horizontal and vertical) commanded by a stepper motor controller board and a laptop computer. A platform holds the spectroradiometer probe and the calibrated illumination system. A small circuit provides the interface between the Robotrench motion and the spectroradiometer data collection. At its best, the Robotrench-spectroradiometer pair can automatically record 1500-2000 pixels/hour, making the image acquisition process slow but feasible. At the time of this abstract submission, only a small calibration experiment had been completed. This experiment was designed to calibrate the X-Z axes and to test the instrument performance. We measured a 20 x 10 cm brick wall at 25 psi resolution. Three reference marks were set on the trench wall as control points for the image registration process. The experiment was conducted at night under artificial light (stabilized 2 x 50 W halogen lamps). The data obtained were processed with the Spectral Angle Mapper algorithm. The image recovered from the data showed an
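    The point-by-point acquisition described amounts to raster-scanning the probe across the x-z plane; a hedged sketch of the position sequence and the implied acquisition time (the serpentine ordering is an assumption, not stated in the abstract):

```python
def raster_scan(width_px, height_px):
    """Yield (x, z) probe positions in a serpentine raster so the
    stage never makes a long return sweep between rows."""
    for z in range(height_px):
        row = range(width_px) if z % 2 == 0 else reversed(range(width_px))
        for x in row:
            yield x, z

positions = list(raster_scan(4, 3))       # tiny 4 x 3 grid
hours = len(positions) / 1750             # midpoint of 1500-2000 px/hour
```

    At the quoted 1500-2000 pixels/hour, the full 40,000 measurements for a 1 x 1 m wall work out to roughly a day of acquisition, consistent with "slow but feasible".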

  14. Automatic Erection System for Antenna Masts

    Science.gov (United States)

    Dotson, R. D.; Jacquemin, G. G.

    1985-01-01

    A telescoping mast does not require the payout of guy wires under tension. The erection system is built into the stack of telescoping mast elements and is thereby protected from the weather. Although the concept is based on a telescoping tube mast, it is also applicable to an open truss with only minor modifications.

  15. Automatic speed management systems : great safety potential ?

    NARCIS (Netherlands)

    Oei, H.-l.

    1992-01-01

    An account is given of speed management experiments carried out in The Netherlands on four 2-lane rural roads with a speed limit of 80 km/h. The experiment involved an information campaign, warning signs and a radar camera system. Fixed signs advised a speed of between 60 and 80 km/h and an automati

  16. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas.

    Science.gov (United States)

    Chang, Hsien-Tsung; Chang, Yi-Ming; Tsai, Meng-Tze

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning. PMID:26839529
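    The abstract does not disclose ATIPS's actual algorithm; as a hedged illustration of automatic itinerary generation, a greedy planner that fills the travel-time budget with the highest-preference attractions first (all names, scores and durations are invented):

```python
def plan_itinerary(candidates, time_budget_h):
    """Greedy sketch: repeatedly add the attraction with the highest
    preference score that still fits the remaining travel time.

    candidates: list of (name, preference_score, duration_hours).
    """
    itinerary, remaining = [], time_budget_h
    for name, score, hours in sorted(candidates, key=lambda c: -c[1]):
        if hours <= remaining:
            itinerary.append(name)
            remaining -= hours
    return itinerary

spots = [("night market", 9, 3), ("museum", 7, 4), ("hot spring", 8, 5)]
plan = plan_itinerary(spots, time_budget_h=8)
```

    ATIPS's framework for "substituting modules or weights" corresponds here to swapping the scoring key: learned user preferences would replace the fixed scores.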

  17. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas

    Directory of Open Access Journals (Sweden)

    Hsien-Tsung Chang

    2016-01-01

    Full Text Available Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning.

  18. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of the engineering control systems for numeric-code automatic blocking, identify their advantages and disadvantages, analyze the possibility of their use in diagnosing the status of automatic blocking devices, and set targets for the development of new diagnostic systems. Methodology. To achieve these objectives, the theoretical-analytical method and the method of functional analysis were used. Findings. The analysis of existing and prospective facilities for remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative and are designed primarily to monitor discrete parameters, which in turn does not allow the construction of a decision-support subsystem. In developing new technical diagnostic systems, it is proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on blocking devices and reduce recovery time after a failure occurs. Originality. Currently existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of signaling and blocking devices. Criteria for the development of new technical diagnostic systems with increased amounts of diagnostic information and automatic analysis of that information are proposed. Practical value. The results of this analysis can be used in practice to select technical controls for automatic blocking devices, as well as in the further development of diagnostic systems for automatic blocking, allowing a gradual transition from a planned preventive maintenance model to service based on the actual state of the monitored devices.

  19. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  20. The analysis and expansion of regulatory binding site data in a wide range of bacteria through the use of a semi-automatic system - RegTransBase

    OpenAIRE

    Cipriano, Michael J.

    2014-01-01

    RegTransBase, a database describing regulatory interactions in prokaryotes, has been developed as a component of the MicrobesOnline/RegTransBase framework successfully used for interpretation of microbial stress response and metal reduction pathways. It is manually curated and based on published scientific literature. RegTransBase describes a large number of regulatory interactions and contains experimental data which investigates regulation with known elements. It is available at http://reg...

  1. Automatic Lameness Detection in a Milking Robot : Instrumentation, measurement software, algorithms for data analysis and a neural network model

    OpenAIRE

    Pastell, Matti

    2007-01-01

    The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feedi...

  2. Component fragilities. Data collection, analysis and interpretation

    International Nuclear Information System (INIS)

    As part of the component fragility research program sponsored by the US NRC, BNL is involved in establishing seismic fragility levels for various nuclear power plant equipment with emphasis on electrical equipment. To date, BNL has reviewed approximately seventy test reports to collect fragility or high level test data for switchgears, motor control centers and similar electrical cabinets, valve actuators and numerous electrical and control devices, e.g., switches, transmitters, potentiometers, indicators, relays, etc., of various manufacturers and models. BNL has also obtained test data from EPRI/ANCO. Analysis of the collected data reveals that fragility levels can best be described by a group of curves corresponding to various failure modes. The lower bound curve indicates the initiation of malfunctioning or structural damage, whereas the upper bound curve corresponds to overall failure of the equipment based on known failure modes occurring separately or interactively. For some components, the upper and lower bound fragility levels are observed to vary appreciably depending upon the manufacturers and models. For some devices, testing even at the shake table vibration limit does not exhibit any failure. Failure of a relay is observed to be a frequent cause of failure of an electrical panel or a system. An extensive amount of additional fragility or high level test data exists
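    Fragility curves of the kind described are commonly modeled as lognormal in seismic probabilistic risk assessment; a minimal sketch with invented parameters (the report's actual lower/upper-bound curves are empirical, derived from test data):

```python
import math

def fragility(accel_g, median_g, beta):
    """Lognormal fragility: probability of failure at peak acceleration
    `accel_g`, given median capacity `median_g` and logarithmic
    standard deviation `beta` (standard normal CDF via erf)."""
    z = math.log(accel_g / median_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative lower-bound (onset of malfunction) vs upper-bound
# (overall failure) curves evaluated at 2 g.
p_malfunction = fragility(2.0, median_g=2.0, beta=0.4)  # 0.5 at the median
p_failure = fragility(2.0, median_g=4.0, beta=0.4)      # well below 0.5
```

    Plotting both curves over acceleration reproduces the band the abstract describes: malfunction initiates at lower levels than overall failure.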

  3. AUTOMATIC CONTROL SYSTEM FOR TORQUE NATIONAL STANDARD

    OpenAIRE

    J. Galván-Mancilla; Torres-Guzmán, J C

    2004-01-01

    The continuous development of technology and the increase of its complexity demand wider measurement intervals, greater exactness and a greater diversity of the standards used to establish the units or measuring systems. Torque metrology is of great importance, and the magnitude is in common use in industry, technical development and research. The realization, quantification and dissemination of this magnitude are tasks assigned to the Metrology National Center (CENAM) Torque Laborat...

  4. Data collection system of greenhouse crops based on micro automated guided vehicle

    Institute of Scientific and Technical Information of China (English)

    王立舒; 丁晓成; 时启凡

    2014-01-01

    orientation unit, was used to automatically navigate and pinpoint the location of samples. An S3C6410 chip, a common 16/32-bit RISC processor developed by Samsung based on the ARM1176JZF-S core, was used as the core processor of the control unit in the micro AGV and met the data processing requirements. An ASLONG GA20Y180 micro direct-current motor was used as the drive of the action unit, and motor control was achieved with an L293D-based control module. Optical guided navigation was used for the guiding unit, which achieved reliable navigation through two micro AGV navigation modules. Through two methods, RFID and optical recognition, the orientation unit achieved targeting and accurate positioning of the micro AGV during movement. The VDAS, made up of image and environment data acquisition units as well as a data processing unit, was used to collect sample images and environmental data (humidity and temperature, carbon dioxide intensity, illumination intensity) and then to process and store the collected data. The communication and control system, made up of a vehicle communication unit and control software on a remote control computer, was used to realize long-distance transmission and control. When collecting sample data, the control software sent orders and the micro AGV carrying the VDAS began to collect images and environmental parameters along the planned route. In order to validate the accuracy and stability of the DCS, taking potted soybean as the sample, experiments on image and environmental data acquisition were carried out. The images obtained from the DCS were evenly of good quality, meeting the requirements of later image processing. Besides, the errors between the automatically collected environmental data and manual measurements were around 2%, which met the precision standards of data acquisition. The DCS operated stably during the experiments, and no deviation from the planned route occurred.
The error of orientation was

  5. Automatic Outdoor Monitoring System for Photovoltaic Panels

    Energy Technology Data Exchange (ETDEWEB)

    Stefancich, Marco; Simpson, Lin; Chiesa, Matteo

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
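    The maximum-power-point tracking the instrument performs can be sketched with the classic perturb-and-observe algorithm (the abstract does not say which tracking approach the instrument employs, and the PV power curve here is a toy quadratic with its peak near 16.7 V):

```python
def pv_power(v):
    """Toy PV power curve with a single maximum (illustrative only)."""
    return max(0.0, v * (3.0 - 0.09 * v))

def perturb_and_observe(v=10.0, step=0.5, iterations=100):
    """Perturb & observe MPPT: nudge the operating voltage and keep
    moving in whichever direction power increased."""
    p = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p_new = pv_power(v)
        if p_new < p:
            direction = -direction  # overshot the peak; reverse
        p = p_new
    return v, p

v_mpp, p_mpp = perturb_and_observe()
```

    The tracker ends up oscillating around the true maximum power point, which is why the instrument's periodic current-voltage curves are useful: they verify that the steady-state operating point really sits at the peak.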

  6. Automatic outdoor monitoring system for photovoltaic panels

    Science.gov (United States)

    Stefancich, Marco; Simpson, Lin; Chiesa, Matteo

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.

  7. A Computerized Data-Base System for Land-Use and Land-Cover Data Collected at Ground-Water Sampling Sites in the Pilot National Water Quality Assessment Program

    Science.gov (United States)

    Scott, Jonathon C.

    1989-01-01

    Data-base software has been developed for the management of land-use and land-cover data collected by the U.S. Geological Survey as part of a pilot program to test and refine concepts for a National Water-Quality Assessment Program. This report describes the purpose, use, and design of the land-use and land-cover data-base software. The software provides capabilities for interactive storage and retrieval of land-use and land-cover data collected at ground-water sampling sites. Users of the software can add, update, and delete land-use and land-cover data. The software also provides capabilities to group, print, and summarize the data. The land-use and land-cover data-base software supports multiple data-base systems so that data can be accessed by persons in different offices. Data-base systems are organized in a tiered structure. Each data-base system contains all the data stored in the data-base systems located in the lower tiers of the structure. Data can be readily transmitted from lower tiers to higher tiers of the structure. Therefore, the data-base system at the highest tier of the structure contains land-use and land-cover data for the entire pilot program.

  8. Railway Automatic Ticketing and Gate Monitoring System based on big data analysis

    Institute of Scientific and Technical Information of China (English)

    王成; 史天运

    2015-01-01

    This article proposes a general framework for a Railway Automatic Ticketing and Gate Monitoring System (RATGS). The system consists of four layers: an infrastructure layer, a management layer, an analysis layer and an application layer. It introduces technologies such as multidimensional data analysis, distributed file system storage with MapReduce computation, complex event processing (CEP) and data mining to implement value-added services based on passenger behaviour analysis, such as fault early warning, failure-rate analysis, equipment utilisation analysis, business optimisation analysis, OD hotspot analysis, abnormal passenger recognition and equipment usability analysis. These point to a new direction for the future development of RATGS.

  9. NW-MILO Acoustic Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Matzner, Shari; Myers, Joshua R.; Maxwell, Adam R.; Jones, Mark E.

    2010-02-17

    There is an enduring requirement to improve our ability to detect potential threats and discriminate these from the legitimate commercial and recreational activity ongoing in the nearshore/littoral portion of the maritime domain. The Northwest Maritime Information and Littoral Operations (NW-MILO) Program at PNNL’s Coastal Security Institute in Sequim, Washington is establishing a methodology to detect and classify these threats - in part through developing a better understanding of acoustic signatures in a near-shore environment. The purpose of the acoustic data collection described here is to investigate the acoustic signatures of small vessels. The data is being recorded continuously, 24 hours a day, along with radar track data and imagery. The recording began in August 2008, and to date the data contains tens of thousands of signals from small vessels recorded in a variety of environmental conditions. The quantity and variety of this data collection, with the supporting imagery and radar track data, makes it particularly useful for the development of robust acoustic signature models and advanced algorithms for signal classification and information extraction. The underwater acoustic sensing system is part of a multi-modal sensing system that is operating near the mouth of Sequim Bay. Sequim Bay opens onto the Strait of Juan de Fuca, which contains part of the border between the U.S. and Canada. Table 1 lists the specific components used for the NW-MILO system. The acoustic sensor is a hydrophone permanently deployed at a mean depth of about 3 meters. In addition to a hydrophone, the other sensors in the system are a marine radar, an electro-optical (EO) camera and an infra-red (IR) camera. The radar is integrated with a vessel tracking system (VTS) that provides position, speed and heading information. The data from all the sensors is recorded and saved to a central server. The data has been validated in terms of its usability for characterizing the

  10. Robust parameter design for automatically controlled systems and nanostructure synthesis

    Science.gov (United States)

    Dasgupta, Tirthankar

    2007-12-01

    This research develops comprehensive frameworks for robust parameter design of dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, so an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. The second part of the research deals with the development of an experimental design methodology, tailor
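
    The robustness idea above, scoring process set-points by their expected success probability under set-point jitter, can be sketched with Monte-Carlo simulation. The probability model `p_nanowire` and all its constants are hypothetical stand-ins for the paper's fitted multinomial GLM.

```python
import math
import random

def p_nanowire(temp, pressure):
    """Hypothetical fitted model: probability of obtaining the desired
    morphology as a smooth function of two process variables."""
    z = -((temp - 850.0) / 40.0) ** 2 - ((pressure - 2.0) / 0.5) ** 2
    return math.exp(z)

def robust_score(temp, pressure, sigma_t=15.0, sigma_p=0.2, n=2000, seed=1):
    """Monte-Carlo estimate of the expected success probability when the
    actual conditions jitter around the chosen set-point."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        t = rng.gauss(temp, sigma_t)
        p = rng.gauss(pressure, sigma_p)
        total += p_nanowire(t, p)
    return total / n

on_target = robust_score(850.0, 2.0)    # set-point at the model optimum
off_target = robust_score(900.0, 2.5)   # a less robust operating point
```

    A set-point is preferred not merely because its nominal probability is high, but because the score stays high when the process wanders, which is exactly the comparison the two calls above make.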

  11. An automatic measuring method and system using a light curtain for the thread profile of a ballscrew

    International Nuclear Information System (INIS)

    An automatic non-contact measuring system for the thread profile of a ballscrew was developed and integrated using a light curtain, a high-accuracy linear encoder and a motion platform for the measuring stand. Firstly, data points from the thread profile of a ballscrew were collected by the measuring system and data partitioning was performed. Then, the proposed method was used to calculate the most important geometric errors of the thread profile, such as effective diameter, thread pitch, ball track runout and ball track cross-section error. Finally, the proposed system and method were verified to perform well, with acceptable accuracy and reliability. Compared with conventional measuring methods, the non-contact optical measuring system is capable of measuring most common features of the ballscrew with the required accuracy and is not limited by the size and length of the ballscrew.
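
    One of the geometric errors named above, thread pitch, can be estimated from partitioned profile data in a very simple way: take the mean spacing of successive thread-crest positions along the screw axis. This is a generic sketch, not the paper's actual algorithm; the crest positions below are simulated.

```python
def estimate_pitch(crest_positions):
    """Estimate thread pitch (mm) as the mean spacing between successive
    thread-crest positions measured along the screw axis."""
    if len(crest_positions) < 2:
        raise ValueError("need at least two crests")
    xs = sorted(crest_positions)
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    return sum(gaps) / len(gaps)

# Simulated axial crest positions for a nominal 5 mm pitch, with noise
crests = [0.00, 5.01, 9.99, 15.02, 19.98, 25.00]
pitch = estimate_pitch(crests)
```

    Averaging the gaps (equivalently, last minus first crest divided by the gap count) suppresses the per-crest measurement noise that a single-gap estimate would pass straight through.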

  12. Collecting battery data with Open Battery

    OpenAIRE

    Jones, Gareth L.; Harrison, Peter G.

    2012-01-01

    In this paper we present Open Battery, a tool for collecting data on mobile phone battery usage, describe the data we have collected so far and make some observations. We then introduce the fluid queue model which we hope may prove a useful tool in future work to describe mobile phone battery traces.
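
    The fluid-queue view the authors propose treats battery charge as a continuous level draining at a state-dependent rate. A minimal discretised sketch of that idea, with invented states and rates rather than anything measured by Open Battery:

```python
def simulate_battery(rates, schedule, capacity=100.0, level=100.0, dt=1.0):
    """Discretised fluid queue: the charge level moves at a net rate set
    by the phone's current state, clipped to [0, capacity].
    `rates` maps state -> net charge per time unit (negative = drain);
    `schedule` is a sequence of states, one per time step."""
    trace = [level]
    for state in schedule:
        level = min(capacity, max(0.0, level + rates[state] * dt))
        trace.append(level)
    return trace

# Hypothetical per-state rates, in percentage points per minute
rates = {"idle": -0.1, "screen_on": -1.0, "charging": +2.0}
schedule = ["screen_on"] * 30 + ["idle"] * 20 + ["charging"] * 10
trace = simulate_battery(rates, schedule)
```

    Fitting the per-state rates to recorded traces is the step where a data set like Open Battery's would come in.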

  13. The Transformed Civil Rights Data Collection (CRDC)

    Science.gov (United States)

    Office for Civil Rights, US Department of Education, 2012

    2012-01-01

    Since 1968, the Civil Rights Data Collection (CRDC) has collected data on key education and civil rights issues in our nation's public schools for use by the Department of Education's Office for Civil Rights (OCR), other Department offices, other federal agencies, and by policymakers and researchers outside of the Department. The CRDC has…

  14. 34 CFR 303.176 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.176 Section 303.176 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... Data collection. Each application must include procedures that meet the requirements in §...

  15. 24 CFR 901.100 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Data collection. 901.100 Section 901.100 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued... PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.100 Data collection. (a) Information on some of...

  16. 5 CFR 890.1307 - Data collection.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Data collection. 890.1307 Section 890.1307 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS... Program Demonstration Project § 890.1307 Data collection. Each carrier will compile, maintain, and...

  17. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    Science.gov (United States)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.
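
    The core of the cooperative step described above, using a seismic attribute to carve out subvolumes that receive their own prior conductivity, reduces to a thresholded labelling in its simplest form. The sketch below is a toy 2D version with invented values; the paper's method uses statistical analysis of attribute distributions in 3D, not a single fixed threshold.

```python
def build_prior_conductivity(attribute, threshold,
                             sigma_background=0.01, sigma_anomaly=0.1):
    """Toy seismic-guided prior: cells whose seismic attribute exceeds a
    chosen threshold form a subvolume assigned a distinct prior
    conductivity (S/m); everything else gets the background value."""
    return [[sigma_anomaly if v > threshold else sigma_background
             for v in row] for row in attribute]

# Invented 2x3 attribute slice (e.g. a reflectivity-derived quantity)
attribute = [
    [0.2, 0.9, 0.8],
    [0.1, 0.7, 0.2],
]
prior = build_prior_conductivity(attribute, threshold=0.6)
```

    In the full workflow the same labelling would also seed the covariance coefficients, so the MT inversion is free to move away from the prior inside a subvolume but reluctant to blur its boundaries.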

  18. Automatic actinometric system for diffuse radiation measurement

    Science.gov (United States)

    Litwiniuk, Agnieszka; Zajkowski, Maciej

    2015-09-01

    An actinometric station is used for measuring solar radiation. The results are helpful in determining the optimal position of solar panels relative to the Sun, especially today, when energy from the Sun and other alternative sources is becoming more and more popular. The Polish climate does not provide as much energy as the countries of southern Europe, but it is possible to increase the amount of energy produced by an appropriate arrangement of photovoltaic panels. This makes it possible to forecast the amount of energy produced and the cost-effectiveness and profitability of photovoltaic installations, which implies considerable development opportunities for domestic photovoltaic power plants. This article presents a description of an actinometric system for diffuse radiation measurement, equipped with a thermopile pyranometer, an AD620 amplifier, an ADS1110 A/D converter, an ATmega16 microcontroller, an SD card, a GPS module and an LCD screen.
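
    The signal chain named above (thermopile, instrumentation amplifier, ADC) implies a simple conversion from raw counts back to irradiance. The sketch below shows that arithmetic; every constant (reference voltage, gain, sensor sensitivity) is illustrative, not the station's actual calibration.

```python
def counts_to_irradiance(counts, vref=2.048, fullscale=32768,
                         gain=100.0, sensitivity_uv=10.0):
    """Convert a raw ADS1110-style ADC reading to irradiance (W/m^2).
    The thermopile output (assumed ~10 uV per W/m^2) is amplified by an
    AD620-style instrumentation amp (assumed gain 100) before sampling.
    All constants are illustrative, not a real calibration."""
    volts_at_adc = counts * vref / fullscale       # voltage seen by the ADC
    volts_at_sensor = volts_at_adc / gain          # undo the amplifier gain
    return volts_at_sensor * 1e6 / sensitivity_uv  # microvolts -> W/m^2

g = counts_to_irradiance(16000)
```

    With these example constants, 16000 counts corresponds to 1.0 V at the ADC, 10 mV at the thermopile, and hence 1000 W/m², roughly full midsummer sun.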

  19. Automatic system for driving probes of electron cyclotron

    International Nuclear Information System (INIS)

    The automatic system for driving the six probes used on the electron model of the ring cyclotron is described. The system allows probes to be moved one by one or simultaneously. Active forcing of the current switching in the phase windings is used in the step-motor driving scheme. Probes can be shifted from one radius to another both from the front panel of the driving device (autonomous mode) and from the computer.

  20. An automatic system for measuring road and tunnel lighting performance

    OpenAIRE

    Greffier, Florian; Charbonnier, Pierre; Tarel, Jean-Philippe; Boucher, Vincent; FOURNELA, Fabrice

    2015-01-01

    Various problems in different domains are related to the operation of the Human Visual System (HVS). This is notably the case for the driver's visual perception, and for road safety in general. That is why several standards for road equipment are directly derived from human visual abilities, especially in the design of road and tunnel lighting installations. This paper introduces an automatic system for measuring road and tunnel lighting performance. The proposed device is based on an em...

  1. The anemodata 1-IIE. Automatic system for wind data acquisition; El anemodata 1-IIE. Sistema automatico para la adquisicion de datos de viento

    Energy Technology Data Exchange (ETDEWEB)

    Borja, Marco Antonio; Parkman Cuellar, Pablo A. [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1986-12-31

    Wind is an inexhaustible energy source. Studying its behavior in order to develop research projects and apply new technologies for its maximum exploitation is one of the activities carried out at the Instituto de Investigaciones Electricas (IIE). As part of these activities, the Anemodata-1-IIE equipment was designed and built for the acquisition of wind velocity and direction data. The Anemodata-1-IIE is a result of the work that the Departamento de Fuentes no Convencionales (Non-Conventional Energy Sources Department) of the Energy Sources Division carries out on the development of electrical equipment for anemometry.

  2. Automatic counterfeit protection system code classification

    Science.gov (United States)

    Van Beusekom, Joost; Schreyer, Marco; Breuel, Thomas M.

    2010-01-01

    The wide availability of cheap, high-quality printing techniques makes document forgery a task within reach of most people with standard computer and printing hardware. To prevent the use of color laser printers or color copiers for counterfeiting, e.g. of money or other valuable documents, many of these machines print Counterfeit Protection System (CPS) codes on the page. These small yellow dots encode information about the specific printer and allow the questioned-document examiner, in cooperation with the manufacturers, to track down the printer that was used to generate the document. However, access to the methods for decoding the tracking-dot pattern is restricted. Exact decoding of a tracking pattern is often not necessary, as tracking the pattern down to the printer class may be enough. In this paper we present a method that detects which CPS pattern class was used in a given document, which can be used to identify the printer class that the document was printed on. Evaluation showed an accuracy of up to 91%.

  3. SeaBuoySoft – an On-line Automated Windows based Ocean Wave height Data Acquisition and Analysis System for Coastal Field’s Data Collection

    Directory of Open Access Journals (Sweden)

    P.H.Tarudkar

    2014-12-01

    Full Text Available Measurement of hydraulic parameters such as wave heights, for both research and practical purposes in coastal fields, is a critical and challenging but important task in ocean engineering. It underpins the design and development of hydraulic structures such as sea walls, breakwaters, oil jetties and fisheries harbours, as well as ship manoeuvring, embankments and berthing on jetties. This paper elucidates the development of the SeaBuoySoft online software system for coastal wave height data collection. The system is installed at the shore along with the associated hardware, a digital waverider receiver unit and a waverider buoy. The ocean wave height data transmitted by the waverider buoy, installed in the shallow/offshore waters of the sea, is received by the digital waverider receiver unit and interfaced to the SeaBuoySoft software. The design and development of the software system was carried out in-house at the Central Water and Power Research Station, Pune, India. The software has been developed as a Windows-based standalone version and is unique of its kind for the reception of real-time ocean wave height data; it takes care of local storage of wave height data for further analysis as and when required. The system acquires real-time ocean wave height data round the clock, requiring no operator intervention during the data acquisition process on site.
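
    A typical first analysis step on a stored wave-height record of the kind this system collects is the significant wave height H1/3, the mean of the highest third of observed heights. The record is not described in the abstract; the heights below are invented for illustration.

```python
def significant_wave_height(heights):
    """Significant wave height H1/3: the mean of the highest third of the
    recorded individual wave heights -- the statistic that wave buoys
    conventionally report."""
    if not heights:
        raise ValueError("empty record")
    ranked = sorted(heights, reverse=True)
    top = ranked[:max(1, len(ranked) // 3)]
    return sum(top) / len(top)

# Invented individual wave heights (metres) from one record burst
hs = significant_wave_height([0.5, 1.2, 0.8, 2.0, 1.5, 0.9, 1.1, 0.7, 1.8])
```

    H1/3 correlates well with the wave height a human observer reports visually, which is why it became the standard summary statistic.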

  4. Automatic "pipeline" analysis of 3-D MRI data for clinical trials: application to multiple sclerosis.

    Science.gov (United States)

    Zijdenbos, Alex P; Forghani, Reza; Evans, Alan C

    2002-10-01

    The quantitative analysis of magnetic resonance imaging (MRI) data has become increasingly important in both research and clinical studies of human brain development, function, and pathology. Inevitably, the role of quantitative image analysis in the evaluation of drug therapy will increase, driven in part by requirements imposed by regulatory agencies. However, the prohibitive length of time involved and the significant intra- and inter-rater variability of the measurements obtained from manual analysis of large MRI databases represent major obstacles to the wider application of quantitative MRI analysis. We have developed a fully automatic "pipeline" image analysis framework and have successfully applied it to a number of large-scale, multicenter studies (more than 1,000 MRI scans). This pipeline system is based on robust image processing algorithms, executed in a parallel, distributed fashion. This paper describes the application of this system to the automatic quantification of multiple sclerosis lesion load in MRI, in the context of a phase III clinical trial. The pipeline results were evaluated through an extensive validation study, revealing that the obtained lesion measurements are statistically indistinguishable from those obtained by trained human observers. Given that intra- and inter-rater measurement variability is eliminated by automatic analysis, this system enhances the ability to detect small treatment effects not readily detectable through conventional analysis techniques. While useful for clinical trial analysis in multiple sclerosis, this system holds widespread potential for applications in other neurological disorders, as well as for the study of neurobiology in general. PMID:12585710
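
    The pipeline pattern described above, a fixed sequence of processing stages applied independently to each scan and fanned out across workers, can be sketched in a few lines. The stage names and the toy "lesion load" rule are invented stand-ins, not the paper's actual algorithms.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stage functions standing in for the real image operations
def intensity_correction(scan):
    return {**scan, "corrected": True}

def register_to_atlas(scan):
    return {**scan, "registered": True}

def classify_lesions(scan):
    # Toy rule: count "voxels" above a threshold as the lesion load
    load = sum(1 for v in scan["voxels"] if v > 0.8)
    return {**scan, "lesion_load": load}

def run_pipeline(scan):
    """Every scan flows through the same fixed sequence of stages."""
    for stage in (intensity_correction, register_to_atlas, classify_lesions):
        scan = stage(scan)
    return scan

# Fake scans; real inputs would be MRI volumes loaded from disk
scans = [{"id": i, "voxels": [0.1 * i, 0.9, 0.5]} for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_pipeline, scans))
```

    Because each scan is processed independently, throughput scales with worker count, which is what makes thousand-scan trials tractable; per-scan determinism is also what removes the rater variability the abstract emphasises.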

  5. Proceedings of the workshop on reliability data collection

    International Nuclear Information System (INIS)

    The main purpose of the Workshop was to provide a forum for exchanging information and experience on Reliability Data Collection and analysis to support Living Probabilistic Safety Assessments (LPSA). The Workshop is divided into four sessions which titles are: Session 1: Reliability Data - Database Systems (3 papers), Session 2: Reliability Data Collection for PSA (5 papers), Session 3: NPP Data Collection (3 papers), Session 4: Reliability Data Assessment (Part 1: General - 2 papers; Part 2: CCF - 2 papers; Part 3: Reactor Protection Systems / External Event Data - 2 papers; Part 4: Human Errors - 2 papers)

  6. From Automatic to Adaptive Data Acquisition:- towards scientific sensornets

    OpenAIRE

    Chang, Marcus

    2009-01-01

    Sensornets have been used for ecological monitoring for the past decade, yet the main driving force behind these deployments is still computer scientists. The denser sampling and added modalities offered by sensornets could drive these fields in new directions, but not until the domain scientists become familiar with sensornets and use them as any other instrument in their toolbox. We explore three different directions in which sensornets can become easier to deploy, collect data of higher quality, and o...

  7. Automatic registration of multi-source medium resolution satellite data

    OpenAIRE

    L. Barazzetti; M. Gianinetto; M. Scaioni

    2014-01-01

    Multi-temporal and multi-source images gathered from satellite platforms are nowadays a fundamental source of information in several domains. One of the main challenges in the fusion of different data sets consists in the registration issue, i.e., the integration into the same framework of images collected with different spatial resolution and acquisition geometry. This paper presents a novel methodology to accomplish this task on the basis of a method that stands out from existing a...

  8. The use of the Global Positioning System for real-time data collecting during ecological aerial surveys in the Kruger National Park, South Africa

    Directory of Open Access Journals (Sweden)

    P.C. Viljoen

    1994-09-01

    Full Text Available The use of the Global Positioning System (GPS) for real-time data collecting during ecological aerial surveys (EAS) in the Kruger National Park (KNP) was investigated as an alternative to post-survey manual data capture. Results obtained during an aerial census of large herbivores and surface water distribution in the northern part of the KNP, using an onboard GPS connected to a palmtop computer, are discussed. This relatively inexpensive system proved to be highly efficient for real-time data capture, while additional information such as ground velocity and time can be recorded for every data point. Measured distances between a ground marker and fix points recorded during a flight (mean = 60.0 m) are considered to be well within the requirements of the EAS.
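
    The marker-to-fix distances quoted above are great-circle distances between GPS coordinates, which the standard haversine formula computes. The marker and fix coordinates below are invented for illustration; the formula itself is standard.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes,
    using a spherical Earth of mean radius 6371 km."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical ground marker and one recorded fix, roughly 60 m apart
marker = (-22.50000, 31.30000)
fix = (-22.50054, 31.30000)
d = haversine_m(*marker, *fix)
```

    At survey scales of tens of metres the spherical-Earth error is negligible, far below the GPS fix error itself.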

  9. A centralised remote data collection system using automated traps for managing and controlling the population of the Mediterranean (Ceratitis capitata) and olive (Dacus oleae) fruit flies

    Science.gov (United States)

    Philimis, Panayiotis; Psimolophitis, Elias; Hadjiyiannis, Stavros; Giusti, Alessandro; Perelló, Josep; Serrat, Albert; Avila, Pedro

    2013-08-01

    The present paper describes the development of a novel monitoring system (the e-FlyWatch system) for managing and controlling the population of two of the world's most destructive fruit pests, namely the olive fruit fly (Bactrocera oleae, Rossi - formerly Dacus oleae) and the Mediterranean fruit fly (Ceratitis capitata, also called medfly). The monitoring system consists of a) novel automated traps with optical and motion-detection modules for capturing the flies, b) local stations including a GSM/GPRS module, sensors, flash memory, battery, antenna etc., and c) a central station that collects, stores and publishes the results (i.e. insect population in each field, sensor data, possible error/alarm data) via web-based management software. The centralised data collection system also provides analysis and prediction models, end-user warning modules and historical analysis of infested areas. The e-FlyWatch system enables SME producers in the fruit, vegetable and olive sectors to improve their production and to reduce the amount of insecticides/pesticides used and, consequently, the labour cost for spraying activities and for trap inspection.

  10. ACRF Data Collection and Processing Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Macduff, M; Egan, D

    2004-12-01

    We present a description of the data flow from measurement to long-term archive. We also discuss data communications infrastructure. The data handling processes presented include collection, transfer, ingest, quality control, creation of Value-Added Products (VAP), and data archiving.
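
    The stages listed above (collection, ingest, quality control, VAP creation, archiving) can be sketched as a chain of small functions. The function names, the temperature example, and the QC limits are all invented for illustration, not ACRF's actual processing.

```python
def collect(raw_records):
    """Site data logger output: raw instrument readings."""
    return [dict(r) for r in raw_records]

def ingest(records):
    """Normalise units on arrival at the data centre (here, C -> K)."""
    for r in records:
        r["temp_k"] = r.pop("temp_c") + 273.15
    return records

def quality_control(records, lo=150.0, hi=350.0):
    """Flag physically implausible values rather than deleting them."""
    for r in records:
        r["qc_flag"] = "ok" if lo <= r["temp_k"] <= hi else "suspect"
    return records

def value_added_product(records):
    """A toy VAP: the mean over QC-passed samples."""
    good = [r["temp_k"] for r in records if r["qc_flag"] == "ok"]
    return sum(good) / len(good) if good else None

def archive(records, store):
    """Long-term archive keeps everything, flags included."""
    store.extend(records)
    return store

store = []
records = quality_control(ingest(collect([{"temp_c": 21.0},
                                          {"temp_c": 999.0}])))
daily_mean = value_added_product(records)
archive(records, store)
```

    Note the design choice of flagging rather than dropping bad samples: the archive stays complete, and downstream products decide for themselves which flags to honour.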

  11. Learning Semantic Concepts from Noisy Media Collection for Automatic Image Annotation

    Institute of Scientific and Technical Information of China (English)

    TIAN Feng; SHEN Xukun

    2015-01-01

    Along with the explosive growth of images, automatic image annotation has attracted great interest from various research communities. However, despite the great progress achieved in the past two decades, automatic annotation is still an important open problem in computer vision, and can hardly achieve satisfactory performance in real-world environments. In this paper, we address the problem of annotation when noise is interfering with the dataset. A semantic neighborhood learning model on noisy media collections is proposed. Missing labels are replenished, and a semantically balanced neighborhood is constructed. The model allows the integration of multiple-label metric learning and local nonnegative sparse coding. We construct a semantically consistent neighborhood for each sample, so that corresponding neighbors have higher global similarity, partial correlation and conceptual similarity along with semantic balance. Meanwhile, an iterative denoising method is also proposed. The proposed method shows a marked improvement over the current state of the art.

  12. TASK OF FUNCTIONING ALGORITHM SYNTHESIS OF AUTOMATIC MANAGEMENT SYSTEM OF TRAFFIC MOVEMENT IN METROPOLITAN AREAS

    OpenAIRE

    Getsovich, E.; N. Semchenko; V. Korol

    2009-01-01

    The concept and structure of automatic management system of traffic movement in metropolitan areas is proposed by the authors, where the parameters of road network and efficient information on traffic streams are used as primary data as well as empirical-stochastic approach to simulation and forecast of development of situation.

  13. TASK OF FUNCTIONING ALGORITHM SYNTHESIS OF AUTOMATIC MANAGEMENT SYSTEM OF TRAFFIC MOVEMENT IN METROPOLITAN AREAS

    Directory of Open Access Journals (Sweden)

    E. Getsovich

    2009-01-01

    Full Text Available The concept and structure of automatic management system of traffic movement in metropolitan areas is proposed by the authors, where the parameters of road network and efficient information on traffic streams are used as primary data as well as empirical-stochastic approach to simulation and forecast of development of situation.

  14. Automatic Creation of Structural Models from Point Cloud Data: the Case of Masonry Structures

    Science.gov (United States)

    Riveiro, B.; Conde-Carnero, B.; González-Jorge, H.; Arias, P.; Caamaño, J. C.

    2015-08-01

    One of the fields where 3D modelling has an important role is the application of such 3D models for structural engineering purposes. The literature shows intense activity on the conversion of 3D point cloud data to detailed structural models, which has special relevance for masonry structures, where geometry plays a key role. In the work presented in this paper, color data (from the intensity attribute) is used to automatically segment masonry structures with the aim of isolating masonry blocks and defining interfaces in an automatic manner using a 2.5D approach. An algorithm for the automatic processing of laser scanning data based on an improved marker-controlled watershed segmentation is proposed, with successful results. The geometric accuracy and resolution of the point cloud are constrained by the scanning instruments, with accuracy levels reaching a few millimetres in the case of static instruments and a few centimetres in the case of mobile systems. In any case, the algorithm is not significantly sensitive to low-quality images, because acceptable segmentation results were found in cases where blocks could not be visually segmented.

  15. Automatic Vehicle Speed Reduction System Using Rf Technology

    Directory of Open Access Journals (Sweden)

    Deepa B Chavan

    2014-04-01

    Full Text Available Vehicle safety, and the safety of the passengers in the vehicle, is an important concern. Many vehicles meet with accidents because no proper safety measures are taken, especially at curves, hairpin bends, humps, or where obstacles lie in front of the vehicle. This system can help prevent such accidents by giving an advance warning and by reducing the speed of the vehicle through a reduction of its fuel rate. Since the action is applied to the fuel rate, the vehicle is automatically brought under control and accidents are avoided. At curves and hairpin bends the driver has no line of sight, so special transmitters tuned to a frequency of 433 MHz are mounted there; these transmitters continuously radiate an RF signal over a particular area. When the vehicle comes within this radiation, the receiver in the vehicle is activated. The transmitter used here is a coded transmitter driven by an encoder. The encoder provides 4-bit binary data which is serially transmitted to the transmitter. The transmitter is of the ASK (amplitude-shift keying) type, which emits the RF radiation.
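
    The 4-bit serial encoding mentioned above can be sketched as a simple framed bit stream. The start/stop framing and the zone-code interpretation below are assumptions for illustration; the abstract does not specify the encoder chip or frame format.

```python
def encode_4bit(code):
    """Serialise a 4-bit zone code MSB-first between a start bit and a
    stop bit, roughly as a hardware encoder would clock it out to the
    ASK transmitter stage. Frame layout here is hypothetical."""
    assert 0 <= code <= 0b1111
    bits = [(code >> i) & 1 for i in (3, 2, 1, 0)]
    return [1] + bits + [0]          # start bit, 4 data bits, stop bit

def decode_4bit(frame):
    """Receiver side: validate framing, then reassemble the nibble."""
    if len(frame) != 6 or frame[0] != 1 or frame[-1] != 0:
        raise ValueError("bad frame")
    b3, b2, b1, b0 = frame[1:5]
    return (b3 << 3) | (b2 << 2) | (b1 << 1) | b0

frame = encode_4bit(0b1010)   # e.g. a hypothetical "sharp curve" zone code
zone = decode_4bit(frame)
```

    Four bits allow sixteen distinct zone codes (curve, hump, school zone, and so on), each mapped by the receiver to a target speed or fuel-rate limit.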

  16. Development of an automatic characterisation system for silicon detectors

    CERN Document Server

    Hacker, J; Krammer, M; Wedenig, R

    2002-01-01

    The CMS experiment will be equipped with the largest silicon tracker in the world. The tracker will consist of about 25,000 silicon sensors covering an area of more than 200 m². Four quality test centres will carry out various checks on a representative sample of sensors to assure a homogeneous quality throughout the 2.5 years of production. One of these centres is based in Vienna. To cope with the large number of sensors, a fast and fully automatic characterisation system has been realised. We developed the software in LabView and built a cost-efficient probe station in house by assembling individual components and commercial instruments. Both the global properties of a sensor and the characteristic quantities of the individual strips can be measured. The measured data are immediately analysed and sent to a central database. The mechanical and electrical set-up will be explained and results from CMS prototype sensors are presented.

  17. Collective flow in small systems

    International Nuclear Information System (INIS)

    The large density of matter in the interaction region of the proton–nucleus or deuteron–nucleus collisions enables the collective expansion of the fireball. Predictions of a hydrodynamic model for the asymmetric transverse flow are presented and compared to experimental data.

  18. Waste collection systems for recyclables

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Merrild, Hanna Kristina; Møller, Jacob;

    2010-01-01

    Recycling of paper and glass from household waste is an integrated part of waste management in Denmark; however, increased recycling is a legislative target. The questions are: how much more can the recycling rate be increased through improvements of collection schemes when organisational and...... technical limitations are respected, and what will the environmental and economic consequences be? This was investigated in a case study of a municipal waste management system. Five scenarios with alternative collection systems for recyclables (paper, glass, metal and plastic packaging) were assessed by...... and treatment of waste were reduced with increasing recycling, mainly because the high cost for incineration was avoided. However, solutions for mitigation of air pollution caused by increased collection and transport should be sought. (C) 2009 Elsevier Ltd. All rights reserved....

  19. An automatic control system for a power-generating unit

    Energy Technology Data Exchange (ETDEWEB)

    Itelman, U.R.; Mankin, M.N.; Mikhailova, I.V.

    1979-02-05

    There exists an automatic control system for a power-generating unit which contains a turbine load regulator connected to the output of the actuator-valve servo motor together with the slide valve of the regulator's measuring channel, a boiler productivity regulator, and a frequency-compensation unit for controlling the input power, whose output is connected to the inputs of the turbine load regulator and the boiler productivity regulator. In this automatic control system, the compensation unit is built as a frequency-deviation sensor connected to the voltage transformer of the generator; it is a complex electronic conversion component. To simplify this design, the compensation unit is instead built as a motion sensor mechanically connected to the slide valve, either through the slide box of the valve or through the valve position rod.

  20. Intelligent E-Learning Systems: Automatic Construction of Ontologies

    Science.gov (United States)

    Peso, Jesús del; de Arriaga, Fernando

    2008-05-01

    During the last years a new generation of Intelligent E-Learning Systems (ILS) has emerged with enhanced functionality due mainly to influences from Distributed Artificial Intelligence, to the use of cognitive modelling, to the extensive use of the Internet, and to new educational ideas such as student-centered education and Knowledge Management. The automatic construction of ontologies provides a means of automatically updating the knowledge bases of the respective ILS, and of increasing their interoperability and communication with one another by sharing the same ontology. The paper presents a new approach, able to produce ontologies from a small number of documents such as those obtained from the Internet, without the assistance of large corpora, by using simple syntactic rules and some semantic information. The method is independent of the natural language used. The use of a multi-agent system increases the flexibility and capability of the method. Although the method can easily be improved, the results obtained so far are promising.
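    As a toy illustration of the kind of simple syntactic rule such methods rely on, the sketch below extracts hypernym–hyponym pairs with a single English "such as" pattern. The regex and function name are hypothetical; the paper's actual rules are richer and language-independent:

```python
import re

# One "X such as Y1, Y2 and Y3" pattern -- a simplified, English-only
# stand-in for the paper's (language-independent) syntactic rules.
PATTERN = re.compile(r"(\w+)\s+such as\s+([\w ,]+?)(?:\.|$)")

def extract_hyponyms(text):
    """Return (hypernym, hyponym) pairs found by the pattern above."""
    pairs = []
    for m in PATTERN.finditer(text):
        hypernym = m.group(1)
        for item in re.split(r",|\band\b", m.group(2)):
            item = item.strip()
            if item:
                pairs.append((hypernym, item))
    return pairs

pairs = extract_hyponyms(
    "Documents mention languages such as Spanish, French and English.")
```

    Each extracted pair would then become an is-a edge in the ontology under construction.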

  1. Collective dynamics of multicellular systems

    Indian Academy of Sciences (India)

    R Maithreye; C Suguna; Somdatta Sinha

    2011-11-01

    We have studied the collective behaviour of a one-dimensional ring of cells for conditions under which the individual uncoupled cells show stable, bistable and oscillatory dynamics. We show that the global dynamics of this model multicellular system depends on the system size, the coupling strength and the intrinsic dynamics of the cells. The intrinsic variability in the dynamics of the constituent cells is suppressed to stable dynamics, or modified to intermittency, under different conditions. This simple model study reveals that cell–cell communication, system size and intrinsic cellular dynamics can lead to the evolution of collective dynamics in structured multicellular biological systems that is significantly different from the behaviour of the constituent single cells.

  2. Automatic extraction of highway light poles and towers from mobile LiDAR data

    Science.gov (United States)

    Yan, Wai Yeung; Morsy, Salem; Shaker, Ahmed; Tulloch, Mark

    2016-03-01

    Mobile LiDAR has recently been demonstrated as a viable technique for pole-like object detection and classification. Although a desirable accuracy (around 80%) has been reported in existing studies, the majority of them were conducted at street level with relatively flat ground, and very few addressed how to extract the entire pole structure from the ground or curb surface. This paper therefore attempts to fill the research gap by presenting a workflow for automatic extraction of light poles and towers from mobile LiDAR point clouds, with a particular focus on municipal highways. The data processing workflow includes (1) an automatic ground filtering mechanism to separate aboveground and ground features, (2) an unsupervised clustering algorithm to cluster the aboveground point cloud, (3) a set of decision rules to identify and classify potential light poles and towers, and (4) a least-squares circle fitting algorithm to fit the circular pole structure so as to remove the ground points. The workflow was tested on a set of mobile LiDAR data collected for a section of Highway 401 located in Toronto, Ontario, Canada. The results showed that the proposed method achieves a detection rate of over 91% for the five types of light poles and towers along the study area.
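    Step (4) can be sketched with an algebraic (Kasa-style) least-squares circle fit in the horizontal plane; the abstract does not specify which fitting variant the authors use, so this is just one common choice:

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit to 2D points.

    Solves x^2 + y^2 = a*x + b*y + c for (a, b, c); the centre is
    (a/2, b/2) and the radius sqrt(c + cx^2 + cy^2).
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    rhs = pts[:, 0] ** 2 + pts[:, 1] ** 2
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = a / 2.0, b / 2.0
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r

# Synthetic pole cross-section: a circle of radius 0.15 m centred at (2, 5)
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
cx, cy, r = fit_circle(np.column_stack([2.0 + 0.15 * np.cos(theta),
                                        5.0 + 0.15 * np.sin(theta)]))
```

    Points lying outside the fitted radius at the pole base would then be discarded as ground or curb returns.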

  3. A Micro-Kernel Test Engine for Automatic Test System

    OpenAIRE

    Shuai Wang; Yindong Ji; Shiyuan Yang

    2011-01-01

    In traditional automatic test solutions, a test engine usually encompasses all functions in its kernel, including compiling test program, generating test event chain, scheduling test process and executing test events. This makes the engine tightly coupled with test language and the system under test, so that it is difficult to maintain, optimize and extend the test engine. In order to solve these problems, a micro-kernel test engine is designed and implemented based on the service oriented ar...

  4. The Diagnostic System of A – 604 Automatic Transmission

    OpenAIRE

    Czaban Jaroslaw; Szpica Dariusz

    2014-01-01

    Automatic gearboxes are gaining popularity in Europe. The limited interest in diagnosing this type of transmission in Poland results from its small share of the market of operated cars, so special diagnostic devices are not readily available. These factors lead to expensive repairs, often involving replacement of a subassembly with a new or aftermarket one. Only to a small extent are prophylactic diagnostic tests conducted, which could eliminate future gearbox system ...

  5. EOS Data Dumper: an Automatic Downloading and Re-Distributing System for Free EOS Data

    Institute of Scientific and Technical Information of China (English)

    南卓铜; 王亮绪; 李新

    2007-01-01

    To make more effective use of existing data resources and avoid duplicate investment in research facilities, data sharing is receiving increasing attention. NASA's Earth Observing System (EOS) provides a large volume of free data resources, including MODIS. EOS Data Dumper (EDD) programmatically simulates the normal download procedure of the EOS data portal and applies advanced Web-page text capture techniques to automatically download, on schedule, all free EOS data for a study area. The data are then re-published to the Internet through the free DIAL system, enabling complex spatio-temporal queries. This paper describes, from a technical perspective, the project background and significance of EDD, its implementation, and the key technologies involved.

  6. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows control strategies to be created and improved in order to supply drying air under specified conditions of temperature and humidity. The development of automatic routines for control and real-time data acquisition was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.
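    As a minimal sketch of the kind of control strategy such a system lets users create, an on/off humidifier command with a hysteresis band might look like the following. The abstract does not describe the laboratory's actual control law; the names and numbers here are assumptions:

```python
def humidifier_command(rel_humidity, setpoint, band, is_on):
    """On/off humidifier control with a hysteresis band (illustrative).

    Turns the humidifier on below setpoint - band, off above
    setpoint + band, and keeps the previous state in between, so the
    actuator does not chatter around the setpoint.
    """
    if rel_humidity < setpoint - band:
        return True
    if rel_humidity > setpoint + band:
        return False
    return is_on

# Example: air at 40% RH, target 50% +/- 2%, humidifier currently off
state = humidifier_command(40.0, 50.0, 2.0, False)
```

    The same structure extends naturally to the temperature channel of the conditioned drying air.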

  7. FULLY AUTOMATIC IMAGE-BASED REGISTRATION OF UNORGANIZED TLS DATA

    Directory of Open Access Journals (Sweden)

    M. Weinmann

    2012-09-01

    Full Text Available The estimation of the transformation parameters between different point clouds is still a crucial task, as it is usually followed by scene reconstruction, object detection or object recognition. Therefore, the estimates should be as accurate as possible. Recent developments show that it is feasible to utilize both the measured range information and the reflectance information sampled as imagery, as 2D imagery provides additional information. In this paper, an image-based registration approach for TLS data is presented which consists of two major steps. In the first step, the order of the scans is calculated by checking the similarity of the respective reflectance images via the total number of SIFT correspondences between them. Subsequently, in the second step, for each SIFT correspondence the respective SIFT features are filtered with respect to their reliability concerning the range information and projected to 3D space. Combining the 3D points with 2D observations on a virtual plane yields 3D-to-2D correspondences, from which the coarse transformation parameters can be estimated via a RANSAC-based registration scheme including the EPnP algorithm. After this coarse registration, the 3D points are again checked for consistency by using constraints based on the 3D distance, and, finally, the remaining 3D points are used for an ICP-based fine registration. Thus, the proposed methodology provides a fast, reliable, accurate and fully automatic image-based approach for the registration of unorganized point clouds without the need for a priori information about the order of the scans, the presence of regular surfaces or human interaction.
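    The first step, deriving the scan order from pairwise SIFT correspondence counts, can be sketched as a greedy chain. The counts and the helper below are hypothetical illustrations, not the authors' implementation:

```python
def order_scans(n_scans, match_counts):
    """Greedy scan ordering from pairwise feature-correspondence counts.

    match_counts maps an unordered pair frozenset({i, j}) to the number
    of SIFT correspondences between scans i and j. Starting from the
    pair with the most matches, repeatedly append the unvisited scan
    sharing the most correspondences with the current chain end.
    """
    best_pair = max(match_counts, key=match_counts.get)
    order = sorted(best_pair)  # sorted for a deterministic start
    remaining = set(range(n_scans)) - set(order)
    while remaining:
        tail = order[-1]
        nxt = max(remaining,
                  key=lambda s: match_counts.get(frozenset({tail, s}), 0))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# Hypothetical counts for four overlapping scans along a traverse
counts = {frozenset({0, 1}): 120, frozenset({1, 2}): 95,
          frozenset({2, 3}): 88,  frozenset({0, 3}): 70,
          frozenset({0, 2}): 12,  frozenset({1, 3}): 9}
order = order_scans(4, counts)
```

    Adjacent scans in the resulting chain are then passed to the coarse EPnP/RANSAC stage described above.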

  8. Observer Manual and Current Data Collection Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Observer Program web page that lists the observer field manual and all current data collection forms that observers are required to take out to sea.

  9. System for automatically preventing the raising of ash from dedicated landfills

    Directory of Open Access Journals (Sweden)

    Milutinović Predrag

    2012-01-01

    Full Text Available The paper presents a system for the automatic prevention of the raising of ash from dedicated landfills, based on a simple mathematical model with modest input requirements for meteorological data. Such an approach is efficient enough and enables fast retrieval of information, i.e. of the zones with different concentrations of dust in the air, allowing countermeasures to reduce ash emissions into the air to be started quickly. The system hardware consists of an automatic weather station, a set of meters that determine the moisture of the ash, a set of remotely managed sprinklers, computers, microprocessor- and microcontroller-based elements for local acquisition and management of the executive elements, and modules for wireless data transfer. An original software application for managing the system has been developed. Within the application there is a module that allows entry of all data necessary to configure the system, as well as data about the sensors and sprinklers. Based on the meteorological input data, the measured moisture content of the ash and the determined functional dependencies, a special software module operates the sprinklers that soak the surfaces from which ash is emitted into the air, in order to eliminate these emissions. The system, based on the developed mathematical model, predicts the propagation of ash through the air, as well as dry and wet deposition, in real time. The system automatically stores all data relevant for future analyses and reporting. The system is designed and implemented as modular and open. A custom-developed graphical user interface serves as the Man-Machine Interface (MMI). Through a TCP/IP connection it can easily be connected with other information systems. [Projekat Ministarstva nauke Republike Srbije, br. TR-15005: Design and development of pilot system for preventing automatic raise of ashes from dedicated dumps]

  10. The automatic Hydrological Information System of the Jucar Basin as a warning system

    International Nuclear Information System (INIS)

    The original aim of the project was to create a tool to prevent and reduce flood damage. Within the Spanish legal framework for Civil Protection, the Automatic Hydrological Information System (in Spanish, SAIH) plays an extremely important role in the protocols for issuing warnings and communicating the different warning situations to the Civil Protection Forces and local administrations. The SAIH of the Jucar Basin generates about 3000 variables every five minutes, occupying a daily storage volume of 5 MB. The data are stored in sequential files to form the historic database. The operation team continuously checks the different control variables and supervises rainfall data and the levels of reservoirs, rivers and channels. If the previously established critical values are exceeded, the operators inform the responsible technical staff of the Jucar Basin Water Authority as well as the Weather Service and the Civil Protection Forces, following the defined operation rules. There is also an automatic alarm-generation system that checks all the variables every five minutes and initiates the protocols for the different warning levels by itself. During periods of high hydrological risk, especially from September to November, the Basin Control Centre is staffed 24 hours a day in order to increase surveillance in the period of greatest risk of heavy rainfall and flood events. The operation personnel maintain permanent contact with the responsible dam engineers of the Basin Water Authority as well as with the Civil Protection Forces. Throughout 15 years of operating the automatic data acquisition system, many examples show its utility and benefits in detecting and evaluating flood situations and in informing in time the authorities in charge of protection and evacuation measures, such as the Civil Protection Forces and local administrations.(Author)
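    The five-minute threshold check that drives the warning protocols can be sketched as follows; the variable names and threshold values are illustrative assumptions, not real SAIH configuration:

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    warn: float   # pre-alert level
    alarm: float  # alarm level

def check_variables(readings, thresholds):
    """Classify each monitored variable against its warning levels.

    readings maps a variable name to its latest five-minute value;
    thresholds maps the same names to warn/alarm levels. Returns the
    list of (name, level) pairs that need attention.
    """
    alerts = []
    for name, value in readings.items():
        t = thresholds.get(name)
        if t is None:
            continue
        if value >= t.alarm:
            alerts.append((name, "alarm"))
        elif value >= t.warn:
            alerts.append((name, "warning"))
    return alerts

# Illustrative values only
thresholds = {"rain_mm_5min": Threshold(warn=5.0, alarm=10.0),
              "river_level_m": Threshold(warn=2.5, alarm=3.2)}
alerts = check_variables({"rain_mm_5min": 7.2, "river_level_m": 3.5},
                         thresholds)
```

    A scheduler running this check on every five-minute acquisition cycle would then hand each alert to the corresponding notification protocol.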

  11. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    Science.gov (United States)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can respectively be used for presenting the analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  12. Automatic high power RF processing system using PLC

    International Nuclear Information System (INIS)

    We have developed an automatic control system using a Programmable Logic Controller (PLC) for the high power RF processing of the C-band (5712 MHz) accelerating structure and the klystron in the SPring-8 Compact SASE Source (SCSS) project. The PLC is widely used in industry and has many advantages: it is reliable, compact and low-cost. In addition, the PLC has recently become able to communicate with an upper-layer controller through a network. We use this system for the klystron RF power test. In this paper, we describe the configuration of the system and the details of the high power RF processing. (author)

  13. Automatic technological control system of the Kolsk NPP Unit-1

    International Nuclear Information System (INIS)

    Reconstruction of the present centralized control system, using a new small SM2-9 computer and the ''Jailyk'' reactor control system, has been started at the Kolsk NPP (KNPP). Presented are the flowsheet of the technological process automatic control system (TP ACS) of the KNPP first generation after reconstruction, a stage diagram of the organizational-technical measures for the TP ACS reconstruction, and a communication flowsheet of the IV-500 MA information subsystem with the SM2-9 computer. The TP ACS reconstruction will make it possible to raise the unit power up to 115% of the nominal value.

  14. Application of MintDrive Automatic Precision Positioning System

    Institute of Scientific and Technical Information of China (English)

    Wu Fengming; Yang Yonggang; Zhao Xiaolong; Zhang Zhiyuan

    2004-01-01

    It is very important to locate batteries accurately and quickly during automatic battery production. Unstable or inaccurate location negatively influences battery consistency, quality and the finished-product rate. The traditional way is to use a sensor to detect and locate batteries directly, but because of detection tolerance, setting them on a fixed point exactly is almost impossible. This problem can be completely solved by applying the MintDrive automatic precision servo locating system. First, the WorkBench operating software is used to configure the servo locating driver for optimized control. Then, based on the requirements of the actual location, the locating action is programmed and tested with the programming software, and finally all the locating information is uploaded to a MicroLogix 1200 PLC, which controls the operation at each station: when to locate, where the location is, and how to eliminate bad parts. Because this intelligent servo locating system has the advantages of powerful functionality, simple operation, high control and locating accuracy, and easy maintenance, it is very suitable for automatic battery-making lines. It is currently regarded as a very advanced control method for reducing waste material caused by inaccurate location and difficult adjustment.

  15. Automatic graphene transfer system for improved material quality and efficiency

    Science.gov (United States)

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-02-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to that manually transferred, is demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, with a 30% decrease in the unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications.

  16. Computer-based automatic finger- and speech-tracking system.

    Science.gov (United States)

    Breidegard, Björn

    2007-11-01

    This article presents the first technology ever for online registration and interactive and automatic analysis of finger movements during tactile reading (Braille and tactile pictures). Interactive software has been developed for registration (with two cameras and a microphone), MPEG-2 video compression and storage on disk or DVD as well as an interactive analysis program to aid human analysis. An automatic finger-tracking system has been implemented which also semiautomatically tracks the reading aloud speech on the syllable level. This set of tools opens the way for large scale studies of blind people reading Braille or tactile images. It has been tested in a pilot project involving congenitally blind subjects reading texts and pictures. PMID:18183897

  17. Collection of arc welding process data

    OpenAIRE

    K. Luksa; Z. Rymarski

    2006-01-01

    Purpose: The aim of the research was to examine the possibility of detecting welding imperfections by recording the instant values of welding parameters. A microprocessor-controlled system for real-time collection and display of welding parameters was designed, implemented and tested. Design/methodology/approach: The system records up to 4 digital or analog signals collected from the welding process and displays their time course on the LCD display. To disturb the welding process artificial disturbances...

  18. Effect of an automatic feeding system on growth performance and feeding behaviour of pigs reared outdoors

    Directory of Open Access Journals (Sweden)

    Riccardo Fortina

    2010-01-01

    Full Text Available Nine Mora Romagnola and 10 Large White x Mora Romagnola growing pigs were reared outdoors. Both groups were fed ad libitum. Conventional pigs received feed twice a day, distributed in two long troughs. Inside the corral of the second group, an automatic station was set up for feed distribution, pig weighing and monitoring by an analog camera; the self-feeders thus received feed ad libitum individually from the automatic system, divided into small quantities at meal times. During the experiment the analog camera was used over 24 hours each day to collect pictures of the pigs in order to investigate their behaviours. For each picture the day and hour, the number of visible pigs and their behaviours were recorded, and a statistical analysis of the data, expressed as hourly frequencies of behavioural elements, was performed. Moreover, to highlight “active” and “passive” behaviours between the groups, two categories, “Move” and “Rest”, were created by grouping some behavioural elements. With regard to performance, conventional pigs reached a higher total weight gain (56.1±2.42 kg vs 46.7±2.42 kg; P=0.0117). However, the feed conversion index (FCI) of both groups was similar. The self-feeders consumed less feed than the conventional animals. The feeding system seems to influence behaviours. The percentage of time spent in the Eating activity differs (P<0.0001) between the self-fed (median 24.6%) and conventional pigs (median 10.9%). The resulting more regular eating trend of the self-feeders influenced the distribution of daily activities. The behavioural category Rest (median: self-feeders 55.0% vs conventional pigs 71.4%) was dominant, with conventional pigs becoming more restless, particularly at meal times. This type of feeding competition and aggressive behaviour did not happen in the self-feeders due to the feed distribution system. The self-feeder results showed that pigs eat at the automatic station both day and night. The animals perform on

  19. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published until now. The method developed at Aalborg University uses the existing topographic database and or...

  20. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.