WorldWideScience

Sample records for automatic data collection systems

  1. Automatic Weissenberg data collection system for time-resolved protein crystallography

    CERN Document Server

    Sakabe, N; Higashi, T; Igarashi, N; Suzuki, M; Watanabe, N; Sasaki, K

    2001-01-01

    A totally new type of fully automatic Weissenberg data-collection system called 'Galaxy' was developed and installed at the Photon Factory. This automatic data-collection system consists of a rotated-inclined focusing monochromator, a screenless Weissenberg-type camera, an image reader, an eraser, a cassette transportation mechanism, a control console, and a safety system plus a high-speed computer network linking the control console, data-processing computers and data servers. The special characteristics of this system are a Weissenberg camera with a fully cylindrical cassette which can be rotated to exchange frames, a maximum of 36 images recorded per IP cassette, and a very high-speed IP reader with five reading heads. Since the frame-exchange time is only a few seconds, this system is applicable to time-resolved protein crystallography on a time scale of seconds or minutes.

  2. The automatic collection and treatment of data for DNC

    International Nuclear Information System (INIS)

    Song Quanxun

    1991-01-01

    The automatic data collection and treatment for DNC (Delayed Neutron Counting) with an S-85 MCA (Multi-Channel Analyzer) and a PDP-11/34 computer is described. The principle and functions of the software package are introduced in detail.

  3. AUTORED - the JADE automatic data reduction system

    International Nuclear Information System (INIS)

    Whittaker, J.B.

    1984-07-01

    The design and implementation of, and experience with, an automatic data-processing system for the reduction of data from the JADE experiment at DESY are described. The central elements are a database and a job submitter, which combine powerfully to minimise the need for manual intervention. (author)

  4. MAC, A System for Automatically IPR Identification, Collection and Distribution

    Science.gov (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected by this is the music industry, which has suffered huge losses during the last few years for this reason. Moreover, it also affects the way that music-rights collecting and distributing societies operate to assure correct music IPR identification, collection and distribution. In this article a system for automating this IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio-fingerprinting technology. This paper presents the details of the system and a use-case scenario where it is being used.

  5. Can Automatic Classification Help to Increase Accuracy in Data Collection?

    Directory of Open Access Journals (Sweden)

    Frederique Lang

    2016-09-01

    Full Text Available Purpose: The authors aim to test the performance of a set of machine learning algorithms that could improve the process of data cleaning when building datasets. Design/methodology/approach: The paper is centered on cleaning datasets gathered from publishers and online resources by the use of specific keywords; in this case, data from the Web of Science were analyzed. The accuracy of various forms of automatic classification was tested against manual coding in order to determine their usefulness for data collection and cleaning. We assessed the performance of seven supervised classification algorithms (Support Vector Machine (SVM), Scaled Linear Discriminant Analysis, Lasso and elastic-net regularized generalized linear models, Maximum Entropy, Regression Tree, Boosting, and Random Forest) and analyzed two properties: accuracy and recall. We assessed not only each algorithm individually but also their combinations through a voting scheme. We also tested the performance of these algorithms with different sizes of training data. When assessing the performance of different combinations, we used an indicator of coverage to account for the agreement and disagreement on classification between algorithms. Findings: We found that the performance of the algorithms varies with the size of the training sample. However, for the classification exercise in this paper the best-performing algorithms were SVM and Boosting. The combination of these two algorithms achieved a high agreement on coverage and was highly accurate. This combination performs well with a small training dataset (10%), which may reduce the manual work needed for classification tasks. Research limitations: The dataset gathered has significantly more records related to the topic of interest than to unrelated topics. This may affect the performance of some algorithms, especially in their identification of unrelated papers. Practical implications: Although the ...
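    The voting scheme and coverage indicator described in this abstract can be sketched minimally in Python. This is a hypothetical illustration only; the paper does not publish its implementation, and the function names here are invented:

```python
def vote(predictions):
    """Combine per-algorithm predictions for one record by majority vote.

    predictions: list of labels, one per classifier (e.g. ["relevant", ...]).
    Returns (label, agreed) where agreed is True when all classifiers concur.
    """
    counts = {}
    for p in predictions:
        counts[p] = counts.get(p, 0) + 1
    label = max(counts, key=counts.get)
    return label, counts[label] == len(predictions)

def coverage(all_predictions):
    """Fraction of records on which every classifier agrees -- a simple
    stand-in for the paper's coverage indicator."""
    agreed = sum(1 for preds in all_predictions if vote(preds)[1])
    return agreed / len(all_predictions)
```

    Records on which the classifiers disagree (low coverage) would then be the natural candidates for the remaining manual coding.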

  6. Automatic Data Collection Design for Neural Networks Detection

    African Journals Online (AJOL)

    Dr Obe

    Automated data collection is necessary to alleviate problems inherent in data collection for the investigation of management fraud. Once realistic data have been gathered, several methods exist for proper analysis and detection of anomalous transactions. However, in Nigeria, collecting fraudulent data is ...

  7. Automatic Data Collection Design for Neural Networks Detection of ...

    African Journals Online (AJOL)

    Automated data collection is necessary to alleviate problems inherent in data collection for the investigation of management fraud. Once realistic data have been gathered, several methods exist for proper analysis and detection of anomalous transactions. However, in Nigeria, collecting fraudulent data is relatively difficult ...

  8. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data.

    Science.gov (United States)

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  9. Current position on software for the automatic data acquisition system

    International Nuclear Information System (INIS)

    1988-01-01

    This report describes the current concepts for software to control the operation of the Automatic Data Acquisition System (ADAS) proposed for the Deaf Smith County, Texas, Exploratory Shaft Facility (ESF). The purpose of this report is to provide conceptual details of how the ADAS software will execute the data acquisition function, and how the software will make collected information available to the test personnel, the Data Management Group (DMG), and other authorized users. It is not intended that this report describe all of the ADAS functions in exact detail, but the concepts included herein will form the basis for the formal ADAS functional requirements definition document. 5 refs., 14 figs

  10. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  11. Fully automatic characterization and data collection from crystals of biological macromolecules

    International Nuclear Information System (INIS)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W.

    2015-01-01

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention

  12. ATCOM: Automatically Tuned Collective Communication System for SMP Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Meng-Shiou [Iowa State Univ., Ames, IA (United States)

    2005-01-01

    Conventional implementations of collective communications are based on point-to-point communications, and their optimizations have focused on the efficiency of those communication algorithms. However, point-to-point communications are not the optimal choice for modern computing clusters of SMPs due to their two-level communication structure. In recent years, a few research efforts have investigated efficient collective communications for SMP clusters. This dissertation is focused on platform-independent algorithms and implementations in this area. There are two main approaches to implementing efficient collective communications for clusters of SMPs: using shared-memory operations for intra-node communications, and overlapping inter-node and intra-node communications. The former fully utilizes the hardware-based shared memory of an SMP, and the latter takes advantage of the inherent hierarchy of communications within a cluster of SMPs. Previous studies focused on clusters of SMPs from certain vendors, and the previously proposed methods are not portable to other systems. Because the performance-optimization issue is very complicated and the development process is very time-consuming, self-tuning, platform-independent implementations are highly desirable. As proven in this dissertation, such an implementation can significantly outperform other point-to-point based portable implementations and some platform-specific implementations. The dissertation describes in detail the architecture of the platform-independent implementation. There are four system components: shared-memory-based collective communications, overlapping mechanisms for inter-node and intra-node communications, a prediction-based tuning module and a micro-benchmark-based tuning module. Each component is carefully designed with the goal of automatic tuning in mind.

  13. Data system for automatic flux mapping applications

    International Nuclear Information System (INIS)

    Oates, R.M.; Neuner, J.A.; Couch, R.D. Jr.; Kasinoff, A.M.

    1982-01-01

    This patent discloses interface circuitry for coupling the data from neutron flux detectors in a reactor core to microprocessors. The circuitry minimizes the microprocessor time required to accept data and provides a technique for measuring variable-frequency data from the in-core detectors with a minimum amount of hardware and with crystal-controlled accuracy. A frequency link is employed to transmit data with good isolation, and the information is measured using a programmable timer.

  14. Automatic Traffic Data Collection under Varying Lighting and Temperature Conditions in Multimodal Environments: Thermal versus Visible Spectrum Video-Based Systems

    Directory of Open Access Journals (Sweden)

    Ting Fu

    2017-01-01

    Vision-based monitoring systems using visible-spectrum (regular) video cameras can complement or substitute conventional sensors and provide rich positional and classification data. Although new camera technologies, including thermal video sensors, may improve the performance of digital video-based sensors, their performance under various conditions has rarely been evaluated at multimodal facilities. The purpose of this research is to integrate existing computer vision methods for automated data collection and to evaluate the detection, classification, and speed-measurement performance of thermal video sensors under varying lighting and temperature conditions. Thermal and regular video data were collected simultaneously under different conditions across multiple sites. Although the regular video sensor narrowly outperformed the thermal sensor during daytime, the performance of the thermal sensor is significantly better in low-visibility and shadow conditions, particularly for pedestrians and cyclists. Retraining the algorithm on thermal data yielded an improvement in global accuracy of 48%. Thermal speed measurements were consistently more accurate than those from the regular video at both daytime and nighttime. Thermal video is insensitive to lighting interference and pavement temperature, solves issues associated with visible-light cameras for traffic data collection, and offers other benefits such as privacy, insensitivity to glare, smaller storage space, and lower processing requirements.

  15. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  16. Milk-flow data collected routinely in an automatic milking system: an alternative to milking-time testing in the management of teat-end condition?

    Science.gov (United States)

    Nørstebø, Håvard; Rachah, Amira; Dalen, Gunnar; Rønningen, Odd; Whist, Anne Cathrine; Reksen, Olav

    2018-01-11

    A poor teat-end condition is associated with increased mastitis risk; hence, avoiding milking-machine settings that have a negative effect on teat-end condition is important for successful dairy production. Milking-time testing (MTT) can be used to evaluate vacuum conditions during milking, but the method is less suited for herds using automatic milking systems (AMS), and its relationship with teat-end condition is poorly described. This study aimed to increase knowledge on the interpretation of MTT in AMS and to assess whether milk-flow data obtained routinely by an AMS can be useful for the management of teat-end health. A cross-sectional study, including 251 teats of 79 Norwegian Red cows milked by AMS, was performed in the research herd of the Norwegian University of Life Sciences. The following MTT variables were obtained at teat level: average vacuum level in the short milk tube during main milking (MTVAC), average vacuum in the mouthpiece chamber during main milking and overmilking, teat compression intensity (COMPR), and overmilking time. Average and peak milk-flow rates were obtained at quarter level from the AMS software. Teat-end callosity thickness and roughness were registered, and teat dimensions (length, and width at apex and base) were measured. Interrelationships among the variables obtained by MTT, quarter milk-flow variables, and teat dimensions were described, and associations between these variables and teat-end callosity thickness and roughness were investigated. Principal component analysis showed clusters of strongly related variables. There was a strong negative relationship between MTVAC and average milk-flow rate. The variables MTVAC, COMPR and average and peak milk-flow rate were associated with both thickness and roughness of the callosity ring. Quarter milk-flow rate obtained directly from the AMS software was useful in assessing associations between milking-machine function and teat-end condition; low average milk flow rates were ...

  17. A geological and geophysical data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    Sudhakar, T.; Afzulpurkar, S.

    A geological and geophysical data collection system using a Personal Computer is described below. The system stores data obtained from various survey systems typically installed in a charter vessel and can be used for similar applications on any...

  18. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    Albrecht, R.W.; Crowe, R.D.; McGuire, J.J.

    1978-01-01

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industry, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from the availability of the plant. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose-parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data-handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  19. Expert systems for crash data collection

    Science.gov (United States)

    1999-02-01

    The goal of the Federal Highway Administration (FHWA) Expert Systems for Crash Data Collection Program was to use expert system technology to improve the accuracy and consistency of police-reported data. The program included the development and evalu...

  20. Type-assisted automatic garbage collection for lock-free data structures

    OpenAIRE

    Yang, Albert Mingkun; Wrigstad, Tobias

    2017-01-01

    We introduce Isolde, an automatic garbage collection scheme designed specifically for managing memory in lock-free data structures, such as stacks, lists, maps and queues. Isolde exists as a plug-in memory manager, designed to sit on top of another memory manager and use its allocator and reclaimer (if one exists). Isolde treats a lock-free data structure as a logical heap, isolated from the rest of the program. This allows garbage collection outside of Isolde to take place without affecting th...

  1. Estimating Train Choices of Rail Transit Passengers with Real Timetable and Automatic Fare Collection Data

    Directory of Open Access Journals (Sweden)

    Wei Zhu

    2017-01-01

    An urban rail transit (URT) system is operated according to a relatively punctual schedule, which is one of the most important constraints on a URT passenger's travel. Estimating passengers' train choices is therefore key, since passenger route choices, as well as flow distribution on the URT network, can be deduced from them. In this paper we propose a methodology that can estimate an individual passenger's train choices with real timetable and automatic fare collection (AFC) data. First, we formulate the addressed problem using Manski's paradigm on modelling choice. Then, an integrated framework for estimating individual passengers' train choices is developed through a data-driven approach. The approach links each passenger trip to the most feasible train itinerary. An initial case study on the Shanghai metro shows that the proposed approach works well and can be further used for deducing other important operational indicators such as route choices, passenger flows on sections, and train load factors.
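    The core linking step, matching an AFC gate-time pair to the most feasible train itinerary, can be sketched as follows. This is a hypothetical rule in the spirit of the abstract (the paper's actual feasibility model is more elaborate); the function name, access/egress walk times, and the slack-minimisation criterion are all assumptions:

```python
def most_feasible_itinerary(tap_in, tap_out, itineraries, access=120, egress=120):
    """Pick the train itinerary most consistent with one AFC trip record.

    tap_in / tap_out: gate times in seconds; itineraries: list of
    (departure, arrival) train times for the passenger's OD pair.
    An itinerary is feasible if the passenger can reach the platform
    before departure and the exit gate after arrival; among feasible
    ones, choose the one minimising unexplained waiting time.
    """
    feasible = [
        (dep, arr) for dep, arr in itineraries
        if dep >= tap_in + access and arr + egress <= tap_out
    ]
    if not feasible:
        return None
    # minimise the slack not explained by access, in-train and egress time
    return min(feasible, key=lambda it: (it[0] - tap_in) + (tap_out - it[1]))
```

    Applying this to every AFC record against the real timetable yields the train-level assignment from which section flows and train load factors can be aggregated.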

  2. NLO error propagation exercise data collection system

    Energy Technology Data Exchange (ETDEWEB)

    Keisch, B.; Bieber, A.M. Jr.

    1983-01-01

    A combined automated and manual system for data collection is described. The system is suitable for collecting, storing, and retrieving data related to nuclear material control at a bulk processing facility. The system, which was applied to the NLO-operated Feed Materials Production Center, was successfully demonstrated for a selected portion of the facility. The instrumentation consisted of off-the-shelf commercial equipment and provided timeliness, convenience, and efficiency in providing information for generating a material balance and performing error propagation on a sound statistical basis.

  3. NLO error propagation exercise data collection system

    International Nuclear Information System (INIS)

    Keisch, B.; Bieber, A.M. Jr.

    1983-01-01

    A combined automated and manual system for data collection is described. The system is suitable for collecting, storing, and retrieving data related to nuclear material control at a bulk processing facility. The system, which was applied to the NLO-operated Feed Materials Production Center, was successfully demonstrated for a selected portion of the facility. The instrumentation consisted of off-the-shelf commercial equipment and provided timeliness, convenience, and efficiency in providing information for generating a material balance and performing error propagation on a sound statistical basis

  4. Hand held data collection and monitoring system for nuclear facilities

    Science.gov (United States)

    Brayton, D.D.; Scharold, P.G.; Thornton, M.W.; Marquez, D.L.

    1999-01-26

    An apparatus and method are disclosed for a data collection and monitoring system that utilizes a pen-based hand-held computer unit containing interaction software that allows the user to review maintenance procedures, collect data, compare data with historical trends and safety limits, and input new information at various collection sites. The system allows automatic transfer of the collected data to a main computer database for further review, reporting, and distribution, and uploading of updated collection and maintenance procedures. The hand-held computer has a running to-do list, so sample collection and other general tasks, such as housekeeping, are automatically scheduled for timely completion. A done list helps users keep track of all completed tasks. The built-in checklist assures that work processes meet the applicable processes and procedures. Users can hand-write comments or drawings with an electronic pen that allows them to directly enter information on the screen. 15 figs.
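    The comparison of a field reading against safety limits and historical trends might look like the following sketch. The patent gives no algorithm, so the limit check, the standard-deviation drift rule, and every name here are invented for illustration:

```python
def check_reading(value, low, high, history, drift_tol=3.0):
    """Classify a field reading against safety limits and historical trend.

    Hypothetical logic: out-of-limit values are rejected outright; values
    further than drift_tol sample standard deviations from the historical
    mean are flagged for review; everything else is accepted.
    """
    if not (low <= value <= high):
        return "out_of_limits"
    if len(history) >= 2:
        mean = sum(history) / len(history)
        var = sum((h - mean) ** 2 for h in history) / (len(history) - 1)
        if var > 0 and abs(value - mean) > drift_tol * var ** 0.5:
            return "review_trend"
    return "ok"
```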

  5. Multichannel display system with automatic sequential output of analog data

    International Nuclear Information System (INIS)

    Bykovskii, Yu.A.; Gruzinov, A.E.; Lagoda, V.B.

    1989-01-01

    The authors describe a device that, with maximum simplicity and autonomy, permits parallel data display from 16 measuring channels with automatic output to the screen of a storage oscilloscope in ∼ 50 μsec. The described device can be used to study the divergence characteristics of the ion component of plasma sources and in optical and x-ray spectroscopy of pulsed processes. Owing to its compactness and autonomy, the device can be located in the immediate vicinity of the detectors (for example, inside a vacuum chamber), which allows the number of vacuum electrical lead-ins and the induction level to be reduced

  6. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator's system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to issue reporting, testers must log into a web site, fill out a problem form, and then upload logs through a browser or FTP; this is inconvenient, and problems are reported slowly. Therefore, we propose an "automatic logging analysis system" (ALAS) to construct a convenient test environment and, using a record-analysis (log parser) program, automate the parsing of log files so that issues are automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
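    The MTBF indicator mentioned at the end is conventionally computed as total operating time divided by the number of observed failures. A minimal sketch (the paper's exact formulation is not given, so this is the textbook definition):

```python
def mtbf(total_operating_hours, failure_count):
    """Mean time between failures over a trial period.

    Returns None when no failure was logged, since MTBF is then
    undefined (often reported as 'no observed failures' instead).
    """
    if failure_count == 0:
        return None
    return total_operating_hours / failure_count
```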

  7. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  8. System for Collecting Biosignal Data from Multiple Patient Monitoring Systems.

    Science.gov (United States)

    Yoon, Dukyong; Lee, Sukhoon; Kim, Tae Young; Ko, JeongGil; Chung, Wou Young; Park, Rae Woong

    2017-10-01

    Biosignal data include important physiological information. For that reason, many devices and systems have been developed, but there has not been enough consideration of how to collect and integrate raw data from multiple systems. To overcome this limitation, we have developed a system for collecting and integrating biosignal data from two patient monitoring systems. We developed an interface to extract biosignal data from Nihon Kohden and Philips monitoring systems. The Nihon Kohden system has a central server for the temporary storage of raw waveform data, which can be requested using the HL7 protocol. However, the Philips system used in our hospital cannot save raw waveform data. Therefore, our system was connected to monitoring devices using the RS232 protocol. After collection, the data were transformed and stored in a unified format. From September 2016 to August 2017, we collected approximately 117 patient-years of waveform data from 1,268 patients in 79 beds of five intensive care units. Because the two systems use the same data storage format, the application software could be run without compatibility issues. Our system collects biosignal data from different systems in a unified format. The data collected by the system can be used to develop algorithms or applications without the need to consider the source of the data.
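    Normalising records from the two monitoring systems into one storage format, as this abstract describes, could be sketched like this. The unified schema, field names, and source record layouts below are entirely hypothetical; the paper does not publish its format:

```python
from dataclasses import dataclass, asdict

@dataclass
class WaveformRecord:
    # hypothetical unified schema -- not the authors' actual format
    patient_id: str
    source: str        # e.g. "nihon_kohden" or "philips"
    channel: str       # e.g. "ECG_II"
    start_time: str    # ISO 8601 timestamp
    sampling_hz: int
    samples: list

def to_unified(source, raw):
    """Map a source-specific record dict onto the unified schema."""
    return WaveformRecord(
        patient_id=raw["pid"],
        source=source,
        channel=raw["ch"],
        start_time=raw["t0"],
        sampling_hz=raw["fs"],
        samples=list(raw["data"]),
    )
```

    With both feeds converging on one record type, downstream algorithms need not know whether a waveform came via HL7 requests or an RS232 link.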

  9. Operational performance of Swedish grid connected solar power plants. Automatic data collection; Driftuppfoeljning av svenska naetanslutna solcellsanlaeggningar. Automatisering av datainsamling

    Energy Technology Data Exchange (ETDEWEB)

    Hedstroem, Jonas; Svensson, Stefan

    2006-09-15

    A performance database containing all grid-connected PV systems in Sweden has been in operation since March 2002. The systems in the database are described in detail, and energy production is continuously added in the form of monthly values. The energy production figures and the system descriptions are published on www.elforsk.se/solenergi. In August 2006, 31 active systems were present in the database. As a result of the Swedish subsidy program, this number is expected to increase to over 100 systems in the next few years. The new owners of PV systems are obliged to report the produced electricity to the authorities at least once a year. In this work we have studied different means of simplifying the collection of data. Four methods are defined: 1. Conversion of readings from energy meters, made at arbitrary intervals, into monthly values. 2. Methods to handle data obtained with the monitoring systems provided by different inverter manufacturers. 3. Methods to acquire data from PV systems with energy meters reporting to the green certificate system. 4. Commercial GSM/GPRS monitoring systems. The first method is the minimum level required by the authorities. The second and third methods make use of equipment that some PV systems are expected to have for other reasons. Method 4 offers the possibility of a fully automatic collection method. The described GPRS systems are expected to have an initial cost of roughly 4000 SEK and a yearly fee of 200 SEK (1 SEK ≈ 0.14 USD)
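    Method 1, converting meter readings taken at arbitrary dates into monthly values, is typically done by interpolating the cumulative meter value at each month boundary. The report does not specify its procedure, so the linear interpolation below is one plausible realisation, with invented names:

```python
from datetime import date

def monthly_production(readings, month_starts):
    """Convert sparse cumulative energy-meter readings into monthly values.

    readings: list of (date, kWh) pairs in ascending date order;
    month_starts: month-boundary dates covering the period of interest.
    The cumulative energy at each boundary is linearly interpolated
    between the surrounding readings; successive differences give the
    production per month.
    """
    def interp(d):
        for (d0, e0), (d1, e1) in zip(readings, readings[1:]):
            if d0 <= d <= d1:
                frac = (d - d0).days / (d1 - d0).days
                return e0 + frac * (e1 - e0)
        raise ValueError("date outside reading range")

    at_bounds = [interp(d) for d in month_starts]
    return [b - a for a, b in zip(at_bounds, at_bounds[1:])]
```

    For example, two readings two months apart are split across January and February in proportion to the number of days in each month.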

  10. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only focuses on company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using the data mining approach. The AutoMealRecord data were examined to determine if it could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) with dietary preference was assessed with multiple linear regression analyses, and in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord System. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
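The regression and leave-one-out steps described above can be sketched as follows. The data, predictors, and coefficients below are synthetic stand-ins for illustration only, not the study's data; the dietary-pattern PCA step is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Synthetic stand-ins for predictors named in the abstract
# (intercept, gender, age, mean energy intake)
X = np.column_stack([
    np.ones(n),
    rng.integers(0, 2, n).astype(float),
    rng.uniform(20, 60, n),
    rng.normal(2000, 300, n),
])
true_beta = np.array([10.0, 1.5, -0.02, 0.006])
bmi = X @ true_beta + rng.normal(0.0, 1.5, n)

def loo_predict(X, y):
    """Leave-one-out predictions from ordinary least squares:
    refit the model n times, each time holding out one observation."""
    preds = np.empty(len(y))
    idx = np.arange(len(y))
    for i in idx:
        mask = idx != i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ beta
    return preds

preds = loo_predict(X, bmi)
# Classification accuracy for the "would-be obese" cutoff used in the study
acc = np.mean((preds >= 23) == (bmi >= 23))
print(f"LOO accuracy for BMI >= 23: {acc:.1%}")
```

With real data, the predictors would be the PCA-derived dietary-pattern scores plus the demographic variables listed in the abstract.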

  11. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    Bianchini, Ricardo M.; Estevez, Jorge; Vollmer, Alberto E.; Iglicki, Flora A.

    1999-01-01

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  12. Integrated system to automatize information collecting for the primary health care at home.

    Science.gov (United States)

    Oliveira, Edson N; Cainelli, Jean; Pinto, Maria Eugênia B; Cazella, Silvio C; Dahmer, Alessandra

    2013-01-01

    Data collected in a consistent manner is the basis for any decision making. This article presents a system that automates data collection by community-based health workers during their visits to the residences of users of the Brazilian Health Care System (Sistema Único de Saúde - SUS). The automated process will reduce the possibility of mistakes in the transcription of visit information and make information readily available to the Ministry of Health. Furthermore, the analysis of the information provided via this system can be useful in the implementation of health campaigns and in the control of outbreaks of epidemiological diseases.

  13. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    Moraes M.F.D.

    1997-01-01

    We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving other purposes as well, is used to record the circadian activity rhythm pattern of rats over time in an automated, computerized fashion using minimal-cost computer equipment (an IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) when the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as the Fast Fourier Transform (FFT) or cosinor. For method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). Results show a biological validation of the method, since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
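The recording and analysis scheme described above (timestamped touch-release events binned into hourly actogram counts, then FFT-based rhythm detection) can be sketched with synthetic data. The event rates and dark-phase hours below are assumed for illustration, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic touch-release timestamps (seconds) over 3 days; rats are
# nocturnal, so events are denser in the assumed dark phase (19:00-07:00)
events = []
for day in range(3):
    for hour in range(24):
        rate = 20 if (hour >= 19 or hour < 7) else 3   # mean events/hour
        t0 = day * 86400 + hour * 3600
        events.extend(t0 + rng.uniform(0, 3600, rng.poisson(rate)))
events = np.sort(events)

# Actogram: number of detections per hour
counts, _ = np.histogram(events, bins=np.arange(0, 3 * 86400 + 1, 3600))

# FFT of the mean-removed hourly counts: the dominant period should be ~24 h
spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
freqs = np.fft.rfftfreq(len(counts), d=1.0)          # cycles per hour
period_h = 1.0 / freqs[1 + np.argmax(spectrum[1:])]  # skip the DC bin
print(f"Dominant period: {period_h:.1f} h")
```

In the real system the timestamps come from the cage-cover sensor via the PC interface; the analysis step is the same.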

  14. Explosive detection systems data collection final report

    Science.gov (United States)

    2016-10-01

    following tasks over the life of the contract: data collection on a variety of HME from EDS equipment provided by DHS, via the Transportation ... acquisition by DHS, in conjunction with the Transportation Security Administration (TSA). DHS used this contract vehicle as a means of keeping this effort ... GA; Cincinnati, OH Fire Dept ... Provided complete kits for Lee County, FL Sheriff's Dept; Houston, TX Police Dept; Las Vegas Metro Police Dept

  15. Evaluation of carbon emission reductions promoted by private driving restrictions based on automatic fare collection data in Beijing, China.

    Science.gov (United States)

    Zhang, Wandi; Chen, Feng; Wang, Zijia; Huang, Jianling; Wang, Bo

    2017-11-01

    Public transportation automatic fare collection (AFC) systems are able to continuously record large amounts of passenger travel information, providing massive, low-cost data for research on regulations pertaining to public transport. These data can be used not only to analyze characteristics of passengers' trips but also to evaluate transport policies that promote a travel mode shift and emission reduction. In this study, models combining card, survey, and geographic information systems (GIS) data are established with a research focus on the private driving restriction policies being implemented in an ever-increasing number of cities. The study aims to evaluate the impact of these policies on the travel mode shift, as well as relevant carbon emission reductions. The private driving restriction policy implemented in Beijing is taken as an example. The impact of the restriction policy on the travel mode shift from cars to subways is analyzed through a model based on metro AFC data. The routing paths of these passengers are also analyzed based on the GIS method and on survey data, while associated carbon emission reductions are estimated. The analysis method used in this study can provide reference for the application of big data in evaluating transport policies. Motor vehicles have become the most prevalent source of emissions and subsequently air pollution within Chinese cities. The evaluation of the effects of driving restriction policies on the travel mode shift and vehicle emissions will be useful for other cities in the future. Transport big data, playing an important support role in estimating the travel mode shift and emission reduction considered, can help related departments to estimate the effects of traffic jam alleviation and environment improvement before the implementation of these restriction policies and provide a reference for relevant decisions.

  16. Automatic Identification System (AIS) Collection and Reach-back System: System Description

    Science.gov (United States)

    2014-08-20

    to play back previously recorded NMEA data (.rcd files) and parcel the RMC and AIVDM sentences to the com 19 and com 40s ports and IP addresses. 4.1.1 ... Input .rcd file – P2 := ID21 VDM UDP Port #1 – P3 := ID21 VDM IP Address #1 – P4 := ID21 RMC UDP Port #2 – P5 := ID21 RMC IP Address #2 – P6 := RAW ... DEBUG MESSAGE SELECTION – P12 := Platform ID. Usage example: crbsplay pm050714.rcd,2944,127.0.0.1,2944,127.0.0.1,4944,10.250.14.8,5944,10.250.14.8
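The record above deals with recorded NMEA traffic (RMC and AIVDM sentences). As general background, not part of the described system: NMEA 0183 sentences carry a checksum equal to the XOR of every character between the leading '$' (or '!' for AIS AIVDM) and the '*'. A minimal validator might look like:

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """Validate an NMEA 0183 sentence: the XOR of all characters between
    the leading '$' or '!' and the '*' must equal the two hex digits
    that follow the '*'."""
    if not sentence or sentence[0] not in "$!" or "*" not in sentence:
        return False
    body, _, tail = sentence[1:].partition("*")
    checksum = 0
    for ch in body:
        checksum ^= ord(ch)
    return f"{checksum:02X}" == tail[:2].upper()

# Build a sentence with its checksum computed on the spot, then validate it
body = "GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W"
cs = 0
for ch in body:
    cs ^= ord(ch)
sentence = f"${body}*{cs:02X}"
print(nmea_checksum_ok(sentence))  # → True
```

A playback tool like the one described would typically drop or flag sentences whose checksum fails before forwarding them to the UDP ports.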

  17. Automatic data acquisition system in CAMAC for spectrometry

    International Nuclear Information System (INIS)

    Szabo, L.; Szalay, S.; Takacz, P.; Pal, A.

    1981-01-01

    A special memory module with twofold access for spectrometric information is described. Via the direct-access entry, three regimes are realized: plus one, minus one, and address storage. Visual monitoring is carried out by means of an interface containing a buffer memory for the representation of four 256-channel spectra and text on the display. System operation is managed by a microprocessor-based controller

  18. Bluetooth data collection system for planning and arterial management.

    Science.gov (United States)

    2014-08-01

    This report presents the results of a research and development project of an implementable portable wireless traffic data collection system. Utilizing Bluetooth wireless technology as a platform, portable battery powered data collection units housed ...

  19. Automatic data acquisition system of environmental radiation monitor with a personal computer

    International Nuclear Information System (INIS)

    Ohkubo, Tohru; Nakamura, Takashi.

    1984-05-01

    The automatic data acquisition system of an environmental radiation monitor was developed at low cost using a PET personal computer. The count pulses from eight monitors installed at four site boundaries were transmitted to a radiation control room by a signal transmission device and analyzed by the computer, via a 12-channel scaler and a PET-CAMAC interface, for graphic display and printing. (author)

  20. Graphic data output system for the automatic control systems of heavy ion accelerator

    International Nuclear Information System (INIS)

    Angelov, A.Kh.; Inkin, V.D.; Lysyakov, V.N.

    1978-01-01

    The principles and specific solutions underlying the expansion of the automatic control system of a heavy ion accelerator (HIA ACS) are described. The HIA ACS structure is based on the possibility of switching a second TPA/1 computer into the system. The HIA ACS is realized as two coupled, functionally specialized systems: a system for data acquisition and processing (system A) and a system for data display (system B). System B considerably increases the operator's degree of control over the entire ACS. Based on the information obtained, the operator can feed a control action into system A. A large set of peripheral devices allows the operator to perform single measurements of a selected parameter, set new values for the selected parameter, and plot functional dependences such as Z=F(x) and Z=F(x,y). A raster-type alphanumeric memory display, a teletype and a plotter provide visual presentation of the information. An indicator-setting terminal serves simultaneously as an indicator and a control unit

  1. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data, and also allow clinicians to manually edit or invalidate those data. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  2. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.

  3. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    International Nuclear Information System (INIS)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described

  4. Sensor Systems Collect Critical Aerodynamics Data

    Science.gov (United States)

    2010-01-01

    With the support of Small Business Innovation Research (SBIR) contracts with Dryden Flight Research Center, Tao of Systems Integration Inc. developed sensors and other components that will ultimately form a first-of-its-kind, closed-loop system for detecting, measuring, and controlling aerodynamic forces and moments in flight. The Hampton, Virginia-based company commercialized three of the four planned components, which provide sensing solutions for customers such as Boeing, General Electric, and BMW and are used for applications such as improving wind turbine operation and optimizing air flow from air conditioning systems. The completed system may one day enable flexible-wing aircraft with flight capabilities like those of birds.

  5. Ocean Floor Geomagnetic Data Collection System.

    Science.gov (United States)

    1982-12-01

    provide a tab that aids in quick removal ... recovery. ... The coupling system utilized is the AMP Optimate Single Position Fiber Optic ... inch thick PVC plug with two O-rings to aid in pressure sealing (Figure 2.11). The power leads from the batteries are soldered to a pin assembly ... and lowered an additional amount to enable attachment of the upper mast. The buoy is next lowered into the water and tended alongside until ...

  6. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control engineering for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow graphs, electrometers, linearization of systems, state space and the state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the root locus concept and the procedure for drawing a root locus, frequency response, and control system design.

  7. Interactive signal analysis and ultrasonic data collection system user's manual

    Science.gov (United States)

    Smith, G. R.

    1978-01-01

    The interactive signal analysis and ultrasonic data collection system (ECHO1) is a real time data acquisition and display system. ECHO1 executed on a PDP-11/45 computer under the RT11 real time operating system. Extensive operator interaction provided the requisite parameters to the data collection, calculation, and data modules. Data were acquired in real time from a pulse echo ultrasonic system using a Biomation Model 8100 transient recorder. The data consisted of 2084 intensity values representing the amplitude of pulses transmitted and received by the ultrasonic unit.

  8. Mobile In Vivo Infrared Data Collection and Diagnoses Comparison System

    Science.gov (United States)

    Mintz, Frederick W. (Inventor); Moynihan, Philip I. (Inventor); Gunapala, Sarath D. (Inventor)

    2013-01-01

    Described is a mobile in vivo infrared brain scan and analysis system. The system includes a data collection subsystem and a data analysis subsystem. The data collection subsystem is a helmet with a plurality of infrared (IR) thermometer probes. Each of the IR thermometer probes includes an IR photodetector capable of detecting IR radiation generated by evoked potentials within a user's skull. The helmet is formed to collect brain data that is reflective of firing neurons in a mobile subject and transmit the brain data to the data analysis subsystem. The data analysis subsystem is configured to generate and display a three-dimensional image that depicts a location of the firing neurons. The data analysis subsystem is also configured to compare the brain data against a library of brain data to detect an anomaly in the brain data, and notify a user of any detected anomaly in the brain data.

  9. Automatic decision support system based on SAR data for oil spill detection

    Science.gov (United States)

    Mera, David; Cotos, José M.; Varela-Pet, José; Rodríguez, Pablo G.; Caro, Andrés

    2014-11-01

    Global trade is mainly supported by maritime transport, which generates significant pollution problems. Thus, effective surveillance and intervention means are necessary to ensure a proper response to environmental emergencies. Synthetic Aperture Radar (SAR) has been established as a useful tool for detecting hydrocarbon spillages on the ocean surface, and several decision support systems have been based on this technology. This paper presents an automatic oil spill detection system based on SAR data, which was developed on the basis of confirmed spillages and adapted to an important international shipping route off the Galician coast (northwest Iberian Peninsula). The system was supported by an adaptive segmentation process based on wind data as well as a shape-oriented characterization algorithm. Moreover, two classifiers were developed and compared. Image testing revealed up to 95.1% candidate labeling accuracy. Shared-memory parallel programming techniques were used in the development of the algorithms, reducing system processing time by more than 25%.

  10. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    Science.gov (United States)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now

  11. Volunteer-based distributed traffic data collection system

    DEFF Research Database (Denmark)

    Balachandran, Katheepan; Broberg, Jacob Honoré; Revsbech, Kasper

    2010-01-01

    An architecture for a traffic data collection system is proposed, which can collect data without having access to a backbone network. Contrary to other monitoring systems, it relies on volunteers to install a program on their own computers, which will capture incoming and outgoing packets, group them into flows and send the flow data to a central server. Data can be used for studying and characterising internet traffic and for testing traffic models by regenerating real traffic. The architecture is designed to make efficient and light use of resources on both client and server sides. Worst...
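Grouping captured packets into flows, as the architecture above describes, is conventionally done by keying on a canonical 5-tuple so that both directions of a connection map to the same flow. A minimal sketch (the packet field layout is assumed for illustration, not taken from the paper):

```python
from collections import defaultdict

def flow_key(src, dst, sport, dport, proto):
    """Canonical bidirectional 5-tuple: sort the two endpoints so that
    packets in either direction produce the same key."""
    a, b = (src, sport), (dst, dport)
    return (proto,) + (a + b if a <= b else b + a)

# (src IP, dst IP, src port, dst port, protocol, size in bytes)
packets = [
    ("10.0.0.5", "93.184.216.34", 51000, 80, "TCP", 60),
    ("93.184.216.34", "10.0.0.5", 80, 51000, "TCP", 1500),
    ("10.0.0.5", "8.8.8.8", 53123, 53, "UDP", 80),
]

flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
for src, dst, sport, dport, proto, size in packets:
    f = flows[flow_key(src, dst, sport, dport, proto)]
    f["packets"] += 1
    f["bytes"] += size

print(len(flows))  # → 2: the two TCP packets collapse into one flow
```

A volunteer client would periodically flush such per-flow counters to the central server, which is far lighter than shipping raw packets.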

  12. To the problem of topological optimization of data processing and transmission networks in development of the automatic control system ''Atom''

    International Nuclear Information System (INIS)

    Gal'berg, V.P.

    1981-01-01

    Some optimization problems arising in the development of the automatic control system (ACS) of a commercial amalgamation (ACS-ATOM) are considered, in particular assessments of the economically optimal structure for locating computation centres and data transmission facilities [ru]

  13. OConGraX - Automatically Generating Data-Flow Test Cases for Fault-Tolerant Systems

    Science.gov (United States)

    Nunes, Paulo R. F.; Hanazumi, Simone; de Melo, Ana C. V.

    As systems become more complex to develop and manage, software design faults increase, making fault-tolerant systems highly required. To ensure their quality, the normal and exceptional behaviors must be tested and/or verified. Software testing is still a difficult and costly software development task, and a reasonable amount of effort has been employed to develop techniques for testing programs' normal behaviors. For the exceptional behavior, however, there is a lack of techniques and tools to test it effectively. To help in testing and analyzing fault-tolerant systems, we present in this paper a tool that provides automatic generation of data-flow test cases for objects and exception-handling mechanisms of Java programs, together with data/control-flow graphs for program analysis.

  14. An airborne meteorological data collection system using satellite relay /ASDAR/

    Science.gov (United States)

    Bagwell, J. W.; Lindow, B. G.

    1978-01-01

    The paper describes the aircraft to satellite data relay (ASDAR) project which processes information collected by the navigation and data systems of widebody jet aircraft which cross data-sparse areas of the tropics and southern hemisphere. The ASDAR system consists of a data acquisition and control unit to acquire, store, and format latitude, longitude, altitude, wind speed, wind direction, and outside air temperature data; a transmitter to relay the formatted data via satellite to the ground; and a clock to time the data sampling and transmission periods.

  15. Automatic Detection and Recognition of Pig Wasting Diseases Using Sound Data in Audio Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Yongwha Chung

    2013-09-01

    Automatic detection of pig wasting diseases is an important issue in the management of group-housed pigs. Further, respiratory diseases are one of the main causes of mortality among pigs and of loss of productivity in intensive pig farming. In this study, we propose an efficient data mining solution for the detection and recognition of pig wasting diseases using sound data in audio surveillance systems. In this method, we extract the Mel Frequency Cepstrum Coefficients (MFCC) from sound data with an automatic pig sound acquisition process, and use a hierarchical two-level structure: the Support Vector Data Description (SVDD) and the Sparse Representation Classifier (SRC) as an early anomaly detector and a respiratory disease classifier, respectively. Our experimental results show that this new method can be used to detect pig wasting diseases both economically (even a cheap microphone can be used) and accurately (94% detection and 91% classification accuracy), either as a standalone solution or to complement known methods to obtain a more accurate solution.
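The hierarchical two-level structure described above can be sketched schematically. Note the stand-ins: a simple radius-around-centroid rule replaces the SVDD anomaly detector, a nearest-centroid rule replaces the SRC classifier, and the 2-D features are synthetic rather than real MFCCs:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 2-D stand-ins for MFCC feature vectors
normal = rng.normal(0.0, 0.5, (200, 2))            # normal pig sounds
cough  = rng.normal([3.0, 0.0], 0.4, (50, 2))      # disease sound class A
sneeze = rng.normal([0.0, 3.0], 0.4, (50, 2))      # disease sound class B

# Level 1 - early anomaly detector (radius-around-centroid stand-in for SVDD):
# anything far from the cloud of normal sounds is flagged
center = normal.mean(axis=0)
radius = np.quantile(np.linalg.norm(normal - center, axis=1), 0.99)

def is_anomalous(x):
    return np.linalg.norm(x - center) > radius

# Level 2 - disease classifier (nearest class centroid stand-in for SRC)
classes = {"cough": cough.mean(axis=0), "sneeze": sneeze.mean(axis=0)}

def classify(x):
    return min(classes, key=lambda c: np.linalg.norm(x - classes[c]))

sample = np.array([2.9, 0.1])
if is_anomalous(sample):                 # only anomalous sounds reach level 2
    print("anomaly ->", classify(sample))
```

The two-level split mirrors the paper's design choice: a cheap detector screens every sound, and the costlier classifier runs only on flagged anomalies.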

  16. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  17. Implementing Provenance Collection in a Legacy Data Product Generation System

    Science.gov (United States)

    Conover, H.; Ramachandran, R.; Kulkarni, A.; Beaumont, B.; McEniry, M.; Graves, S. J.; Goodman, H.

    2012-12-01

    NASA has been collecting, storing, archiving and distributing vast amounts of Earth science data derived from satellite observations for several decades now. The raw data collected from the different sensors undergoes many different transformations before it is distributed to the science community as climate-research-quality data products. These data transformations include calibration, geolocation, and conversion of the instrument counts into meaningful geophysical parameters, and may include reprojection and/or spatial and temporal averaging as well. In the case of many Earth science data systems, the science algorithms and any ancillary data files used for these transformations are delivered as a "black box" to be integrated into the data system's processing framework. In contrast to an experimental workflow that may vary with each iteration, such systems use consistent, well-engineered processes to apply the same science algorithm to each well-defined set of inputs in order to create standard data products. Even so, variability is inevitably introduced. There may be changes made to the algorithms, different ancillary datasets may be used, underlying hardware and software may get upgraded, etc. Furthermore, late-arriving input data, operator error, or other processing anomalies may necessitate regeneration and replacement of a particular set of data files and any downstream products. These variations need to be captured, documented and made accessible to the scientific community so they can be properly accounted for in analyses. This presentation describes an approach to provenance capture, storage and dissemination implemented at the NASA Science Investigator-led Processing System (SIPS) for the AMSR-E (Advanced Microwave Scanning Radiometer - Earth Observing System) instrument. Key considerations in adding provenance capabilities to this legacy data system include: (1) granularity of provenance information captured, (2) additional context information needed

  18. A speech recognition system for data collection in precision agriculture

    Science.gov (United States)

    Dux, David Lee

    Agricultural producers have shown interest in collecting detailed, accurate, and meaningful field data through field scouting, but scouting is labor intensive. They use yield monitor attachments to collect weed and other field data while driving equipment. However, distractions from using a keyboard or buttons while driving can lead to driving errors or missed data points. At Purdue University, researchers have developed an ASR system to allow equipment operators to collect georeferenced data while keeping hands and eyes on the machine during harvesting and to ease georeferencing of data collected during scouting. A notebook computer retrieved locations from a GPS unit and displayed and stored data in Excel. A headset microphone with a single earphone collected spoken input while allowing the operator to hear outside sounds. One-, two-, or three-word commands activated appropriate VBA macros. Four speech recognition products were chosen based on hardware requirements and ability to add new terms. After training, speech recognition accuracy was 100% for Kurzweil VoicePlus and Verbex Listen for the 132 vocabulary words tested, during tests walking outdoors or driving an ATV. Scouting tests were performed by carrying the system in a backpack while walking in soybean fields. The system recorded a point or a series of points with each utterance. Boundaries of points showed problem areas in the field and single points marked rocks and field corners. Data were displayed as an Excel chart to show a real-time map as data were collected. The information was later displayed in a GIS over remote sensed field images. Field corners and areas of poor stand matched, with voice data explaining anomalies in the image. The system was tested during soybean harvest by using voice to locate weed patches. A harvester operator with little computer experience marked points by voice when the harvester entered and exited weed patches or areas with poor crop stand. The operator found the

  19. System for Anonymous Data Collection Based on Group Signature Scheme

    Directory of Open Access Journals (Sweden)

    David Troják

    2016-01-01

    This paper deals with anonymous data collection in the Internet of Things (IoT). The privacy and anonymity of the data source are important for many IoT applications, such as in agriculture, health, and automotive. The proposed data-collection system provides anonymity for the data sources by applying a cooperative group scheme. The group scheme also provides low power consumption. The system is built upon the Tor (The Onion Router) anonymous network, which is a part of the Internet darknet. The proposed system was designed for Android devices on the client side and for a Java environment on the server side. We evaluated the anonymous data collection in a real-use scenario covering selected data acquisition (e.g. signal strength) from smartphones triggered by a change in their geographical location. The results show that the proposed system provides sufficient data source anonymity, effective revocation, low computational cost and low overhead.

  20. Influence of vinasse on water movement in soil, using an automatic data acquisition and handling system

    International Nuclear Information System (INIS)

    Nascimento Filho, V.F. do; Barros Ferraz, E.S. de

    1986-01-01

    Vinasse, a by-product of the ethyl alcohol industry from the yeast fermentation of sugar cane juice or molasses, has been incorporated into soil as a fertilizer because of its high contents of organic matter (2-6%), potassium and sulphate (0.1-0.5%) and other nutrients. The influence of vinasse on water movement in the soil was studied by employing a monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi). For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode and coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author) [pt
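The measurement principle in this record is Beer-Lambert attenuation of the 241Am gamma beam through the soil column. A minimal sketch of how volumetric water content could be recovered from the measured counts; all coefficient and count values are illustrative, not from the paper:

```python
import math

# Beer-Lambert attenuation through a soil column of thickness x (cm):
#   I = I0 * exp(-(mu_s*rho_s + mu_w*theta) * x)
# where mu_s, mu_w are mass attenuation coefficients (cm^2/g) of dry soil
# and water, rho_s is dry bulk density (g/cm^3) and theta is volumetric
# water content. Solving for theta:

def water_content(I, I0, x, mu_s, rho_s, mu_w):
    """Volumetric water content from measured (I) and incident (I0) counts."""
    total_attenuation = math.log(I0 / I) / x   # = mu_s*rho_s + mu_w*theta
    return (total_attenuation - mu_s * rho_s) / mu_w

# Illustrative 59.5 keV values (not from the paper):
theta = water_content(I=8000, I0=100000, x=6.0,
                      mu_s=0.28, rho_s=1.4, mu_w=0.20)
```

Repeating this measurement over time at fixed depths yields the wetting-front velocity the study compares with and without vinasse.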

  1. Prototyping a sensor network system for automatic vital signs collection. Evaluation of a location-based automated assignment of measured vital signs to patients.

    Science.gov (United States)

    Kuroda, T; Noma, H; Naito, C; Tada, M; Yamanaka, H; Takemura, T; Nin, K; Yoshihara, H

    2013-01-01

    To develop a clinical sensor network system that automatically collects vital signs and their supplemental data, and to evaluate the effect of automatic assignment of vital sensor values to patients based on the locations of the sensors. The sensor network estimates the data source, i.e. the target patient, from the position of a vital sign sensor obtained from a newly developed proximity sensing system. The proximity sensing system estimates the positions of the devices using a Bluetooth inquiry process. Using Bluetooth access points and the positioning system newly developed in this project, the sensor network collects vital signs and their 4W (who, where, what, and when) supplemental data from any Bluetooth-ready vital sign sensors, such as Continua-ready devices. The prototype was evaluated in a pseudo-clinical setting at Kyoto University Hospital using a cyclic paired comparison and statistical analysis. The result of the cyclic paired analysis shows that the subjects evaluated the proposed system as more effective and safer than POCS as well as paper-based operation. It halves the time for vital signs input and eliminates input errors. On the other hand, the prototype failed in its position estimation in 12.6% of all attempts, and the nurses overlooked half of the errors. A detailed investigation makes clear that an advanced interface showing the system's "confidence", i.e. the probability of estimation error, would be effective in reducing the oversights. This paper proposed a clinical sensor network system that relieves nurses of vital signs input tasks. The result clearly shows that the proposed system increases the efficiency and safety of the nursing process both subjectively and objectively. It is a step toward a new generation of point-of-nursing-care systems where sensors take over the tasks of data input from the nurses.

  2. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.

    1994-01-01

    This paper presents an original approach to solving an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques to extract the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from the underlying physics (and adapted to the image segmentation problem). This data fusion is widely used at different stages of the segmentation process. The approach yields interesting results in terms of segmentation performance, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  3. Data collection

    NARCIS (Netherlands)

    Callaert, J.; Epping, Elisabeth; Federkeil, G.; Jongbloed, Benjamin W.A.; Kaiser, Franciscus; Tijssen, R.; van Vught, F.A.; Ziegele, F.

    2012-01-01

    This chapter describes the data collection instruments used in the development of U-Multirank. The first section is an overview of existing databases – mainly on bibliometrics and patents. The second describes the questionnaires and survey tools used for collecting data from the institutions – at

  4. MOLGENIS/connect: a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks

    Science.gov (United States)

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; van der Velde, K. Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans

    2016-01-01

    Motivation: While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. Results: To address this challenge, we developed MOLGENIS/connect, a semi-automatic system to find, match and pool data from different sources. The system shortlists relevant source attributes from thousands of candidates using ontology-based query expansion to overcome variations in terminology. Then it generates algorithms that transform source attributes to a common target DataSchema. These include unit conversion, categorical value matching and complex conversion patterns (e.g. calculation of BMI). In comparison to human experts, MOLGENIS/connect was able to auto-generate 27% of the algorithms perfectly, with an additional 46% needing only minor editing, representing a reduction in the human effort and expertise needed to pool data. Availability and Implementation: Source code, binaries and documentation are available as open-source under LGPLv3 from http://github.com/molgenis/molgenis and www.molgenis.org/connect. Contact: m.a.swertz@rug.nl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153686
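The three kinds of generated transformations the abstract names (unit conversion, categorical value matching, and complex patterns such as BMI) can be sketched as plain functions. Attribute names, the pound-to-kilogram mapping and the smoking codes below are invented for illustration, not taken from MOLGENIS/connect:

```python
# Sketch of the transformation algorithms such a system generates when
# mapping source attributes onto a common target DataSchema.

def kg_from_pounds(weight_lb):
    return weight_lb * 0.45359237            # unit conversion

def bmi(weight_kg, height_cm):
    h = height_cm / 100.0                    # complex conversion pattern
    return weight_kg / (h * h)

SMOKING_MAP = {"never": 0, "former": 1, "current": 2}  # categorical matching

def map_smoking(source_value):
    return SMOKING_MAP[source_value.strip().lower()]

# a hypothetical source record mapped into the target schema
record = {"weight_lb": 176.0, "height_cm": 180.0, "smoking": "Never"}
target = {
    "weight_kg": kg_from_pounds(record["weight_lb"]),
    "bmi": bmi(kg_from_pounds(record["weight_lb"]), record["height_cm"]),
    "smoking_code": map_smoking(record["smoking"]),
}
```

In the real system such functions are auto-generated and then reviewed; the 27%/46% figures quoted above measure how often that generation needs no or only minor editing.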

  5. The effect of a low-speed automatic brake system estimated from real life data.

    Science.gov (United States)

    Isaksson-Hellman, Irene; Lindman, Magdalena

    2012-01-01

    A substantial part of all traffic accidents involving passenger cars are rear-end collisions, and most of them occur at low speed. Auto Brake is a feature that has been launched in several passenger car models during the last few years. City Safety is a technology designed to help the driver mitigate, and in certain situations avoid, rear-end collisions at low speed by automatically braking the vehicle. Studies have been presented that predict promising benefits from these kinds of systems, but few attempts have been made to show the actual effect of Auto Brake. In this study, the effect of City Safety, a standard feature on the Volvo XC60 model, is calculated based on insurance claims data from cars in real traffic crashes in Sweden. The estimated claim frequency of rear-end frontal collisions, measured in claims per 1,000 insured vehicle years, was 23% lower for the City Safety-equipped XC60 model than for other Volvo models without the system.
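The exposure-based comparison described here is straightforward to illustrate. The claim counts and insured-vehicle-year figures below are invented; only the 23% reduction is from the study:

```python
# Claim frequency measured in claims per 1,000 insured vehicle years,
# and the relative reduction between two vehicle populations.

def claims_per_1000_vehicle_years(claims, insured_vehicle_years):
    return 1000.0 * claims / insured_vehicle_years

rate_xc60  = claims_per_1000_vehicle_years(77, 10000)    # hypothetical counts
rate_other = claims_per_1000_vehicle_years(100, 10000)   # hypothetical counts

reduction = 1.0 - rate_xc60 / rate_other   # relative reduction in claim rate
```

Normalizing by insured vehicle years rather than raw claim counts is what makes the two fleets comparable despite different fleet sizes and observation periods.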

  7. Parallel Plate System for Collecting Data Used to Determine Viscosity

    Science.gov (United States)

    Kaukler, William (Inventor); Ethridge, Edwin C. (Inventor)

    2013-01-01

    A parallel-plate system collects data used to determine viscosity. A first plate is coupled to a translator so that the first plate can be moved along a first direction. A second plate has a pendulum device coupled thereto such that the second plate is suspended above and parallel to the first plate. The pendulum device constrains movement of the second plate to a second direction that is aligned with the first direction and is substantially parallel thereto. A force measuring device is coupled to the second plate for measuring force along the second direction caused by movement of the second plate.
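The geometry in this record is the classic plane-Couette configuration, where the drag force on the suspended plate relates to viscosity via Newton's law of viscosity, F = η·A·v/d. A minimal sketch with illustrative values (not from the patent):

```python
# Dynamic viscosity from the force measured on the suspended plate:
#   F = eta * A * v / d  =>  eta = F * d / (A * v)
# A: wetted plate area (m^2), v: relative plate speed (m/s),
# d: fluid gap between the plates (m).

def viscosity(force_n, area_m2, speed_m_s, gap_m):
    """Dynamic viscosity (Pa*s) inferred from the measured drag force."""
    return force_n * gap_m / (area_m2 * speed_m_s)

# Illustrative measurement: 0.05 N drag on a 0.01 m^2 plate moving at
# 0.1 m/s over a 2 mm fluid gap.
eta = viscosity(force_n=0.05, area_m2=0.01, speed_m_s=0.1, gap_m=0.002)
```

This assumes a Newtonian fluid and a linear velocity profile across the gap; the pendulum suspension in the patent serves to constrain the force measurement to the shear direction.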

  8. Testing ethernet networks for the ATLAS data collection system

    CERN Document Server

    Barnes, F R M; Dobinson, Robert W; Le Vine, M J; Martin, B; Lokier, J; Meirosu, C

    2001-01-01

    This paper reports recent work on Ethernet traffic generation and analysis. We use Gigabit Ethernet NICs running customized embedded software and custom-built 32-port Fast Ethernet boards based on FPGAs to study the behavior of large Ethernet networks. The traffic generation software is able to accommodate many traffic distributions with the ultimate goal of generating traffic that resembles the data collection system of the ATLAS experiment at CERN. A fraction of the 1600 ATLAS detector readout buffers and 600 Level 2 trigger CPUs are emulated in this study with a combination of the Fast Ethernet boards and the Gigabit Ethernet NICs. Each packet is time stamped with a global clock value and therefore we are able to compute an accurate measure of the network latency. Various other information collected from the boards is displayed in real time on a graphical interface.

  9. A new standardized data collection system for interdisciplinary thyroid cancer management: Thyroid COBRA.

    Science.gov (United States)

    Tagliaferri, Luca; Gobitti, Carlo; Colloca, Giuseppe Ferdinando; Boldrini, Luca; Farina, Eleonora; Furlan, Carlo; Paiar, Fabiola; Vianello, Federica; Basso, Michela; Cerizza, Lorenzo; Monari, Fabio; Simontacchi, Gabriele; Gambacorta, Maria Antonietta; Lenkowicz, Jacopo; Dinapoli, Nicola; Lanzotti, Vito; Mazzarotto, Renzo; Russi, Elvio; Mangoni, Monica

    2018-02-21

    The big data approach offers a powerful alternative to evidence-based medicine. This approach could guide cancer management thanks to the application of machine learning to large-scale data. The aim of the Thyroid COBRA (Consortium for Brachytherapy Data Analysis) project is to develop a standardized web data collection system focused on thyroid cancer. The Metabolic Radiotherapy Working Group of the Italian Association of Radiation Oncology (AIRO) endorsed the implementation of a consortium directed at thyroid cancer management and data collection. The agreement conditions, the ontology of the collected data and the related software services were defined by a multicentre ad hoc working group (WG). Six Italian cancer centres first started the project and defined and signed the Thyroid COBRA consortium agreement. Three data set tiers were identified: Registry, Procedures and Research. The COBRA Storage System (C-SS) appeared not to be time-consuming and to respect privacy, as data can be extracted directly from each centre's storage platform through a secured connection that ensures reliable encryption of sensitive data. Automatic data archiving can be performed directly from the hospital image storage system or the radiotherapy treatment planning systems. The C-SS architecture will allow "cloud storage" or "distributed learning" approaches for predictive model definition and the further development of clinical decision support tools. The development of the Thyroid COBRA data storage system C-SS through a multicentre consortium approach appeared to be a feasible tool in the setup of a complex and privacy-preserving data sharing system oriented to the management of thyroid cancer and, in the near future, every cancer type. Copyright © 2018. Published by Elsevier B.V.

  10. Automatic collection of bovine blood samples | Hale | South African ...

    African Journals Online (AJOL)

    A technique is described which allows automatic collection of jugular venous blood from tethered cows. In this system, blood is pumped continuously from an intravenous cannula which has a double lumen while an anticoagulant is pumped through the second opening. Diluted blood is collected in a fraction collector which ...

  11. Collection and evaluation of salt mixing data with the real time data acquisition system. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Glazer, S.; Chiu, C.; Todreas, N.E.

    1977-09-01

    A minicomputer-based real time data acquisition system was designed and built to facilitate data collection during salt mixing tests in mock-ups of LMFBR rod bundles. The system represents an expansion of data collection capabilities over previous equipment. It performs steady state and transient monitoring and recording of up to 512 individual electrical resistance probes. Extensive real time software was written to govern all phases of the data collection procedure, including probe definition, probe calibration, salt mixing test data acquisition and storage, and data editing. Offline software was also written to permit data examination and reduction to dimensionless salt concentration maps. Finally, the computer program SUPERENERGY was modified to permit rapid extraction of parameters from dimensionless salt concentration maps. The document describes the computer system and includes circuit diagrams of all custom-built components. It also includes descriptions and listings of all software written, as well as extensive user instructions.

  12. Automatic reactor protection system tester

    International Nuclear Information System (INIS)

    Deliant, J.D.; Jahnke, S.; Raimondo, E.

    1988-01-01

    The object of this paper is to present the automatic tester of reactor protection systems designed and developed by EDF and Framatome. The following points are discussed in order: . The necessity for reactor protection system testing, . The drawbacks of manual testing, . The description and use of the Framatome automatic tester, . On-site installation of this system, . The positive results obtained using the Framatome automatic tester in France

  13. Maritime over the Horizon Sensor Integration: High Frequency Surface-Wave-Radar and Automatic Identification System Data Integration Algorithm.

    Science.gov (United States)

    Nikolic, Dejan; Stojkovic, Nikola; Lekic, Nikola

    2018-04-09

    Obtaining the complete operational picture of the maritime situation in the Exclusive Economic Zone (EEZ), which lies over the horizon (OTH), requires the integration of data obtained from various sensors. These sensors include: high frequency surface-wave-radar (HFSWR), satellite automatic identification system (SAIS) and land automatic identification system (LAIS). The algorithm proposed in this paper utilizes radar tracks obtained from the network of HFSWRs, which have already been processed by a multi-target tracking algorithm, and associates SAIS and LAIS data with the corresponding radar tracks, thus forming an integrated data pair. During the integration process, all HFSWR targets in the vicinity of an AIS report are evaluated, and the one with the highest matching factor is used for data association. On the other hand, if there are multiple AIS reports in the vicinity of a single HFSWR track, the algorithm still makes only one data pair, consisting of the AIS and HFSWR data with the highest mutual matching factor. During the design and testing, special attention was given to the latency of AIS data, which can be very high in the EEZs of developing countries. The algorithm was designed, implemented and tested in a real working environment. The testing environment is located in the Gulf of Guinea and includes a network of two HFSWRs, several coastal sites with LAIS receivers and SAIS data provided by a provider of SAIS data.
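The one-to-one, highest-matching-factor association described above can be sketched as a greedy best-first pairing. The matching factor here is a simple inverse-distance score on invented coordinates; the actual algorithm also weighs course, speed and AIS latency:

```python
import math

def matching_factor(track, ais, max_range=5000.0):
    """Toy matching factor: 1 at zero separation, 0 beyond max_range (m)."""
    d = math.hypot(track["x"] - ais["x"], track["y"] - ais["y"])
    return max(0.0, 1.0 - d / max_range)

def associate(tracks, ais_reports):
    """Greedily pair each AIS report with at most one radar track,
    taking candidate pairs in order of decreasing matching factor."""
    candidates = sorted(
        ((matching_factor(t, a), ti, ai)
         for ti, t in enumerate(tracks)
         for ai, a in enumerate(ais_reports)),
        reverse=True)
    pairs, used_tracks, used_ais = [], set(), set()
    for score, ti, ai in candidates:
        if score > 0 and ti not in used_tracks and ai not in used_ais:
            pairs.append((ti, ai, score))
            used_tracks.add(ti)
            used_ais.add(ai)
    return pairs

# two radar tracks, one AIS report: only the nearer track gets paired
tracks = [{"x": 0.0, "y": 0.0}, {"x": 9000.0, "y": 0.0}]
ais_reports = [{"x": 150.0, "y": 0.0}]
pairs = associate(tracks, ais_reports)
```

Sorting all candidate pairs before committing any of them is what enforces the "highest mutual matching factor" rule when several AIS reports compete for the same track.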

  14. Computerized techniques for collecting the radioprotection data

    International Nuclear Information System (INIS)

    Cenusa, V.; Valeca, S.; Guta, C.; Talpalariu, C.; Stoica, V.

    2016-01-01

    An important component of a computerized radioprotection system is the module for the collection of radioprotection data. The data collection can be made automatically from the measurement equipment or manually by operators after they read the values measured by mobile devices. Database systems are used for storing the data; they offer higher performance and more efficient data organization, and they ensure data integrity and controlled access to the data in a multiuser environment. The experimental program for the automatic collection of remote data periodically transfers, at programmable time intervals, data files from the fixed radiation monitoring stations to a centralized radioprotection data system. For this, the File Transfer Protocol (FTP) is used. Radiation monitoring equipment designed and assembled in the Electronics Department of ICN Pitesti was used as a data source for testing the experimental programs. (authors)
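The periodic FTP transfer described in this record can be sketched with the standard library. Host, credentials, directory names and the polling interval are placeholders; the record does not specify how files are named or archived:

```python
import ftplib
import time

def collect_once(host, user, password, remote_dir, local_dir):
    """Download every file currently in the station's remote directory."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            with open(f"{local_dir}/{name}", "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)

def collect_forever(interval_s=600, **kwargs):
    """Poll the monitoring station at a programmable time interval."""
    while True:
        collect_once(**kwargs)
        time.sleep(interval_s)
```

In a production radioprotection system the loop would also handle connection failures and avoid re-downloading files already archived; those details are omitted here.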

  15. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves which become linear through the use of the logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach was shown to be useful for the automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
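The linearization in Rodbard's approach takes the bound fraction y = B/B0 and plots logit(y) = ln(y/(1-y)) against log(dose), which is approximately a straight line; sample doses are then interpolated from that line. A minimal sketch with an invented insulin standard curve:

```python
import math

def logit(y):
    return math.log(y / (1.0 - y))

# invented standard curve: dose -> bound fraction B/B0
standards = [(2.0, 0.85), (10.0, 0.60), (50.0, 0.30), (250.0, 0.10)]

xs = [math.log(d) for d, y in standards]
ys = [logit(y) for d, y in standards]

# least-squares fit of logit(y) = a + b*log(dose)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     / sum((x - mx) ** 2 for x in xs))
a = my - b * mx

def dose_from_bound(y):
    """Interpolate an unknown sample's dose from its bound fraction."""
    return math.exp((logit(y) - a) / b)
```

The negative slope b reflects that the bound fraction falls as dose rises; the automatic computation described in the record amounts to applying `dose_from_bound` to each counted sample.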

  16. Designing a Method for AN Automatic Earthquake Intensities Calculation System Based on Data Mining and On-Line Polls

    Science.gov (United States)

    Liendo Sanchez, A. K.; Rojas, R.

    2013-05-01

    Seismic intensities can be calculated using the Modified Mercalli Intensity (MMI) scale or the European Macroseismic Scale (EMS-98), among others, which are based on a series of qualitative aspects related to a group of subjective factors that describe human perception, effects on nature or objects, and structural damage due to the occurrence of an earthquake. On-line polls allow experts to get an overview of the consequences of an earthquake without going to the affected locations. However, this can be hard work if the polls are not properly automated. Taking into account that the answers given to these polls are subjective and that a number of them have already been classified for some past earthquakes, it is possible to use data mining techniques to automate this process and to obtain preliminary results based on the on-line polls. To achieve this goal, a predictive model has been used, consisting of a classifier based on a supervised learning technique (a decision tree algorithm) and a group of polls based on the MMI and EMS-98 scales. The tree summarizes the most important questions of the poll and recursively divides the instance space corresponding to each question (nodes), with each node splitting the space depending on the possible answers. The implementation was done with Weka, a collection of machine learning algorithms for data mining tasks, using the J48 algorithm, an implementation of the C4.5 algorithm for decision tree models. By doing this, it was possible to obtain a preliminary model able to identify up to 4 different seismic intensities, with 73% of polls correctly classified. The error obtained is rather high; therefore, we will update the on-line poll to improve the results, based on just one scale, for instance the MMI. Besides, the integration of the automatic seismic intensity methodology, with a low error probability, and a basic georeferencing system will allow the generation of preliminary isoseismal maps.
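A J48/C4.5 tree of the kind described routes a poll's categorical answers through question nodes to an intensity class. A toy, hand-written stand-in for such a learned tree (the questions, answer values and thresholds are invented; the study's actual tree was induced in Weka from classified polls):

```python
# Toy decision tree over macroseismic poll answers. Each question is a
# node; each answer selects a branch; leaves are intensity classes.

def classify(poll):
    if poll["felt_shaking"] == "no":
        return "II"
    if poll["objects_fell"] == "no":
        return "III" if poll["people_awakened"] == "few" else "IV"
    return "V" if poll["structural_damage"] == "none" else "VI"

polls = [
    {"felt_shaking": "no", "objects_fell": "no",
     "people_awakened": "few", "structural_damage": "none"},
    {"felt_shaking": "yes", "objects_fell": "yes",
     "people_awakened": "many", "structural_damage": "cracks"},
]
labels = [classify(p) for p in polls]   # -> ["II", "VI"]
```

C4.5 chooses each question by information gain over the already-classified polls rather than by hand, but the resulting classifier has exactly this nested-branch shape.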

  17. Definition of an automatic information retrieval system independent from the data base used

    International Nuclear Information System (INIS)

    Cunha, E.R.

    1983-04-01

    A bibliographic information retrieval system using data stored at the standardized interchange format ISO 2709 or ANSI Z39.2, is specified. A set of comands for interchange format manipulation wich allows the data access at the logical level, achieving the data independence, are used. A data base description language, a storage structure and data base manipulation comands are specified, using retrieval techniques which consider the applications needs. (Author) [pt

  18. Revealing the Linkage Network Dynamic Structures of Chinese Maritime Ports through Automatic Information System Data

    Directory of Open Access Journals (Sweden)

    Hongchu Yu

    2017-10-01

    Full Text Available Marine economic cooperation has emerged as a major theme in this era of globalization; hence, maritime network connectivity and dynamics have attracted more and more attention. Port construction and maritime route improvements increase maritime trade and thus facilitate economic viability and resource sustainability. This paper reveals the regional dimension of the inter-port linkage dynamic structure of Chinese maritime ports from a complex multilayer perspective, which is meaningful for strategic forecasting and regional long-term economic development planning. In this research, Automatic Information System (AIS)-derived traffic flows were used to construct a maritime network and subnetworks based on the geographical locations of ports. The linkage intensity between subnetworks, the linkage tightness within subnetworks, the spatial isolation between high-intensity backbones and tight skeleton networks, and a linkage concentration index for each port were calculated. The ports, in turn, were analyzed based on these network attributes. This study analyzed the external competitiveness and internal cohesion of each subnetwork. The results revealed problems in port management and planning, such as unclear divisions in port operations. More critically, the weak complementary relationships between the backbone and skeleton networks among the ports reduce connectivity and must be strengthened. This research contributes to the body of work supporting strategic decision-making for future development.
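Two of the measures named above can be sketched from a small flow table: the linkage intensity between two geographic subnetworks, and a per-port linkage concentration index (here defined as the share of a port's total traffic on its single strongest link). The flows, port names and regional grouping below are invented for illustration:

```python
from collections import defaultdict

# directed flows (from_port, to_port) -> AIS-derived voyage counts (invented)
flows = {("Shanghai", "Ningbo"): 120, ("Ningbo", "Shanghai"): 110,
         ("Shanghai", "Qingdao"): 60, ("Qingdao", "Dalian"): 40,
         ("Dalian", "Shanghai"): 30}

region = {"Shanghai": "Yangtze", "Ningbo": "Yangtze",
          "Qingdao": "Bohai", "Dalian": "Bohai"}

def linkage_intensity(a, b):
    """Total flow between subnetworks a and b, both directions."""
    return sum(w for (s, t), w in flows.items()
               if {region[s], region[t]} == {a, b})

def concentration_index(port):
    """Share of a port's total traffic carried by its strongest link."""
    link = defaultdict(int)
    for (s, t), w in flows.items():
        if port in (s, t):
            other = t if s == port else s
            link[other] += w
    total = sum(link.values())
    return max(link.values()) / total
```

A port with a concentration index near 1 depends on a single partner, while a low index indicates diversified connectivity; the paper's actual index definition may differ in detail.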

  19. Wireless data collection system for real-time arterial travel time estimates.

    Science.gov (United States)

    2011-03-01

    This project pursued several objectives conducive to the implementation and testing of a Bluetooth (BT) based system to collect travel time data, including the deployment of a BT-based travel time data collection system to perform comprehensive testi...

  20. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false NOAA Data Collection Systems Use... POLICIES AND PROCEDURES CONCERNING USE OF THE NOAA SPACE-BASED DATA COLLECTION SYSTEMS § 911.5 NOAA Data Collection Systems Use Agreements. (a)(1) In order to use a NOAA DCS, each user must have an agreement with...

  1. Fully automatic CNC machining production system

    Directory of Open Access Journals (Sweden)

    Lee Jeng-Dao

    2017-01-01

    Full Text Available Customized manufacturing has been increasing year by year, and changing consumption habits have shortened product life cycles. Therefore, many countries view Industry 4.0 as a target for achieving more efficient and more flexible automated production. Developing an automatic loading and unloading CNC machining system with vision inspection is the first step in industrial upgrading. In this study, a CNC controller is adopted as the main controller to command the robot, conveyor, and other equipment. Moreover, machine vision systems are used to detect the position of the material on the conveyor and the edge of the machining material. In addition, Open CNC and SCADA software are utilized to provide real-time monitoring, remote control, alarm email notification, and parameter collection. Furthermore, RFID has been added for employee classification and management. Machine handshaking has been successfully implemented to achieve automatic vision detection, edge-tracing measurement, machining, and collection of system parameters for data analysis, accomplishing industrial automation system integration with real-time monitoring.

  2. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  6. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  7. The Italian information system on zoonoses data collection.

    Science.gov (United States)

    Colangeli, P; Iannetti, S; Ruocco, L; Forlizzi, L; Cioci, D; Calistri, P

    2013-03-01

    In the framework of the international obligations subscribed by the Italian government, the Italian Ministry of Health should provide the European Union (EU) (European Commission, European Food Safety Authority - EFSA) with a set of data and information related to the report and the spread of zoonoses and to the activities put in place for monitoring and control of zoonoses. In 2008, the Italian Ministry of Health commissioned the Istituto G. Caporale (ICT) to implement an information system able to provide information and data on the monitoring and control of zoonoses in the national territory, in accordance with the national and community legislation. The system is part of the e-Government process that involves all public administrations of the EU and refers to the use of information and communication technologies for the digital processing of documents in order to obtain simplification and interoperability of administrative procedures through the Internet, as defined in the strategic lines published by the National Centre for Information Systems in Public Administration (DigitPA) in 2009-2011. © 2012 Blackwell Verlag GmbH.

  8. 15 CFR 911.4 - Use of the NOAA Data Collection Systems.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Use of the NOAA Data Collection... POLICIES AND PROCEDURES CONCERNING USE OF THE NOAA SPACE-BASED DATA COLLECTION SYSTEMS § 911.4 Use of the NOAA Data Collection Systems. (a) Use of the NOAA DCS will only be authorized in accordance with the...

  9. 77 FR 56212 - Federal Acquisition Regulation; Information Collection; Use of Data Universal Numbering System...

    Science.gov (United States)

    2012-09-12

    ... Regulation; Information Collection; Use of Data Universal Numbering System (DUNS) as Primary Contractor... November 13, 2012. ADDRESSES: Submit comments identified by Information Collection 9000- 0145, Use of data... ``Information Collection 9000-0145, Use of Data Universal Numbering System (DUNS) as Primary Contractor...

  10. Reliability and effectiveness of clickthrough data for automatic image annotation

    NARCIS (Netherlands)

    Tsikrika, T.; Diou, C.; De Vries, A.P.; Delopoulos, A.

    2010-01-01

    Automatic image annotation using supervised learning is performed by concept classifiers trained on labelled example images. This work proposes the use of clickthrough data collected from search logs as a source for the automatic generation of concept training data, thus avoiding the expensive
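The core idea in this record — mining click logs for concept training data — can be sketched as follows. The log entries and the click threshold are invented; the paper's actual reliability filtering is more elaborate:

```python
from collections import defaultdict

# search log entries: (query, clicked_image_id) -- invented examples
click_log = [("tiger", "img1"), ("tiger", "img2"),
             ("beach", "img3"), ("tiger", "img1"), ("beach", "img4")]

def concept_training_sets(log, min_clicks=2):
    """Images clicked at least min_clicks times for a query become
    (noisy) positive training examples for that concept."""
    counts = defaultdict(int)
    for query, image in log:
        counts[(query, image)] += 1
    training = defaultdict(set)
    for (query, image), n in counts.items():
        if n >= min_clicks:
            training[query].add(image)
    return dict(training)

training = concept_training_sets(click_log)
```

Thresholding on repeated clicks is one simple way to trade training-set size against label noise, which is the reliability question the paper studies.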

  12. The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS)

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Arviso, M.

    1995-01-01

    Packages designed to transport radioactive materials are required to survive exposure to environments defined in the Code of Federal Regulations. Cask designers can investigate package designs through structural and thermal testing of full-scale packages, components, or representative models. The acquisition of reliable response data from instrumentation measurement devices is an essential part of this testing activity. Sandia National Laboratories, under the sponsorship of the US Department of Energy (DOE), has developed the Mobile Instrumentation Data Acquisition System (MIDAS), dedicated to the collection and processing of structural and thermal data from regulatory tests

  13. Development of a multiplexer for an automatic data acquisition system for the control and monitoring of microbiological cultures

    Energy Technology Data Exchange (ETDEWEB)

    Morales Rondon, A.; Paredes Puente, J.; Arana Alonso, S.

    2016-07-01

    An automatic data acquisition system has been developed for the control and monitoring of microbiological cultures. It turns an otherwise time-consuming process into a smooth one by allowing the researcher to set the parameters at the beginning of the experiment and move on to the next task. The development of the hardware and software is key to achieving this system. The multiplexer (mux) is custom-made with 22 channels and is lightweight, and therefore easy to move around the lab. Furthermore, the software allows the researcher to check the measurements in real time. It is based on virtual instrumentation software, so new features can be added easily; thus, the mux is capable of adapting to the scientist's needs. (Author)

  14. 75 FR 59686 - Proposed Information Collection; Comment Request; NOAA Space-Based Data Collection System (DCS...

    Science.gov (United States)

    2010-09-28

    ...., Washington, DC 20230 (or via the Internet at [email protected] ). FOR FURTHER INFORMATION CONTACT: Requests for... Environmental Satellite (GOES) DCS and the Polar-Orbiting Operational Environmental Satellite (POES) DCS, also... Collection Submittal include Internet, facsimile transmission and postal mailing of paper forms. III. Data...

  15. Automatically controlled training systems

    International Nuclear Information System (INIS)

    Milashenko, A.; Afanasiev, A.

    1990-01-01

    This paper reports that a computer system for NPP personnel training was developed for training centers in the Soviet Union. The system should be considered as the first step in training, taking into account that further steps are to be devoted to part-task and full-scope simulator training. The training room consists of 8-12 IBM PC/AT personal computers combined into a network. A trainee accesses the system in a dialog manner. Software enables the instructor to determine the trainee's progress in different subjects of the program. The quality of a trainee's preparedness may be evaluated by the Knowledge Control operation. Simplified dynamic models are adopted for separate areas of the program. For example, the system of neutron flux monitoring has a dedicated model. Currently, training, requalification and support of professional qualifications of nuclear power plant operators is being emphasized. A significant number of emergency situations during work occur due to operator errors. Based on data from September-October 1989, more than half of all unplanned drops in power and stoppages of power plants were due to operator error. As a comparison, problems due to equipment malfunction accounted for no more than a third of the total. The role of personnel, especially of the operators, is significant during normal operations, since energy production costs as well as losses are influenced by the capability of the staff. These facts all point to the importance of quality training of personnel

  16. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by large helical device (LHD) experiments are supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist amongst the data; i.e., in many cases, the calculation of one quantity requires other data. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data are registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and their number can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
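    The dependency-driven calculation the abstract describes can be sketched as a recursive resolution over a dependency graph. This is a hypothetical illustration, not AutoAna's code; the quantity names and the `DEPENDENCIES` table are invented for the example.

```python
# Hypothetical sketch of dependency-driven calculation as described for
# AutoAna: computing one registered quantity first computes the quantities
# it is derived from.

DEPENDENCIES = {  # name -> names it is derived from (illustrative only)
    "pressure": ["te_profile", "ne_profile"],
    "te_profile": ["thomson_raw"],
    "ne_profile": ["interferometer_raw"],
    "thomson_raw": [],
    "interferometer_raw": [],
}

def compute(name, cache):
    """Compute `name`, recursively computing unmet dependencies first."""
    if name in cache:
        return cache[name]
    inputs = [compute(dep, cache) for dep in DEPENDENCIES[name]]
    # Stand-in for the real per-diagnostic calculation program:
    cache[name] = f"{name}({', '.join(inputs)})"
    return cache[name]

result = compute("pressure", {})
```

    In the real system each node would invoke a registered calculation program, possibly on another machine in the network, rather than a string formatter.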

  17. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O’Donovan, Peter; O’Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine’s sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy and the labelled data fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
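    The final step the abstract mentions, feeding labelled stoppages to a classifier, might look like the following toy sketch. This is not the authors' method: the nearest-centroid classifier, the two features (mean power deficit, alarms per hour) and all centroid values are invented for illustration.

```python
# Illustrative sketch: assign a stoppage category from two invented features
# using a trivial nearest-centroid classifier (squared Euclidean distance).

CENTROIDS = {  # category -> (power_deficit, alarms_per_hour); invented values
    "pitch_fault": (0.8, 12.0),
    "grid_event": (1.0, 2.0),
    "routine_maintenance": (0.5, 0.5),
}

def classify(features):
    """Return the category whose centroid is closest to `features`."""
    def dist2(category):
        return sum((a - b) ** 2 for a, b in zip(features, CENTROIDS[category]))
    return min(CENTROIDS, key=dist2)

label = classify((0.9, 10.0))  # high alarm rate, large power deficit
```

    A production classifier would of course be trained on the labelled historical stoppages rather than hand-set centroids.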

  18. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid-removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant whose tubers have been shown to contain a toxic alkaloid constituent, dioscorine; the tubers can only be consumed after the poison is removed. In this experiment, the tubers are blended into powder form before being inserted into the machine basket. The user presses the START button on the machine controller to switch the water pump on, creating a turbulent wave of water in the machine tank. The water is stopped automatically by triggering the outlet solenoid valve. The tuber powder is washed for 10 minutes while 1 liter of water contaminated with the toxin mixture flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the machine tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h, after which positive and significant results were achieved for several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate or time. These parameters were near or equal to those of the control water, and it was assumed that the toxin was fully removed when the pH of the DH powder water was near that of the control water. The pH of the control water is about 5.3, the water from this experimental process reached 6.0, and before running the machine the pH of the contaminated water was about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user.
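    The wash/drain/refill cycle the abstract describes can be summarized as a simple control loop. The sketch below simulates it with stand-in actions in place of the real pump, solenoid valves and ultrasonic level sensor; the durations follow the abstract, everything else is hypothetical.

```python
# Hedged sketch of the repeated wash cycle: agitate for the wash period,
# drain 1 L of contaminated water via the outlet valve, then refill through
# the inlet valve until the (simulated) ultrasonic level sensor trips.

WASH_MINUTES = 10  # per abstract: each wash lasts 10 minutes
TOTAL_HOURS = 7    # per abstract: the process repeats for 7 h

def run_washing(total_hours=TOTAL_HOURS, wash_minutes=WASH_MINUTES):
    """Return the action log of one full run (simulated, no hardware)."""
    log = []
    cycles = (total_hours * 60) // wash_minutes
    for _ in range(cycles):
        log.append("pump_on")          # create turbulence in the tank
        log.append("drain_1L")         # outlet valve releases toxin-laden water
        log.append("refill_to_level")  # inlet valve open until level sensor trips
    return log

log = run_washing(total_hours=1)  # one hour -> 6 cycles
```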

  19. Effect of external disturbances and data rate on the response of an automatic landing system capable of curved trajectories

    Science.gov (United States)

    Sherman, W. L.

    1975-01-01

    The effects of steady wind, turbulence, data sample rate, and control-actuator natural frequency on the response of a possible automatic landing system were investigated in a nonstatistical study. The results indicate that the system, which interfaces with the microwave landing system, functions well in winds and turbulence as long as the guidance law contains proper compensation for wind. The system response was satisfactory down to five data samples per second, which makes the system compatible with the microwave landing system. No adverse effects were observed when actuator natural frequency was lowered. For limiting cases, those cases where the roll angle goes to zero just as the airplane touches down, the basic method for computing the turn-algorithm gains proved unsatisfactory and unacceptable landings resulted. Revised computation methods gave turn-algorithm gains that resulted in acceptable landings. The gains provided by the new method also improved the touchdown conditions for acceptable landings over those obtained when the gains were determined by the old method.

  20. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources: the meter had to be consulted physically, the information imported into Infor EAM by hand, and the errors that can occur when doing all of this manually detected and corrected. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main targets I had when developing my solution were flexibility and scalability, so as to make...

  1. 78 FR 6081 - Agency Information Collection Activities; Comment Request; National Student Loan Data System (NSLDS)

    Science.gov (United States)

    2013-01-29

    ...; Comment Request; National Student Loan Data System (NSLDS) AGENCY: Federal Student Aid (FSA), Department... of Collection: National Student Loan Data System (NSLDS). OMB Control Number: 1845-0035. Type of... data through the National Student Loan Data System (NSLDS) system from postsecondary schools, Perkins...

  2. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow-accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files) and the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and human resources management were also important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
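    The "flow accumulation" hydrological criterion the abstract mentions can be illustrated with a minimal D8 algorithm: each cell drains to its lowest neighbour, and a cell's accumulation counts how many cells drain through it. This is a toy stand-in for the production tools (TerraScan, ArcGIS, etc.), not their implementation.

```python
# Minimal D8 flow-accumulation sketch on a tiny elevation grid.

def flow_accumulation(dem):
    """dem: 2D list of elevations. Returns a grid where each cell holds the
    number of cells (including itself) that drain through it."""
    rows, cols = len(dem), len(dem[0])

    def downstream(r, c):
        """Lowest of the 8 neighbours, if strictly lower than the cell."""
        best, best_z = None, dem[r][c]
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                    if dem[rr][cc] < best_z:
                        best, best_z = (rr, cc), dem[rr][cc]
        return best

    acc = [[1] * cols for _ in range(rows)]
    # Visit cells from highest to lowest, so every upstream contribution is
    # final before it is passed downhill.
    order = sorted(((r, c) for r in range(rows) for c in range(cols)),
                   key=lambda rc: -dem[rc[0]][rc[1]])
    for r, c in order:
        d = downstream(r, c)
        if d:
            acc[d[0]][d[1]] += acc[r][c]
    return acc

acc = flow_accumulation([[3, 2, 1]])  # water flows left to right
```

    Thresholding the accumulation grid then yields candidate river cells, which in the production system are combined with the topographic network.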

  3. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow-accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment (more than 160,000 LiDAR data files) and the infrastructure to store (up to 40 TB between results and intermediate files) and process the data, using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and human resources management were also important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  4. Real time Aanderaa current meter data collection system

    Digital Repository Service at National Institute of Oceanography (India)

    AshokKumar, K.; Diwan, S.G.

    in laboratory. In this paper a method is described to read the real time current meter data and display/print/store on cartridge. For this, binary coded electrical signal available at the top end plate of the current meter is connected by underwater cable...

  5. A Customizable and Expandable Electroencephalography (EEG) Data Collection System

    Science.gov (United States)

    2016-03-01

    monitoring the brain’s electrical activity, known as electroencephalography (EEG). Studies have shown encouraging results in the areas of medicine and...devices, including Emotiv Systems and Advanced Brain Monitoring, as well as open source alternatives such as OpenBCI. These products generally...channels were tested and the results were the same. [Fig. 5: measured output with gain = 1, 2, and 12]

  6. Automatic multi-modal intelligent seizure acquisition (MISA) system for detection of motor seizures from electromyographic data and motion data

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sándor; Wolf, Peter

    2012-01-01

    The objective is to develop a non-invasive automatic method for detection of epileptic seizures with motor manifestations. Ten healthy subjects who simulated seizures and one patient participated in the study. Surface electromyography (sEMG) and motion sensor features were extracted as energy...

  7. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    Science.gov (United States)

    2017-01-01

    Atmospheric Administration (NOAA) tides and currents applications program interface (API): http://tidesandcurrents.noaa.gov/api/. AIS data...files, organized by location. The data were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis...McKinney, W. 2012. Python for data analysis. Sebastopol, CA: O’Reilly Media, Inc. Mitchell, K. N. April 2012. A review of coastal navigation asset

  8. Designing automatic resupply systems.

    Science.gov (United States)

    Harding, M L

    1999-02-01

    This article outlines the process for designing and implementing autoresupply systems. The planning process includes determination of goals and appropriate participation. Different types of autoresupply mechanisms include kanban, breadman, consignment, systems contracts, and direct shipping from an MRP schedule.

  9. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Directory of Open Access Journals (Sweden)

    T. A. Boden

    2013-06-01

    Full Text Available The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and a FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent

  10. The AmeriFlux data activity and data system: an evolving collection of data management techniques, tools, products and services

    Science.gov (United States)

    Boden, T. A.; Krassovski, M.; Yang, B.

    2013-06-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and a FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent network database
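    The "established quality-control algorithms" mentioned above typically include simple range and missing-value checks before ingest. The sketch below illustrates that idea only; the variable names, limits and the -9999 missing-value convention are assumptions for the example, not CDIAC's actual rules.

```python
# Illustrative pre-ingest QC sketch: flag each observation as ok, missing,
# or out of a plausible physical range.

QC_LIMITS = {"TA": (-50.0, 60.0),    # air temperature, deg C (assumed limits)
             "FC": (-100.0, 100.0)}  # CO2 flux, umol m-2 s-1 (assumed limits)

MISSING = -9999.0  # assumed missing-value sentinel

def qc_flag(variable, values):
    """Return one flag per value: 'ok', 'missing', or 'out_of_range'."""
    lo, hi = QC_LIMITS[variable]
    flags = []
    for v in values:
        if v == MISSING:
            flags.append("missing")
        elif lo <= v <= hi:
            flags.append("ok")
        else:
            flags.append("out_of_range")
    return flags

flags = qc_flag("TA", [21.4, -9999.0, 72.0])
```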

  11. Integrated Display and Simulation for Automatic Dependent Surveillance-Broadcast and Traffic Collision Avoidance System Data Fusion.

    Science.gov (United States)

    Wang, Yanran; Xiao, Gang; Dai, Zhouyun

    2017-11-13

    Automatic Dependent Surveillance-Broadcast (ADS-B) is the direction of airspace surveillance development, yet research analyzing the benefits of Traffic Collision Avoidance System (TCAS) and ADS-B data fusion is almost absent. This paper proposes an ADS-B minimum system composed of ADS-B In and ADS-B Out. For ADS-B In, a fusion model with a variable-sampling Variational Bayesian Interacting Multiple Model (VSVB-IMM) algorithm is proposed for integrated display, and an airspace traffic situation display is developed using ADS-B information. ADS-B Out includes ADS-B Out transmission based on a simulator platform and an Unmanned Aerial Vehicle (UAV) platform. This paper describes the overall implementation of the ADS-B minimum system, including theoretical model design, experimental simulation verification, engineering implementation, and results analysis. Simulation and implementation results show that the fused system has better performance than each independent subsystem and that it can work well in engineering applications.

  12. Automatic classification of municipal call data for quantitative urban drainage system analysis

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.; Harder, R.C.; Loog, M.

    2010-01-01

    Flooding in urban areas can be caused by heavy rainfall, improper planning or component failures. Quantification of these various causes to urban flood probability supports prioritisation of flood risk reduction measures. In many cases, a lack of data on flooding incidents impedes quantification of

  13. Automatic TLI recognition system, general description

    Energy Technology Data Exchange (ETDEWEB)

    Lassahn, G.D.

    1997-02-01

    This report is a general description of an automatic target recognition system developed at the Idaho National Engineering Laboratory for the Department of Energy. A user's manual is a separate volume, Automatic TLI Recognition System, User's Guide, and a programmer's manual is Automatic TLI Recognition System, Programmer's Guide. This system was designed as an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system naturally incorporates image data fusion, and it gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. In addition to its primary function as a trainable target recognition system, this is also a versatile, general-purpose tool for image manipulation and analysis, which can be either keyboard-driven or script-driven. This report includes descriptions of three variants of the computer hardware, a description of the mathematical basis of the training process, and a description with examples of the system capabilities.

  14. Wireless data collection system for travel time estimation and traffic performance evaluation.

    Science.gov (United States)

    2012-05-01

    This report presents the results of the third and final research and development project of an implementable wireless : travel time data collection system. Utilizing Bluetooth wireless technology as a platform, the prior projects focused on : data co...

  15. Special Data Collection System (SDCS) event report, Unimak Island region, 16 May 1975

    Energy Technology Data Exchange (ETDEWEB)

    Hill, K.J.; Dawkins, M.S.; Baumstark, R.R.; Gillespie, M.D.

    1976-01-22

    Seismic data are reported from the Special Data Collection System (SDCS) and other sources for the Unimak Island Region event on 16 May 1975. Published epicenter information from seismic observations is given.

  16. Mobile robot teleoperation system for plant inspection based on collecting and utilizing environment data

    International Nuclear Information System (INIS)

    Kawabata, Kuniaki; Watanabe, Nobuyasu; Asama, Hajime; Kita, Nobuyuki; Yang, Hai-quan

    2004-01-01

    This paper describes the development of a mobile robot teleoperation system for plant inspection. In our system, the robot is an agent for collecting environment data and is teleoperated by the operator using the accumulated environment data, which is displayed on the operation interface. The robot is equipped with many sensors for detecting the state of the robot and of the environment. This redundant sensory system can also be utilized to collect working-environment data on-site while the robot is patrolling. The proposed system introduces a framework for collecting and utilizing environment data for adaptive plant inspection using the teleoperated robot. A view simulator primarily aims to facilitate evaluation of the visual sensors and algorithms, and has been extended as the Environment Server, the core technology of the digital maintenance field for plant inspection. In order to construct a detailed, seamless digital maintenance field, mobile robotic technology is utilized to supply environment data to the server. The sensory system on the robot collects environment data on-site, and the collected data is uploaded to the Environment Server to compile an accurate digital environment database. The robot operator can also utilize the accumulated environment data by referring to the Environment Server. In this paper, we explain the concept of our teleoperation system based on collecting and utilizing environment data. Using the developed system, inspection patrol experiments were attempted in a plant mock-up. Experimental results are shown using an omnidirectional mobile robot with a sensory system and the Environment Server. (author)

  17. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00473067; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas; Javurek, Tomas

    2017-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence...
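    The rebalancing idea described above, detecting storage elements near their capacity limit and moving the overshoot elsewhere, can be sketched in a few lines. This is not Rucio's actual API or policy logic; the threshold, site names and selection rule are invented for illustration.

```python
# Hedged sketch: flag storage elements above a fill-ratio threshold and
# suggest moving the overshoot to the least-full element.

THRESHOLD = 0.9  # assumed fill-ratio limit

def plan_rebalancing(rses):
    """rses: {name: (used_tb, capacity_tb)}. Returns a list of
    (source, destination, tb_to_move) suggestions."""
    moves = []
    for name, (used, cap) in sorted(rses.items()):
        if used / cap > THRESHOLD:
            # destination = the element with the lowest fill ratio
            dest = min((n for n in rses if n != name),
                       key=lambda n: rses[n][0] / rses[n][1])
            moves.append((name, dest, round(used - THRESHOLD * cap, 1)))
    return moves

rses = {"SITE_A": (95.0, 100.0), "SITE_B": (40.0, 100.0), "SITE_C": (70.0, 100.0)}
moves = plan_rebalancing(rses)
```

    The real system additionally has to respect data distribution policies and keep the data available during the transfer, which this sketch ignores.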

  18. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...

  19. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of human face is presented. The system developed performs remote measurements of the necessary face features. Two different scenarios of the system behavior are implemented. The first one assumes the verification of personal data entered by visitor from console using keyboard or card reader. The system functions as an automatic checkpoint, that strictly controls access of different visitors. The other scenario makes it possible to identify visitors without any person identifier or pass. Only person biometrics are used to identify the visitor. The recognition system automatically finds necessary identification information preliminary stored in the database. Two laboratory models of recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images inputted to computer from cameras the models can use voice data and some person physical characteristics such as person's height, measured by imaging system.

  20. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    Energy Technology Data Exchange (ETDEWEB)

    Vega, J., E-mail: jesus.vega@ciemat.e [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Murari, A. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Ratta, G.A.; Gonzalez, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Asociacion EURATOM/CIEMAT para Fusion. Avda. Complutense, 22, 28040 Madrid (Spain); Dormido-Canto, S. [JET-EFDA, Culham Science Center, OX14 3DB, Abingdon (United Kingdom); Dpto. Informatica y Automatica, UNED, Madrid (Spain)

    2010-07-15

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.
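    The core task the abstract describes, automatically locating physical events in waveforms to populate a database, can be illustrated with a deliberately simple threshold detector. The classification and regression techniques actually used are far more sophisticated; this toy is only meant to show the shape of the problem.

```python
# Toy sketch: find the index ranges where a waveform exceeds a threshold,
# i.e. candidate "events" to register in a specialized database.

def find_events(signal, threshold):
    """Return (start, end) index pairs where signal values exceed threshold."""
    events, start = [], None
    for i, v in enumerate(signal):
        if v > threshold and start is None:
            start = i                       # event begins
        elif v <= threshold and start is not None:
            events.append((start, i - 1))   # event ended at previous sample
            start = None
    if start is not None:
        events.append((start, len(signal) - 1))
    return events

signal = [0.1, 0.2, 2.5, 3.0, 0.3, 0.1, 4.0, 0.2]
events = find_events(signal, threshold=1.0)
```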

  1. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemas accounting for geometric and semantic data has been strongly motivated by the advances of the last decade in mobile laser scanning technology; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (CityGML) and mobile mapping technology to provide the features that compose the city model. The paper focuses on traffic signs, discussing their characterization in CityGML in order to ease the implementation of LiDAR technology in road management software, and analysing some limitations of the current technology in automatic detection and classification.

  2. The LANSCE (Los Alamos Neutron Scattering Center) target data collection system

    International Nuclear Information System (INIS)

    Kernodle, A.K.

    1989-01-01

    The Los Alamos Neutron Scattering Center (LANSCE) Target Data Collection System is the result of an effort to provide a base of information from which to draw conclusions on the performance and operational condition of the overall LANSCE target system. During the conceptualization of the system, several goals were defined, and a survey was made of both custom-made and off-the-shelf hardware and software capable of meeting these goals. The first stage of the system was successfully implemented for LANSCE run cycle 52. From the operational experience gained thus far, it appears that the LANSCE Target Data Collection System will meet all of the previously defined requirements.

  3. Learning Diagnostic Diagrams in Transport-Based Data-Collection Systems

    DEFF Research Database (Denmark)

    Tran, Vu The; Eklund, Peter; Cook, Chris

    2014-01-01

    Insights about service improvement in a transit network can be gained by studying transit service reliability. In this paper, a general procedure for constructing a transit service reliability diagnostic (Tsrd) diagram based on a Bayesian network is proposed to automatically build a behavioural model from Automatic Vehicle Location (AVL) and Automatic Passenger Counters (APC) data. Our purpose is to discover the variability of transit service attributes and their effects on traveller behaviour. A Tsrd diagram describes and helps to analyse factors affecting public transport by combining domain knowledge with statistical data.
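    The diagnostic use of a Bayesian network can be illustrated with a deliberately tiny example (the nodes and probabilities below are invented, not taken from the paper): a single parent node, peak hour, influences the chance of a late arrival, and Bayes' rule runs the network "backwards" to diagnose the likely cause of an observed delay.

```python
# Prior over the parent node (is the trip in peak hour?) and the
# conditional probability table for the child node (late arrival).
P_peak = {True: 0.3, False: 0.7}
P_late_given_peak = {True: 0.6, False: 0.1}

def p_late():
    """Marginal probability of a late arrival: sum over the parent node."""
    return sum(P_peak[h] * P_late_given_peak[h] for h in (True, False))

def p_peak_given_late():
    """Bayes' rule: how strongly does peak hour explain an observed delay?"""
    return P_peak[True] * P_late_given_peak[True] / p_late()
```

    In a real Tsrd diagram the CPTs would be learned from AVL/APC records rather than fixed by hand, but the inference step is the same.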

  4. 40 CFR 141.533 - What data must my system collect to calculate a disinfection profile?

    Science.gov (United States)

    2010-07-01

    ... calculate a disinfection profile? 141.533 Section 141.533 Protection of Environment ENVIRONMENTAL PROTECTION... Filtration and Disinfection-Systems Serving Fewer Than 10,000 People Disinfection Profile § 141.533 What data must my system collect to calculate a disinfection profile? Your system must monitor the following...

  5. Calibration of automatic performance measures - speed and volume data : volume 1, evaluation of the accuracy of traffic volume counts collected by microwave sensors.

    Science.gov (United States)

    2015-09-01

    Over the past few years, the Utah Department of Transportation (UDOT) has developed a system called the Signal Performance Metrics System (SPMS) to evaluate the performance of signalized intersections. This system currently provides data summarie...

  6. Portable data collection terminal in the automated power consumption measurement system

    Science.gov (United States)

    Vologdin, S. V.; Shushkov, I. D.; Bysygin, E. K.

    2018-01-01

    Increasing the efficiency and automating the process of electric energy data collection and processing is very important at present. The high cost of classic electric energy billing systems prevents their mass application. The Udmurtenergo Branch of IDGC of Center and Volga Region has developed an electronic automated system called "Mobile Energy Billing" based on data collection terminals. The system joins electronic components in a service-oriented architecture based on WCF services. At present, all parts of the Udmurtenergo Branch electric network are connected to the "Mobile Energy Billing" project, and the system's capabilities are being expanded thanks to its flexible architecture.

  7. Ground truth data collection on mining industrial explosions registered by the International Monitoring System

    International Nuclear Information System (INIS)

    Ehl'tekov, A.Yu.; Gordon, V.P.; Firsov, V.A.; Chervyakov, V.B.

    2004-01-01

    The presentation is dedicated to organizational and technical issues connected with the timely notification of the Comprehensive Test-Ban-Treaty Organization about large chemical explosions, including data on explosion location and time, on the quantity and type of explosive used, and on the configuration and assumed purpose of the explosion. Explosions registered by the International Monitoring System are of special interest, since their data can be used for calibration of the monitoring system. Ground truth data collection and some explosion location results for Russian mining enterprises are given, and the peculiarities of collecting ground truth data on mining industrial explosions are considered. (author)

  8. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects.
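    The closed-loop nulling described above can be sketched in a few lines (a toy model, not the LASL implementation): a centroid tracker measures the spot position in a frame, and a proportional controller steps the mirror until the off-axis error vanishes. The one-dimensional optics model passed to `align` in the usage example is an assumption made purely for illustration.

```python
def centroid(frame):
    """Intensity-weighted centroid (x, y) of a 2-D list of pixel values."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return (sx / total, sy / total)

def align(spot_at, mirror=0.0, target=0.0, gain=0.5, steps=30):
    """Drive the measured spot position onto the optical axis (target)."""
    for _ in range(steps):
        error = target - spot_at(mirror)   # off-axis error from the tracker
        mirror += gain * error             # computer-driven mirror step
    return mirror
```

    With a proportional gain below 1, the residual error shrinks geometrically on each iteration, which is the essence of the nulling loop.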

  9. Cost and implementation analysis of a personal digital assistant system for laboratory data collection.

    Science.gov (United States)

    Blaya, J A; Gomez, W; Rodriguez, P; Fraser, H

    2008-08-01

    One hundred and twenty-six public health centers and laboratories in Lima, Peru, without internet access. We have previously shown that a personal digital assistant (PDA) based system reduces data collection delays and errors for tuberculosis (TB) laboratory results when compared to a paper system. To assess the data collection efficiency of each system and the resources required to develop, implement and transfer the PDA-based system to a resource-poor setting, we conducted a time-motion study of data collectors using the PDA-based and paper systems, together with a cost analysis of developing, implementing and transferring the PDA-based system to a local organization and of their redeployment of the system. Work hours spent collecting and processing results decreased by 60% (P collecting patient weights was $4107. A PDA-based system drastically reduced the effort required to collect TB laboratory results from remote locations. With the framework described, open-source software and local development, organizations in resource-poor settings could reap the benefits of this technology.

  10. Master data extraction and adaptation based on collected production data in manufacturing execution systems

    OpenAIRE

    Dimitrov, T.; Baumann, M.; Schenk, M.

    2010-01-01

    This paper presents an approach to the extraction and correction of manufacturing master data, needed by Manufacturing Execution Systems (MES) to control and schedule production. The implementation of the created schedule and the improvement of Key Performance Indicators depend strongly on the quality of the master data. In most enterprises the master data ages, or the enterprise cannot fully provide it, because a high manual expense for statistical analysis and administration is needed. T...

  11. Advances in automatic data analysis capabilities

    International Nuclear Information System (INIS)

    Benson, J.; Bipes, T.; Udpa, L.

    2009-01-01

    Utilities perform eddy current tests on nuclear power plant steam generator (SG) tubes to detect degradation. This paper summarizes the Electric Power Research Institute (EPRI) research to develop signal-processing algorithms that automate the analysis of eddy current test data. The research focuses on analyzing rotating probe and array probe data for detecting, classifying, and characterizing degradation in SG tubes. Automated eddy current data analysis systems for bobbin coil probe data have been available for more than a decade; however, automated data analysis systems for rotating and array probes have developed slowly because of the complexities of the inspection parameters associated with the data. Manual analysis of rotating probe data has been shown to be inconsistent and time-consuming when flaw depth profiles are generated. Algorithms have been developed for the detection of most common steam generator degradation mechanisms. Included in the latest version of the developed software is the ability to perform automated defect profiling, which is useful in tube integrity determinations. Profiling performed manually can be time-consuming, whereas automated profiling is performed in a fraction of the time and is much more repeatable. Recent advances in eddy current probe development have resulted in an array probe design capable of high-speed data acquisition over the full length of SG tubes. Probe qualification programs have demonstrated that array probes are capable of providing degradation detection capabilities similar to the rotating probe technology. However, to date, utilities have not used the array probe in the field on a large-scale basis due to the large amount of data analyst resources and time required to process the vast quantity of data generated by the probe. To address this obstacle, EPRI initiated a program to develop automatic data analysis algorithms for rotating and array probes. During the development process for both rotating and array

  12. Automatic TLI recognition system. Part 1: System description

    Energy Technology Data Exchange (ETDEWEB)

    Partin, J.K.; Lassahn, G.D.; Davidson, J.R.

    1994-05-01

    This report describes an automatic target recognition system for fast screening of large amounts of multi-sensor image data, based on low-cost parallel processors. This system uses image data fusion and gives uncertainty estimates. It is relatively low cost, compact, and transportable. The software is easily enhanced to expand the system's capabilities, and the hardware is easily expandable to increase the system's speed. This volume gives a general description of the ATR system.

  13. CHLOE: a system for the automatic handling of spark pictures

    International Nuclear Information System (INIS)

    Butler, J.W.; Hodges, D.; Royston, R.

    The system for automatic data handling uses commercially available or state-of-the-art components. The system is flexible enough to accept information from various types of experiments involving photographic data acquisition.

  14. Development of a research-oriented system for collecting mechanical ventilator waveform data.

    Science.gov (United States)

    Rehm, Gregory B; Kuhn, Brooks T; Delplanque, Jean-Pierre; Guo, Edward C; Lieng, Monica K; Nguyen, Jimmy; Anderson, Nicholas R; Adams, Jason Y

    2017-10-28

    Lack of access to high-frequency, high-volume patient-derived data, such as mechanical ventilator waveform data, has limited the secondary use of these data for research, quality improvement, and decision support. Existing methods for collecting these data are obtrusive, require high levels of technical expertise, and are often cost-prohibitive, limiting their use and scalability for research applications. We describe here the development of an unobtrusive, open-source, scalable, and user-friendly architecture for collecting, transmitting, and storing mechanical ventilator waveform data that is generalizable to other patient care devices. The system implements a software framework that automates and enforces end-to-end data collection and transmission. A web-based data management application facilitates nontechnical end users' abilities to manage data acquisition devices, mitigates data loss and misattribution, and automates data storage. Using this integrated system, we have been able to collect ventilator waveform data from >450 patients as part of an ongoing clinical study.
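    The end-to-end enforcement the authors describe can be illustrated with a minimal sketch (invented names and batch sizes, not the published architecture): samples are buffered per device, and a batch is dropped from the buffer only after the store acknowledges it, so transmission failures cause retries rather than silent loss or misattribution.

```python
class Collector:
    """Buffer device samples and flush them to a store that must acknowledge."""

    def __init__(self, store, batch_size=4):
        self.store = store          # callable: (device_id, batch) -> bool (ack)
        self.batch_size = batch_size
        self.pending = {}           # device_id -> buffered samples

    def push(self, device_id, sample):
        buf = self.pending.setdefault(device_id, [])
        buf.append(sample)
        if len(buf) >= self.batch_size:
            self.flush(device_id)

    def flush(self, device_id):
        buf = self.pending.get(device_id, [])
        if buf and self.store(device_id, list(buf)):
            buf.clear()             # drop only after the store acknowledges
```

    Keying every buffer by `device_id` is what prevents misattribution: a sample can never be flushed under any identity other than the device it arrived from.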

  15. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  16. Automatization of hydrodynamic modelling in a Floreon+ system

    Science.gov (United States)

    Ronovsky, Ales; Kuchar, Stepan; Podhoranyi, Michal; Vojtek, David

    2017-07-01

    The paper describes fully automated hydrodynamic modelling as part of the Floreon+ system. The main purpose of hydrodynamic modelling in disaster management is to provide an accurate overview of the hydrological situation in a given river catchment. Automation of the process as a web service can provide immediate data based on extreme weather conditions, such as heavy rainfall, without the intervention of an expert. Such a service can be used by non-scientific users such as fire-fighter operators or representatives of a military service organizing evacuation during floods or river dam breaks. The paper describes the whole process, beginning with the definition of the schematization necessary for the hydrodynamic model, the gathering of necessary data and its processing for a simulation, the model itself, and the post-processing and visualization of results on a web service. The process is demonstrated on real data collected during the 2010 floods in the Moravian-Silesian region.

  17. Development of user-friendly and interactive data collection system for cerebral palsy.

    Science.gov (United States)

    Raharjo, I; Burns, T G; Venugopalan, J; Wang, M D

    2016-02-01

    Cerebral palsy (CP) is a permanent motor disorder that appears at an early age and requires multiple tests to assess the physical and mental capabilities of patients. Current medical record data collection systems employed for CP, e.g., EPIC, are very general, difficult to navigate, and prone to errors, and their data cannot easily be extracted, which limits analysis of this rich source of information. To overcome these limitations, we designed and prototyped a database with a graphical user interface geared towards clinical research specifically in CP. The platform, built on MySQL and a Java framework, is reliable, secure, and can be easily integrated with other programming languages for data analysis such as MATLAB. This database with GUI design is a promising tool for data collection and can be applied in many different fields aside from CP to infer useful information from the vast amount of data being collected.
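    A research-oriented schema along the lines described, patients linked to repeated assessments so results can be pulled with a single query, might look like the following sketch. The table and column names are invented for illustration, and SQLite stands in for the MySQL backend the paper uses.

```python
import sqlite3

def build_schema(conn):
    """Create a minimal patient/assessment schema for extractable results."""
    conn.executescript("""
        CREATE TABLE patient (
            id         INTEGER PRIMARY KEY,
            birth_year INTEGER NOT NULL
        );
        CREATE TABLE assessment (
            id         INTEGER PRIMARY KEY,
            patient_id INTEGER NOT NULL REFERENCES patient(id),
            test_name  TEXT NOT NULL,
            score      REAL NOT NULL
        );
    """)

conn = sqlite3.connect(":memory:")
build_schema(conn)
conn.execute("INSERT INTO patient VALUES (1, 2012)")
conn.execute("INSERT INTO assessment VALUES (1, 1, 'GMFM-66', 54.2)")
rows = conn.execute(
    "SELECT p.birth_year, a.test_name, a.score "
    "FROM assessment a JOIN patient p ON p.id = a.patient_id").fetchall()
```

    Because assessments live in their own relational table rather than free-text notes, exporting a cohort to MATLAB or any analysis tool is one query, which is exactly the extractability the general-purpose record systems lack.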

  18. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    Purpose of the study. Scientific and educational organizations traditionally use e-mail with Microsoft Excel spreadsheets and Microsoft Word documents for operational data collection. The disadvantages of this approach include the lack of control over the correctness of the data input and the complexity of processing the information received due to the non-relational data model. There are online services that make it possible to organize data collection in a relational form, but their disadvantages include the absence of thesaurus support, a limited set of data input controls, limited control over the operation of the input form, and the fact that most of the systems are shareware. Thus, an Internet data collection and analysis technology is required that allows the model of the collected data to be identified quickly and data collection to be implemented automatically in accordance with this model. Materials and methods. The article describes the technology developed and tested for operational data collection and analysis using the "Faramant" system. The operation of "Faramant" is based on a document model, which includes three components: a description of the data structure; visualization; and the logic of the form. All stages of the technology are performed by the user in a browser. The main stage of the proposed technology is the definition of the data model as a set of relational tables. To create a table within the system, the user determines its name and a list of fields; for each field, the name, the control used for data input, and the logic of its operation must be specified. Controls are used to organize correct data input depending on the data type. Based on the model, the "Faramant" system automatically creates a filling form, through which users can enter information. To change the form visualization, a form template can be used. The data can be viewed page by page in a table, and different filters can be applied to table rows. To...
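    The core idea, a user-defined model whose controls validate input before it reaches the relational store, can be sketched as follows (the field names, rules, and data structures are invented for illustration, not the "Faramant" implementation):

```python
# A "form" is a list of fields: (name, type, validity control).
FORM = [
    ("name", str, lambda v: len(v) > 0),
    ("year", int, lambda v: 1900 <= v <= 2100),
]

def accept(row):
    """Validate one submitted row against the form model.

    Returns the typed record, or raises ValueError naming the bad field.
    """
    record = {}
    for (field, typ, is_valid), value in zip(FORM, row):
        value = typ(value)          # coerce the raw input to the field type
        if not is_valid(value):
            raise ValueError(field)
        record[field] = value
    return record
```

    Rejecting bad values at the control level is what distinguishes this approach from mailing around spreadsheets, where nothing checks the input at entry time.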

  19. Intelligent Storage System Based on Automatic Identification

    Directory of Open Access Journals (Sweden)

    Kolarovszki Peter

    2014-09-01

    This article describes RFID technology in conjunction with warehouse management systems (WMS). It also deals with automatic identification and data capture technologies and the processes used in a warehouse management system, from the entry of goods into production through identification of goods, palletizing, storing, bin transfer and removal of goods from the warehouse, and it focuses on utilizing AMP middleware in WMS processes. Nowadays, the identification of goods in most warehouses is carried out through barcodes; in this article we specify how the processes described above can be handled through RFID technology. All results are verified by measurement in our AIDC laboratory at the University of Žilina and in the Laboratory of Automatic Identification of Goods and Services located in GS1 Slovakia. The results of our research bring a new point of view and indicate ways of using RFID technology in warehouse management systems.

  20. Automatic rebalancing of data in ATLAS distributed data management

    Science.gov (United States)

    Barisits, M.; Serfon, C.; Garonne, V.; Lassnig, M.; Beermann, T.; Javurek, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence also the workload management and production systems. This contribution describes the concept and architecture behind those components and shows the benefits made by the system.
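    A much-simplified sketch of the rebalancing decision described above (conceptual only, not Rucio code, and ignoring replication policies and transfer scheduling): storage elements above a fill threshold shed their excess to the element with the most free headroom.

```python
def rebalance(capacity, used, threshold=0.9):
    """Return (source, destination, amount) moves for overfull elements."""
    moves = []
    # visit the fullest elements first
    for src in sorted(used, key=lambda s: used[s] / capacity[s], reverse=True):
        excess = used[src] - threshold * capacity[src]
        if excess <= 0:
            continue
        # destination: the element with the lowest fill ratio
        dst = min(used, key=lambda s: used[s] / capacity[s])
        if dst == src:
            continue                # nowhere better to put the data
        amount = min(excess, threshold * capacity[dst] - used[dst])
        if amount > 0:
            used[src] -= amount
            used[dst] += amount
            moves.append((src, dst, amount))
    return moves
```

    In the real system each move must also respect data distribution policies and availability guarantees; the sketch only captures the capacity-driven trigger.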

  1. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Kim, Jae Hee; Eom, Heung Seop; Lee, Jae Cheol; Choi, Yoo Raek; Moon, Soon Seung

    2002-02-01

    Using recent technologies in mobile robotics and computer science, we developed an automatic inspection system for the weld lines of the reactor vessel. Ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it needs a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. According to our conceptual design, we developed a reactor inspection system including an underwater inspection robot, a laser position control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on a real reactor ready for Ulchine nuclear power plant unit 6 at Dusan Heavy Industry in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction in inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied to the automation of similar systems in nuclear power plants.

  2. Integrated straight - through automatic non-destructive examination and data acquisition system for thin-wall tubes

    International Nuclear Information System (INIS)

    Stoessel, A.; Boulanger, G.; Furlan, J.; Mogavero, R.

    1981-09-01

    This non-destructive testing unit inspects the cladding tubes for the SUPER-PHENIX fast neutron reactor. The quality level demanded for these tubes, as well as their number, required designing an installation that combines high performance with a high testing rate and complete automation. The testing is performed under immersion by means of six transducers, focused in line, working at 30 MHz. The tubes are numbered on an automatic rig; marking is by dark rings obtained by superficial electrolysis of the tube, regularly distributed along the tube axis, and the quality of the tube is not affected by this. The advantage of this numbering system is that it enables the tubes to be fed to the test set in any order. An acquisition unit, consisting of a microprocessor, a semi-graphical printer and a double floppy disk unit, makes it possible to enter, edit and store the information for each tube.

  3. Design of information systems for population data collection based on client-server at Bagolo village

    Science.gov (United States)

    Nugraha, Ucu

    2017-06-01

    The village is the level below the sub-district in the governmental system, and at this level the population data service is still mostly provided manually. Such systems frequently lead to invalid data, and to available data that does not correspond to the facts, as a result of frequent errors in the process of collecting population data, including data on the elderly, and in the process of data transfer. Likewise, correspondence documents such as death certificates, birth certificates, certificates of domicile change, and so forth have their own problems, and data archives are frequently non-systematic because they are not organized properly or not stored in a database. Nevertheless, an information service system for the population census at this level can assist government agencies, especially in the management of the population census at the village level. The designed system makes the process of a population census easier: it is initiated by the submission of a population letter by each citizen who comes to the village administrative office. The client-server population census information system at Bagolo Village was designed with an effective, uncomplicated workflow and interface. With the client-server basis, the data is stored centrally on the server, which reduces data duplication and data loss; therefore, when local governments require information on the population data of a village, they can obtain it easily without the need to collect the data directly in the respective village.

  4. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.

    2009-09-01

    This report, written by Sandia National Laboratories, addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. It is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
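    As a small illustration of why structured maintenance fields matter: if a CMMS records one downtime duration per failure over a reporting period, basic reliability metrics fall out in a few lines. The function below is an invented example, not the report's method or schema.

```python
def reliability(downtimes, period_hours):
    """Compute (MTBF, availability) from CMMS-style failure records.

    downtimes: list of repair durations in hours, one entry per failure
    period_hours: total calendar hours in the reporting period
    """
    failures = len(downtimes)
    downtime = sum(downtimes)
    uptime = period_hours - downtime
    mtbf = uptime / failures if failures else float("inf")
    availability = uptime / period_hours
    return mtbf, availability
```

    Without consistently recorded failure timestamps and repair durations, neither metric can be computed automatically, which is the gap the report's recommendations target.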

  5. MOLGENIS/connect : a system for semi-automatic integration of heterogeneous phenotype data with applications in biobanks

    NARCIS (Netherlands)

    Pang, Chao; van Enckevort, David; de Haan, Mark; Kelpin, Fleur; Jetten, Jonathan; Hendriksen, Dennis; de Boer, Tommy; Charbon, Bart; Winder, Erwin; Velde, van der K. Joeri; Doiron, Dany; Fortier, Isabel; Hillege, Hans; Swertz, Morris A.

    2016-01-01

    Motivation: While the size and number of biobanks, patient registries and other data collections are increasing, biomedical researchers still often need to pool data for statistical power, a task that requires time-intensive retrospective integration. Results: To address this challenge, we developed

  6. A mobile field-work data collection system for the wireless era of health surveillance.

    Science.gov (United States)

    Forsell, Marianne; Sjögren, Petteri; Renard, Matthew; Johansson, Olle

    2011-03-01

    In many countries or regions the capacity of health care resources is below the needs of the population, and new approaches to health surveillance are needed. Innovative projects utilizing wireless communication technology contribute reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated into system functions. The system architecture was based on field-work experiences and administrative requirements. The smartphone devices were HTC Touch Diamond2s, and the system was based on a platform with Microsoft .NET components and a SQL Server 2005 database running on the Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field-workers considered the applications described here user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from field-work data registration to analysis and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  7. Evaluating New York City's abortion reporting system: insights for public health data collection systems.

    Science.gov (United States)

    Toprani, Amita; Madsen, Ann; Das, Tara; Gambatese, Melissa; Greene, Carolyn; Begier, Elizabeth

    2014-01-01

    New York City (NYC) mandates reporting of all abortion procedures. These reports enable tracking of abortion incidence and underpin programs, policy, and research. Since January 2011, the majority of abortion facilities must report electronically. We conducted an evaluation of NYC's abortion reporting system and its transition to electronic reporting. We summarize the evaluation methodology and results and draw lessons relevant to other vital statistics and public health reporting systems. The evaluation followed Centers for Disease Control and Prevention guidelines for evaluating public health surveillance systems. We interviewed key stakeholders and conducted a data provider survey. In addition, we compared the system's abortion counts with external estimates and calculated the proportion of missing and invalid values for each variable on the report form. Finally, we assessed the process for changing the report form and estimated system costs. NYC Health Department's Bureau of Vital Statistics. Usefulness, simplicity, flexibility, data quality, acceptability, sensitivity, timeliness, and stability of the abortion reporting system. Ninety-five percent of abortion data providers considered abortion reporting important; 52% requested training regarding the report form. Thirty percent reported problems with electronic biometric fingerprint certification, and 18% reported problems with the electronic system's stability. Estimated system sensitivity was 88%. Of 17 variables, education and ancestry had more than 5% missing values in 2010. Changing the electronic reporting module was costly and time-consuming. System operating costs were estimated at $80 136 to $89 057 annually. The NYC abortion reporting system is sensitive and provides high-quality data, but opportunities for improvement include facilitating biometric certification, increasing electronic platform stability, and conducting ongoing outreach and training for data providers. This evaluation will help data

  8. Development of Inspection Data Collection and Evaluation System (IDES) for J-MOX (1)

    International Nuclear Information System (INIS)

    Kumakura, Shinichi; Takizawa, Koji; Masuda, Shoichiro; Iso, Shoko; Kikuchi, Masahiro; Hisamatsu, Yoshinori; Kurobe, Hiroko; Kawasue, Akane

    2012-01-01

    The 'Inspection Data Collection and Evaluation System' (IDES) is a system that stores inspection data and operator declaration data collected from the various measurement instruments installed in the fuel fabrication processes of the large-scale MOX fuel fabrication plant, and performs safeguards evaluation using these data. The Nuclear Material Control Center is developing this system under a project commissioned by JSGO. By last fiscal year, we had developed a simulator that models the fuel fabrication process and generates data simulating in-process material inventory/flow together with their measurement data. In addition, we developed a verification evaluation system that calculates various statistics from the simulation data and conducts statistical tests such as NRTA (near-real-time accountancy) in order to verify the adequacy of material accountancy for the fabrication process. We are currently investigating the adequacy of the evaluation itself and the effects on the evaluation of changing various process factors, including unmeasured inventories, as well as the adequacy of the current safeguards approach. In the presentation, we explain the developed system configuration, the calculation method of the simulation, etc., and demonstrate some examples of the simulated material flow in the fabrication process and a part of the analytical results. (author)
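
A toy illustration of the kind of near-real-time accountancy (NRTA) test this record alludes to is a cumulative material-unaccounted-for (CUMUF) check: the running sum of per-period MUF values is compared against a multiple of the accumulated measurement uncertainty. The MUF values, the sigma, and the k=3 decision rule below are illustrative assumptions, not the plant's actual procedure.

```python
import math

def cumuf_test(mufs, sigma_per_period, k=3.0):
    """Toy CUMUF test: alarm when the running MUF sum exceeds k
    standard deviations of the accumulated measurement uncertainty
    (independent per-period errors, so sigma grows with sqrt(periods))."""
    cumuf = 0.0
    alarms = []
    for i, muf in enumerate(mufs):
        cumuf += muf
        threshold = k * sigma_per_period * math.sqrt(i + 1)
        alarms.append(abs(cumuf) > threshold)
    return cumuf, alarms

# A small persistent loss hides in the noise at first, but the
# cumulative sum eventually crosses the growing threshold.
cum, flags = cumuf_test([0.4] * 12, sigma_per_period=0.3)
```

With these invented numbers the alarm first fires in the sixth balance period, once the cumulative 0.4-unit losses outgrow the sqrt-scaled uncertainty.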

  9. Nebraska data collection.

    Science.gov (United States)

    2015-12-01

    Automated pavement performance data collection is a method that uses advanced technology to collect detailed road surface distress information at traffic speed. Agencies are driven to use automated survey techniques to enhance or replace their cu...

  10. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of
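
The mapping recommendations described above can be pictured, in a greatly simplified form, as ranking target-ontology properties by string similarity to each source column and letting the user confirm or correct the top suggestion. The ontology property names below are hypothetical, and this heuristic is a stand-in for, not a description of, Karma's actual learning-based approach.

```python
from difflib import SequenceMatcher

# Hypothetical target-ontology property names (not from any real ontology).
ONTOLOGY_PROPERTIES = ["waterTemperature", "siteName", "sampleDate", "phLevel"]

def recommend_mappings(columns, properties=ONTOLOGY_PROPERTIES, top=2):
    """For each source column, rank ontology properties by string
    similarity; a user would confirm or override the top suggestion."""
    recs = {}
    for col in columns:
        scored = sorted(
            properties,
            key=lambda p: SequenceMatcher(None, col.lower(), p.lower()).ratio(),
            reverse=True,
        )
        recs[col] = scored[:top]
    return recs

recs = recommend_mappings(["water_temp", "site", "date_sampled"])
```

Real systems combine many more signals (data values, types, learned models from past mappings), which is why an interactive correction loop remains essential.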

  11. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One technology that has seen substantial development is the solar powered automatic shrimp feeding system. Solar power, as a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic manner also reduces manpower. The researchers believe an automatic shrimp feeding system may help solve problems in manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and then runs as a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer and controls the activation or termination of the electrical loads; it is powered by a solar panel outputting electrical power, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.
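
The timer-driven operation described above can be sketched as computing the activation times within the timer window. The 10-hour window comes from the abstract; the 2-hour interval and the start time are assumed purely for illustration.

```python
from datetime import datetime, timedelta

def feeding_schedule(start, window_hours=10, interval_hours=2):
    """Toy model of an interval timer: return the activation times
    falling within the timer window (endpoints inclusive)."""
    times = []
    t = start
    end = start + timedelta(hours=window_hours)
    while t <= end:
        times.append(t)
        t += timedelta(hours=interval_hours)
    return times

# Assumed example: window opens at 06:00, feeder fires every 2 hours.
sched = feeding_schedule(datetime(2015, 6, 1, 6, 0))
```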

  12. Research on the Method of Big Data Collecting, Storing and Analyzing of Tongue Diagnosis System

    Science.gov (United States)

    Chen, Xiaowei; Wu, Qingfeng

    2018-03-01

    This paper analyzes the contents of the clinical tongue diagnosis data of TCM (Traditional Chinese Medicine) and puts forward a method to collect, store and analyze the clinical data of tongue diagnosis. Under the guidance of the TCM theory of syndrome differentiation and treatment, this method builds on Hadoop, a distributed computing framework with strong scalability, and integrates functions for the analysis and conversion of big data from clinical tongue diagnosis. At the same time, the consistency, scalability and security of the big data of tongue diagnosis are ensured.

  13. Steps towards single source--collecting data about quality of life within clinical information systems.

    Science.gov (United States)

    Fritz, Fleur; Ständer, Sonja; Breil, Bernhard; Dugas, Martin

    2010-01-01

    Information about the quality of life of patients being treated in routine medical care is important for the attending physician. These data are also needed in research, for example to evaluate the therapy and the course of the disease. Skin diseases in particular often negatively affect the quality of life. We therefore aimed to design a concept to collect such data during treatment and to use it for both medical care and research in the setting of dermatology. We performed a workflow analysis and implemented a designated form using the tools of the local clinical information system. Quality of life data are now collected within the clinical information system during treatment and are used for discharge letters and progress overviews as well as for research on the treatment and course of the disease. This concept, which contributes to the single-source approach, was feasible within dermatology and is ready to be expanded into other domains.

  14. Telescience Data Collection Radar

    National Research Council Canada - National Science Library

    Beckner, Frederick

    2000-01-01

    Report developed under SBIR contract for topic AF99-258. The feasibility of developing a telescience data collection radar to reduce the cost of gathering aircraft signature data for noncooperative identification programs is investigated...

  15. A mobile field-work data collection system for the wireless era of health surveillance

    Directory of Open Access Journals (Sweden)

    Marianne Forsell

    2011-02-01

    Full Text Available In many countries or regions the capacity of health care resources is below the needs of the population, and new approaches to health surveillance are needed. Innovative projects utilizing wireless communication technology contribute reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on field-work experiences and administrative requirements. The Smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components and a SQL Server 2005 database on a Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field-workers considered the applications described here user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from field-work data registration to analysis, and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  16. 76 FR 58301 - Proposed Extension of Existing Information Collection; Automatic Fire Sensor and Warning Device...

    Science.gov (United States)

    2011-09-20

    ... Information Collection; Automatic Fire Sensor and Warning Device Systems; Examination and Test Requirements ACTION: Notice of request for public comments. SUMMARY: The Mine Safety and Health Administration (MSHA... public comment version of this information collection package. FOR FURTHER INFORMATION CONTACT: Roslyn B...

  17. Research on wireless communication technology based on automatic logistics system of welder

    Directory of Open Access Journals (Sweden)

    Sun Xuan

    2018-01-01

    Full Text Available In order to meet the requirements of high real-time performance and high stability of data transmission in an automatic welding system, the RTU data format and a real-time communication mechanism are adopted in this system. In the automatic logistics system, Ethernet and wireless Wi-Fi technology link the palletizer, the stacker and the AGV cart, so that the palletizer automatically picks up the goods, the AGV cart automatically delivers them, and the stacker automatically moves them into and out of the three-dimensional warehouse.
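
The abstract names an "RTU data format" without giving details. As a hedged illustration only, the sketch below frames data in the style of Modbus RTU (slave address, function code, payload, CRC-16 appended low byte first), which is one widespread RTU convention and not necessarily the format used in this welding system.

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16 as used in Modbus RTU frames: reflected polynomial
    0xA001, initial value 0xFFFF, no final XOR."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def make_rtu_frame(addr: int, func: int, payload: bytes) -> bytes:
    """Assemble an RTU-style frame: address, function, payload, CRC."""
    body = bytes([addr, func]) + payload
    crc = crc16_modbus(body)
    return body + bytes([crc & 0xFF, crc >> 8])  # CRC low byte first

# Hypothetical frame: device 1, function 4, three payload bytes.
frame = make_rtu_frame(1, 4, bytes([0x02, 0x00, 0x64]))
```

The CRC lets the receiver detect corrupted frames on a noisy factory link, which is the "high stability" part of the requirement.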

  18. Research on wireless communication technology based on automatic logistics system of welder

    OpenAIRE

    Sun Xuan; Wang Zhi-yong; Ma Zhe-dong

    2018-01-01

    In order to meet the requirements of high real-time performance and high stability of data transmission in an automatic welding system, the RTU data format and a real-time communication mechanism are adopted in this system. In the automatic logistics system, Ethernet and wireless Wi-Fi technology link the palletizer, the stacker and the AGV cart, so that the palletizer automatically picks up the goods, the AGV cart automatically delivers them, and the stacker automatically moves them into and out of the three-dimensional warehouse.

  19. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system.

    Science.gov (United States)

    Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by the need to accurately and efficiently collect and store multiple streams of data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed, called Mongoose 2000, that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available, and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.

  20. Data collection in a live mass casualty incident simulation: automated RFID technology versus manually recorded system.

    Science.gov (United States)

    Ingrassia, Pier Luigi; Carenzo, Luca; Barra, Federico Lorenzo; Colombo, Davide; Ragazzoni, Luca; Tengattini, Marco; Prato, Federico; Geddo, Alessandro; Della Corte, Francesco

    2012-02-01

    To demonstrate the applicability and reliability of a radio frequency identification (RFID) system to collect data during a live exercise. A rooftop collapse of a crowded building was simulated. Fifty-three volunteers were trained to perform as smart victims, simulating clinical conditions, using dynamic data cards, and capturing delay times and triage codes. Every victim was also equipped with an RFID tag. An RFID antenna was placed at the entrance of the advanced medical post (AMP) and of the emergency department (ED) and recorded casualties entering the hospital. A total of 12 victims entered the AMP and 31 victims were directly transferred to the ED. 100% (12 of 12 and 31 of 31) of the time cards reported a manually written hospital admission time. No failures occurred in tag reading or data transfers. A correlation analysis between the two methods, plotting the paired RFID and manual times, resulted in r=0.977 for the AMP and r=0.986 for the ED, with a P value of less than 0.001. We confirmed the applicability of the RFID system to the collection of time delays. Its use should be investigated in every aspect of data collection (triage, treatments) during a disaster exercise.
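
The correlation analysis reported above, plotting paired RFID-recorded and manually recorded times, can be reproduced in miniature with a plain Pearson coefficient. The paired admission times below are invented example data, not the study's measurements.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired admission times (minutes since exercise start):
rfid   = [12.0, 18.5, 25.0, 33.0, 41.5]
manual = [12.5, 18.0, 26.0, 33.5, 41.0]
r = pearson_r(rfid, manual)
```

Small per-victim discrepancies of under a minute leave the correlation close to 1, which is the pattern the study's r=0.977 and r=0.986 reflect.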

  1. Development of a system for data collection and processing by telemetry

    International Nuclear Information System (INIS)

    Tavares Filho, R.F.

    1983-01-01

    The environmental impact of the nuclear industry is, obviously, a matter of the greatest concern. On account of that, a large number of parameters must be recorded over long periods with a high level of confidence. The site selection of Brazilian nuclear power plants is conducted under this philosophy. Data acquisition of ocean-related parameters in remote, unexplored areas is rather demanding. In order to avoid a series of problems with data collection and processing, a telemetric system concept was developed. The electronic aspects of this system are mainly emphasized. For this purpose the system is split into two sub-systems: the former for data collection, signal conditioning and transmission, and the latter for signal reception and treatment. All parts of the system were tested in the laboratory before an integrated check, with encouraging results. The whole equipment was installed one year ago at the seashore region of Peruibe, state of Sao Paulo, and has been operating adequately ever since. (Author) [pt

  2. A continuous flow from sample collection to data acceptability determination using an automated system

    International Nuclear Information System (INIS)

    Fisk, J.F.; Leasure, C.; Sauter, A.D.

    1993-01-01

    In its role as regulator, EPA is the recipient of enormous reams of analytical data, especially within the Superfund Program. In order to better manage the volume of paper that comes in daily, Superfund has required its laboratories to deliver the data contained on reporting forms also on diskette, for uploading into databases for various purposes, such as checking contractual compliance, tracking quality assurance parameters and, ultimately, reviewing the data by computer. This last area, automated review of the data, has generated programs that are not necessarily appropriate for use by clients other than Superfund. Such is the case with Los Alamos National Laboratory's Environmental Chemistry Group and its emerging subcontractor community, designed to meet the needs of the remedial action program at LANL. LANL is in the process of implementing an automated system that will be used from the planning stage of sample collection to the production of a project-specific report on analytical data quality. Included are electronic scheduling and tracking of samples; data entry, checking and transmission; data assessment and qualification for use; and report generation that ties the analytical data quality back to the performance criteria defined prior to sample collection. Industry-standard products will be used (e.g., ORACLE, Microsoft Excel) to ensure support for users, prevent dependence on proprietary software, and protect LANL's investment for the future

  3. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied to extract adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% of drug label changes were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were otherwise undetectable. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
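
The abstract's "statistical approach... for signal detection" is not specified. One common disproportionality statistic in pharmacovigilance is the proportional reporting ratio (PRR), sketched here with invented report counts; this is an illustration of the general technique, not the study's actual method.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table:
    a = reports of the drug with the event of interest,
    b = reports of the drug without the event,
    c = reports of all other drugs with the event,
    d = reports of all other drugs without the event."""
    return (a / (a + b)) / (c / (c + d))

# Invented counts: the event appears in 3% of the drug's reports
# but only ~0.1% of all other reports.
signal = prr(a=30, b=970, c=100, d=98900)
```

A frequently cited screening rule flags a potential signal when PRR is at least 2 with three or more cases, which these toy counts easily exceed.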

  4. The AmeriFlux Data Activity and Data System: An Evolving Collection of Data Management Techniques, Tools, Products and Services

    Energy Technology Data Exchange (ETDEWEB)

    Boden, Thomas A [ORNL; Krassovski, Misha B [ORNL; Yang, Bai [ORNL

    2013-01-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the U.S. Department of Energy and international climate change science since 1982. Over this period, climate change science has expanded from research focusing on basic understanding of geochemical cycles, particularly the carbon cycle, to integrated research addressing climate change impacts, vulnerability, adaptation, and mitigation. Interests in climate change data and information worldwide have grown remarkably and, as a result, so have demands and expectations for CDIAC's data systems. To meet the growing demands, CDIAC's strategy has been to design flexible data systems using proven technologies blended with new, evolving technologies and standards. CDIAC development teams are multidisciplinary and include computer science and information technology expertise, but also scientific expertise necessary to address data quality and documentation issues and to identify data products and system capabilities needed by climate change scientists. CDIAC has learned there is rarely a single commercial tool or product readily available to satisfy long-term scientific data system requirements (i.e., one size does not fit all and the breadth and diversity of environmental data are often too complex for easy use with commercial products) and typically deploys a variety of tools and data products in an effort to provide credible data freely to users worldwide. Like many scientific data management applications, CDIAC's data systems are highly customized to satisfy specific scientific usage requirements (e.g., developing data products specific for model use) but are also designed to be flexible and interoperable to take advantage of new software engineering techniques, standards (e.g., metadata standards) and tools and to support future Earth system data efforts (e.g., ocean acidification). CDIAC has provided data management

  5. An automatically tuning intrusion detection system.

    Science.gov (United States)

    Yu, Zhenwei; Tsai, Jeffrey J P; Weigert, Thomas

    2007-04-01

    An intrusion detection system (IDS) is a security layer used to detect ongoing intrusive activities in information systems. Traditionally, intrusion detection relies on extensive knowledge of security experts, in particular, on their familiarity with the computer system to be protected. To reduce this dependence, various data-mining and machine learning techniques have been deployed for intrusion detection. An IDS usually works in a dynamically changing environment, which forces continuous tuning of the intrusion detection model in order to maintain sufficient performance. The manual tuning process required by current systems depends on the system operators working out the tuning solution and integrating it into the detection model. In this paper, an automatically tuning IDS (ATIDS) is presented. The proposed system automatically tunes the detection model on-the-fly according to the feedback provided by the system operator when false predictions are encountered. The system is evaluated using the KDDCup'99 intrusion detection dataset. Experimental results show that the system achieves up to 35% improvement in terms of misclassification cost when compared with a system lacking the tuning feature. If only 10% of false predictions are used to tune the model, the system still achieves about 30% improvement. Moreover, when tuning is not delayed too long, the system can achieve about 20% improvement with only 1.3% of the false predictions used to tune the model. The results of the experiments show that a practical system can be built based on ATIDS: system operators can focus on verification of predictions with low confidence, as only those predictions determined to be false will be used to tune the detection model.
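
A minimal way to picture feedback-driven tuning, as a greatly simplified stand-in for ATIDS's actual model updates, is an alert threshold that is nudged whenever the operator flags a false prediction: raised after a false positive, lowered after a false negative. The step size and values below are assumptions for illustration.

```python
def tune_threshold(threshold, feedback, step=0.02):
    """Nudge a detector's alert threshold from operator feedback:
    a false positive means we alerted too eagerly (raise threshold),
    a false negative means we missed an attack (lower threshold)."""
    if feedback == "false_positive":
        threshold += step
    elif feedback == "false_negative":
        threshold -= step
    return min(max(threshold, 0.0), 1.0)  # keep within [0, 1]

# Simulated operator feedback on three mispredictions:
t = 0.5
for fb in ["false_positive", "false_positive", "false_negative"]:
    t = tune_threshold(t, fb)
```

The point of the paper's design is that only mispredictions, a small fraction of all traffic, are needed to keep such a model calibrated.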

  6. Designing and collecting data for a longitudinal study: the Sleman Health and Demographic Surveillance System (HDSS).

    Science.gov (United States)

    Dewi, Fatwa S T; Choiriyyah, Ifta; Indriyani, Citra; Wahab, Abdul; Lazuardi, Lutfan; Nugroho, Agung; Susetyowati, Susetyowati; Harisaputra, Rosalia K; Santi, Risalia; Lestari, Septi K; Ng, Nawi; Hakimi, Mohammad; Josef, Hari K; Utarini, Adi

    2017-07-01

    This paper describes the methodological considerations of developing an urban Health and Demographic Surveillance System (HDSS), in the Sleman District of Yogyakarta, Indonesia. 1) The Sleman District was selected because it is mostly an urban area. 2) The minimum sample size was calculated to measure infant mortality as the key variable and resulted in a sample of 4942 households. A two-stage cluster sampling procedure with probability proportionate to size was applied; first, 216 Censuses Blocks (CBs) were selected, and second, 25 households in each CB were selected. 3) A baseline survey was started in 2015, and collected data on demographic and economic characteristics and verbal autopsy (VA); the 2nd cycle collected updated demographic data, VA, type of morbidity (communicable and non-communicable diseases, disability and injury) and health access. 4) The data were collected at a home visit through a Computer-Assisted Personal Interview (CAPI) on a tablet device, and the data were transferred to the server through the Internet. 5) The quality control consisted of spot-checks of 5% of interviews to control for adherence to the protocol, re-checks to ensure the validity of the interview, and computer-based data cleaning. 6) A utilization system was designed for policy-makers (government) and researchers. In total, 5147 households participated in the baseline assessment in 2015, and 4996 households participated in the second cycle in 2016 (97.0% response rate). Development of an urban HDSS is possible and is beneficial in providing data complementary to the existing demographic and health information system at local, national and global levels.
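
The first sampling stage above selects census blocks with probability proportionate to size (PPS). A standard systematic PPS selection can be sketched as follows; the block sizes and sample count are illustrative values, not the Sleman data.

```python
import random

def pps_systematic(sizes, n):
    """Systematic PPS selection: walk the cumulative size axis and
    pick the unit containing each of n evenly spaced points, so a
    unit's selection probability is proportional to its size."""
    total = sum(sizes)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    chosen, cum, i = [], 0, 0
    for idx, size in enumerate(sizes):
        cum += size
        while i < n and points[i] <= cum:
            chosen.append(idx)
            i += 1
    return chosen

random.seed(1)
# Hypothetical household counts for six census blocks; select three.
blocks = pps_systematic([40, 120, 80, 200, 60, 100], n=3)
```

Note that a block larger than the sampling interval can be selected more than once; surveys typically handle this by splitting such blocks beforehand.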

  7. Digital signal processing for CdTe detectors using VXIbus data collection systems

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, Daiji; Takahashi, Hiroyuki; Kurahashi, Tomohiko; Iguchi, Tetsuo; Nakazawa, Masaharu

    1996-07-01

    Recently, fast signal digitizing techniques have been developed, making it possible to obtain signal waveforms over very short time periods. In this paper, we analyzed each measured pulse digitized by an apparatus of this kind and tried to improve the energy resolution of a CdTe semiconductor detector. The resulting energy resolution for the {sup 137}Cs 662 keV photopeak was 13 keV. We also developed a fast data collection system based on the VXIbus standard, with which a counting rate of about 50 counts per second was obtained. (author)
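
Pulse-by-pulse analysis of digitized waveforms can be illustrated with the simplest possible estimator: the pulse amplitude as waveform maximum minus the pre-trigger baseline. Real systems apply digital shaping filters (e.g., trapezoidal shaping) to do better; the waveform below is synthetic ADC data, not from the paper.

```python
def pulse_height(samples, baseline_n=8):
    """Estimate pulse amplitude from a digitized waveform: average the
    first baseline_n pre-trigger samples and subtract that baseline
    from the waveform maximum."""
    baseline = sum(samples[:baseline_n]) / baseline_n
    return max(samples) - baseline

# Synthetic waveform: flat baseline near ADC code 100, then a pulse
# rising to 647 with a slow decay (as a CdTe preamp signal might).
wave = [100] * 8 + [180, 420, 610, 647, 640, 628, 615, 604]
h = pulse_height(wave)
```

Histogramming such amplitudes over many pulses yields the energy spectrum from which a photopeak resolution is measured.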

  8. A Web-Based Tool for Automatic Data Collection, Curation, and Visualization of Complex Healthcare Survey Studies including Social Network Analysis

    Directory of Open Access Journals (Sweden)

    José Alberto Benítez

    2017-01-01

    Full Text Available There is great concern nowadays regarding alcohol consumption and drug abuse, especially among young people. By analyzing the social environment in which these adolescents are immersed, together with a series of measures determining alcohol abuse risk or personal situation and perception obtained through questionnaires such as AUDIT, FAS, KIDSCREEN and others, it is possible to gain insight into the current situation of a given individual regarding his/her consumption behavior. Achieving this analysis, however, requires tools that ease the process of questionnaire creation, data gathering, curation and representation, and later analysis and visualization for the user. This research presents the design and construction of a web-based platform able to facilitate each of these processes by integrating the different phases into an intuitive system with a graphical user interface that hides the complexity underlying each of the questionnaires and techniques used, presenting the results in a flexible and visual way and avoiding any manual handling of data during the process. Advantages of this approach are shown and compared to the previous situation, where some of the tasks were accomplished by time-consuming and error-prone manipulations of data.
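
Scoring one of the questionnaires the platform handles can be sketched simply. AUDIT sums ten items scored 0-4, and a total of 8 or more is a commonly used cut-off for hazardous drinking; the answer vector below is invented, and this sketch ignores item-specific details (two AUDIT items only allow scores of 0, 2 or 4).

```python
def audit_score(answers):
    """Sum the ten AUDIT item scores (each between 0 and 4)."""
    if len(answers) != 10 or not all(0 <= a <= 4 for a in answers):
        raise ValueError("AUDIT expects ten item scores between 0 and 4")
    return sum(answers)

HAZARDOUS_CUTOFF = 8  # widely used screening threshold

# Invented respondent answers:
score = audit_score([2, 1, 1, 0, 0, 1, 0, 1, 2, 0])
flagged = score >= HAZARDOUS_CUTOFF
```

Automating exactly this kind of scoring and validation is what removes the "manual handling of data" the abstract warns against.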

  9. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    Science.gov (United States)

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

    Historically, marine ecologists have lacked efficient tools that are capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling, and better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.
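
The replication finding above (10-20 transects for reliable results) can be connected to a textbook sample-size calculation for simple random sampling: the number of transects needed so the standard error of mean cover falls below a target is n = (s / SE)². The pilot standard deviation and target standard error below are assumed values, not the study's.

```python
import math

def transects_needed(sample_sd, target_se):
    """Transects required so that SE = s / sqrt(n) falls below the
    target, i.e. n = (s / SE)^2, rounded up (simple random sampling)."""
    return math.ceil((sample_sd / target_se) ** 2)

# Assumed pilot data: SD of 18% cover between transects, and a goal
# of estimating mean cover to within a 5-percentage-point SE.
n = transects_needed(sample_sd=18.0, target_se=5.0)
```

With these assumed numbers the answer lands inside the 10-20 range the study reports; spatial structuring, which the authors emphasize, pushes the true requirement up or down.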

  10. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy)

    Science.gov (United States)

    Tagliaferri, Luca; Kovács, György; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E.; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo

    2016-01-01

    Purpose: Aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. Material and methods: GEC-ESTRO (Groupe Européen de Curiethérapie – European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set) and the necessary COBRA software services as well as the peer reviewing of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task-group. Results: Eleven centers from 6 countries signed an agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA-Storage System (C-SS) is not time-consuming as, thanks to the use of “brokers”, data can be extracted directly from the single center's storage systems through a connection with “structured query language database” (SQL-DB), Microsoft Access®, FileMaker Pro®, or Microsoft Excel®. The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of “on-purpose data projection”. The C-SS architecture is privacy protecting because it will never make visible data that could identify an individual patient. This C-SS can also benefit from the so called “distributed learning” approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Conclusions: Setting up a consortium is a feasible and practicable tool in the creation of an international and multi-system data sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence the center's own data storing
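
The "on-purpose data projection" idea, exporting only the agreed fields while making patient identifiers non-identifying, can be sketched against a toy SQLite store. The table name, field names, and salted-hash scheme below are hypothetical illustrations, not COBRA's actual broker implementation.

```python
import hashlib
import sqlite3

def project_records(conn):
    """Project only the agreed fields out of a center's database and
    replace the patient identifier with a salted hash, so exported
    rows cannot identify an individual (same patient -> same token,
    preserving linkability across exports)."""
    rows = conn.execute("SELECT patient_id, dose_gy, fractions FROM treatments")
    out = []
    for pid, dose, fx in rows:
        token = hashlib.sha256(b"center-salt:" + str(pid).encode()).hexdigest()[:12]
        out.append({"subject": token, "dose_gy": dose, "fractions": fx})
    return out

# Toy center database with two treatment records:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE treatments (patient_id INTEGER, dose_gy REAL, fractions INTEGER)")
conn.executemany("INSERT INTO treatments VALUES (?, ?, ?)",
                 [(1, 30.0, 10), (2, 21.0, 7)])
records = project_records(conn)
```

Because only the projection leaves the center, this pattern also fits the "distributed learning" approaches the abstract mentions, where raw data never leave the collecting institution.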

  11. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy)

    Directory of Open Access Journals (Sweden)

    Luca Tagliaferri

    2016-08-01

    Full Text Available Purpose: Aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. Material and methods: GEC-ESTRO (Groupe Européen de Curiethérapie – European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set) and the necessary COBRA software services as well as the peer reviewing of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task-group. Results: Eleven centers from 6 countries signed an agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA-Storage System (C-SS) is not time-consuming as, thanks to the use of “brokers”, data can be extracted directly from the single center’s storage systems through a connection with “structured query language database” (SQL-DB), Microsoft Access®, FileMaker Pro®, or Microsoft Excel®. The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of “on-purpose data projection”. The C-SS architecture is privacy protecting because it will never make visible data that could identify an individual patient. This C-SS can also benefit from the so-called “distributed learning” approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Conclusions: Setting up a consortium is a feasible and practicable tool in the creation of an international and multi-system data sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence the center’s own

  12. ENT COBRA (Consortium for Brachytherapy Data Analysis): interdisciplinary standardized data collection system for head and neck patients treated with interventional radiotherapy (brachytherapy).

    Science.gov (United States)

    Tagliaferri, Luca; Kovács, György; Autorino, Rosa; Budrukkar, Ashwini; Guinot, Jose Luis; Hildebrand, Guido; Johansson, Bengt; Monge, Rafael Martìnez; Meyer, Jens E; Niehoff, Peter; Rovirosa, Angeles; Takàcsi-Nagy, Zoltàn; Dinapoli, Nicola; Lanzotti, Vito; Damiani, Andrea; Soror, Tamer; Valentini, Vincenzo

    2016-08-01

    Aim of the COBRA (Consortium for Brachytherapy Data Analysis) project is to create a multicenter group (consortium) and a web-based system for standardized data collection. GEC-ESTRO (Groupe Européen de Curiethérapie - European Society for Radiotherapy & Oncology) Head and Neck (H&N) Working Group participated in the project and in the implementation of the consortium agreement, the ontology (data-set) and the necessary COBRA software services as well as the peer reviewing of the general anatomic site-specific COBRA protocol. The ontology was defined by a multicenter task-group. Eleven centers from 6 countries signed an agreement and the consortium approved the ontology. We identified 3 tiers for the data set: Registry (epidemiology analysis), Procedures (prediction models and DSS), and Research (radiomics). The COBRA-Storage System (C-SS) is not time-consuming as, thanks to the use of "brokers", data can be extracted directly from the single center's storage systems through a connection with "structured query language database" (SQL-DB), Microsoft Access(®), FileMaker Pro(®), or Microsoft Excel(®). The system is also structured to perform automatic archiving directly from the treatment planning system or afterloading machine. The architecture is based on the concept of "on-purpose data projection". The C-SS architecture is privacy protecting because it will never make visible data that could identify an individual patient. This C-SS can also benefit from the so called "distributed learning" approaches, in which data never leave the collecting institution, while learning algorithms and proposed predictive models are commonly shared. Setting up a consortium is a feasible and practicable tool in the creation of an international and multi-system data sharing system. COBRA C-SS seems to be well accepted by all involved parties, primarily because it does not influence the center's own data storing technologies, procedures, and habits. Furthermore, the method
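
The "broker" mechanism described above, which extracts a privacy-preserving projection directly from a center's own SQL storage, can be pictured with a small Python sketch using an in-memory SQLite stand-in. Table and column names here are hypothetical illustrations, not the actual COBRA ontology:

```python
import sqlite3

# Hypothetical whitelist implementing "on-purpose data projection": the broker
# only ever selects these columns, never identifying fields.
PROJECTED_FIELDS = ("tumor_site", "dose_gy", "fractions", "outcome")

def broker_extract(conn):
    """Pull only the whitelisted projection from the local storage system."""
    query = "SELECT {} FROM treatments".format(", ".join(PROJECTED_FIELDS))
    return [dict(zip(PROJECTED_FIELDS, row)) for row in conn.execute(query)]

# Stand-in for a center's local database (schema is illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE treatments (
    patient_name TEXT, birth_date TEXT,  -- identifying: never projected
    tumor_site TEXT, dose_gy REAL, fractions INTEGER, outcome TEXT)""")
conn.execute("INSERT INTO treatments VALUES "
             "('A. B.', '1950-01-01', 'oropharynx', 30.0, 10, 'CR')")

records = broker_extract(conn)
```

The same whitelist-driven query could be pointed at any SQL-capable back end, which is what lets the broker sit in front of heterogeneous center-specific storage systems.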

  13. Development of automatic ultrasonic testing system and its application

    International Nuclear Information System (INIS)

    Oh, Sang Hong; Matsuura, Toshihiko; Iwata, Ryusuke; Nakagawa, Michio; Horikawa, Kohsuke; Kim, You Chul

    1997-01-01

    Radiographic testing (RT) is usually applied as the nondestructive test for detecting internal defects at the welded joints of a penstock. Where RT could not be applied, ultrasonic testing (UT) was performed instead. UT was generally carried out by manual scanning, with the inspection data recorded by the inspector on site; as a weak point, there were no objective inspection records corresponding to the films of RT. An automatic ultrasonic testing system capable of automatic scanning and automatic recording was therefore needed, and such a system was developed. This paper presents test results obtained with the newly developed automatic ultrasonic testing system on the circumferential welded joints of a penstock at a site.

  14. Feasibility Study for Ballet E-Learning: Automatic Composition System for Ballet "Enchainement" with Online 3D Motion Data Archive

    Science.gov (United States)

    Umino, Bin; Longstaff, Jeffrey Scott; Soga, Asako

    2009-01-01

    This paper reports on "Web3D dance composer" for ballet e-learning. Elementary "petit allegro" ballet steps were enumerated in collaboration with ballet teachers, digitally acquired through 3D motion capture systems, and categorised into families and sub-families. Digital data was manipulated into virtual reality modelling language (VRML) and fit…

  15. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice the amount of medical image data produced has increased strongly in the last decade. In this context organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework based on a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)
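
The Fourier-descriptor shape features mentioned in this record can be illustrated with a minimal pure-Python discrete-Fourier-transform sketch. This is a generic textbook construction, not the paper's actual feature set or normalization:

```python
import cmath

def fourier_descriptors(contour, n_desc=4):
    """Translation- and scale-invariant Fourier descriptors of a closed
    2D contour given as an ordered list of (x, y) boundary points."""
    z = [complex(x, y) for x, y in contour]  # boundary as a complex signal
    N = len(z)
    # Discrete Fourier transform of the boundary signal.
    coeffs = [sum(z[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                  for n in range(N)) for k in range(N)]
    # Drop coefficient 0 (translation), divide by |coefficient 1| (scale),
    # and keep magnitudes only (rotation / start-point invariance).
    scale = abs(coeffs[1]) or 1.0
    return [abs(c) / scale for c in coeffs[2:2 + n_desc]]

# A square contour and the same square shifted and doubled in size
# yield identical descriptors, which is the point of the normalization.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
big = [(2 * x + 5, 2 * y - 3) for x, y in square]
```

In a real pipeline these descriptors would be computed from segmented organ boundaries and fed to the SVM as shape features.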

  16. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
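
To give a flavor of the greedy methods the book surveys, a toy pairwise (2-way) test-data generator can be sketched in a few lines of Python. This is an illustrative sketch of the general greedy idea, not one of the book's algorithms:

```python
from itertools import combinations, product

def pairwise_tests(parameters):
    """Greedily build a small test suite covering every value pair of every
    two parameters at least once (a 2-way covering array)."""
    k = len(parameters)
    uncovered = {((i, vi), (j, vj))
                 for i, j in combinations(range(k), 2)
                 for vi in parameters[i] for vj in parameters[j]}
    suite = []
    while uncovered:
        # Pick the candidate test covering the most still-uncovered pairs.
        best = max(product(*parameters),
                   key=lambda t: sum(((i, t[i]), (j, t[j])) in uncovered
                                     for i, j in combinations(range(k), 2)))
        suite.append(best)
        uncovered -= {((i, best[i]), (j, best[j]))
                      for i, j in combinations(range(k), 2)}
    return suite

# 3 parameters x 2 values: exhaustive testing needs 8 tests,
# pairwise coverage needs far fewer.
suite = pairwise_tests([["on", "off"], ["ipv4", "ipv6"], ["tcp", "udp"]])
```

Enumerating all candidate tests, as done here, is only feasible for toy inputs; practical tools use the smarter constructions the book describes.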

  17. Artefact in Physiological Data Collected from Patients with Brain Injury: Quantifying the Problem and Providing a Solution Using a Factorial Switching Linear Dynamical Systems Approach.

    Science.gov (United States)

    Georgatzis, Konstantinos; Lal, Partha; Hawthorne, Christopher; Shaw, Martin; Piper, Ian; Tarbert, Claire; Donald, Rob; Williams, Christopher K I

    2016-01-01

    High-resolution, artefact-free and accurately annotated physiological data are desirable in patients with brain injury both to inform clinical decision-making and for intelligent analysis of the data in applications such as predictive modelling. We have quantified the quality of annotation surrounding artefactual events and propose a factorial switching linear dynamical systems (FSLDS) approach to automatically detect artefact in physiological data collected in the neurological intensive care unit (NICU). Retrospective analysis of the BrainIT data set to discover potential hypotensive events corrupted by artefact and identify the annotation of associated clinical interventions. Training of an FSLDS model on clinician-annotated artefactual events in five patients with severe traumatic brain injury. In a subset of 187 patients in the BrainIT database, 26.5 % of potential hypotensive events were abandoned because of artefactual data. Only 30 % of these episodes could be attributed to an annotated clinical intervention. As assessed by the area under the receiver operating characteristic curve metric, FSLDS model performance in automatically identifying the events of blood sampling, arterial line damping and patient handling was 0.978, 0.987 and 0.765, respectively. The influence of artefact on physiological data collected in the NICU is a significant problem. This pilot study using an FSLDS approach shows real promise and is under further development.
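
The AUROC figures quoted above can be computed from detector scores with a small rank-based helper. This is a generic sketch of the metric, unrelated to the authors' code:

```python
def auroc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a random positive outscores a random negative,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: detector scores and clinician labels (1 = artefact).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
labels = [1, 1, 0, 1, 0, 0]
```

A value of 1.0 means the detector ranks every artefactual event above every clean one; 0.5 is chance level.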

  18. Using global positioning systems in health research a practical approach to data collection and processing

    DEFF Research Database (Denmark)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-01-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality, allow more-meaningful comparisons across studies and populations, and advance this field more rapidly. This paper aims to take researchers, who are considering using GPS devices in their research, through device-selection criteria, device settings, participant data collection, data cleaning, data processing, and integration of data into GIS...
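
One cleaning step that such guidelines typically cover, dropping fixes that imply impossible travel speeds, can be sketched as follows. The threshold here is illustrative, not a recommendation from the paper:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def drop_speed_outliers(fixes, max_speed_ms=40.0):
    """Remove GPS fixes whose implied speed from the last kept fix is
    implausible. fixes = [(t_seconds, lat, lon), ...] in time order."""
    kept = [fixes[0]]
    for t, lat, lon in fixes[1:]:
        t0, lat0, lon0 = kept[-1]
        dt = t - t0
        if dt > 0 and haversine_m(lat0, lon0, lat, lon) / dt <= max_speed_ms:
            kept.append((t, lat, lon))
    return kept

# The third fix jumps ~130 km in 10 s and is discarded as an outlier.
track = [(0, 55.0, 12.0), (10, 55.0003, 12.0),
         (20, 56.0, 13.0), (30, 55.0006, 12.0)]
clean = drop_speed_outliers(track)
```

Comparing against the last *kept* fix, rather than the previous raw fix, prevents one bad point from also invalidating the good point that follows it.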

  19. Profiling animal toxicants by automatically mining public bioassay data: a big data approach for computational toxicology.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    Full Text Available In vitro bioassays have been developed and are currently being evaluated as potential alternatives to traditional animal toxicity models. Already, the progress of high throughput screening techniques has resulted in an enormous amount of publicly available bioassay data having been generated for a large collection of compounds. When a compound is tested using a collection of various bioassays, all the testing results can be considered as providing a unique bio-profile for this compound, which records the responses induced when the compound interacts with different cellular systems or biological targets. Profiling compounds of environmental or pharmaceutical interest using useful toxicity bioassay data is a promising method to study complex animal toxicity. In this study, we developed an automatic virtual profiling tool to evaluate potential animal toxicants. First, we automatically acquired all PubChem bioassay data for a set of 4,841 compounds with publicly available rat acute toxicity results. Next, we developed a scoring system to evaluate the relevance between these extracted bioassays and animal acute toxicity. Finally, the top ranked bioassays were selected to profile the compounds of interest. The resulting response profiles proved to be useful to prioritize untested compounds for their animal toxicity potentials and form a potential in vitro toxicity testing panel. The protocol developed in this study could be combined with structure-activity approaches and used to explore additional publicly available bioassay datasets for modeling a broader range of animal toxicities.
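
The scoring step, ranking bioassays by how well their actives coincide with compounds known to be toxic in vivo, might look like this simple balanced-agreement sketch. The authors' actual scoring function is not reproduced here; this is a generic enrichment-style stand-in:

```python
def relevance_score(tested, active, toxic):
    """Score one bioassay against animal-toxicity labels as balanced
    accuracy: average of sensitivity and specificity of 'active in assay'
    as a predictor of 'toxic in vivo'. Arguments are sets of compound ids."""
    pos = tested & toxic
    neg = tested - toxic
    if not pos or not neg:
        return 0.0
    sens = len(active & pos) / len(pos)
    spec = len((tested - active) & neg) / len(neg)
    return (sens + spec) / 2

def rank_assays(assays, toxic):
    """assays: {name: (tested_ids, active_ids)}; returns names best-first."""
    return sorted(assays, key=lambda a: relevance_score(*assays[a], toxic),
                  reverse=True)

toxic = {"c1", "c2", "c3"}
assays = {
    "assayA": ({"c1", "c2", "c3", "c4"}, {"c1", "c2", "c3"}),  # aligned
    "assayB": ({"c1", "c2", "c3", "c4"}, {"c4"}),              # anti-aligned
}
order = rank_assays(assays, toxic)
```

The top-ranked assays under such a score would then form the in vitro profiling panel for untested compounds.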

  20. Georeferenced and secure mobile health system for large scale data collection in primary care.

    Science.gov (United States)

    Sa, Joao H G; Rebelo, Marina S; Brentani, Alexandra; Grisi, Sandra J F E; Iwaya, Leonardo H; Simplicio, Marcos A; Carvalho, Tereza C M B; Gutierrez, Marco A

    2016-10-01

    Mobile health consists of applying mobile devices and communication capabilities to expand the coverage and improve the effectiveness of health care programs. The technology is particularly promising for developing countries, in which health authorities can take advantage of the flourishing mobile market to provide adequate health care to underprivileged communities, especially primary care. In Brazil, the Primary Care Information System (SIAB) receives primary health care data from all regions of the country, creating a rich database for health-related action planning. Family Health Teams (FHTs) collect this data in periodic visits to families enrolled in governmental programs, following an acquisition procedure that involves filling in paper forms. This procedure compromises the quality of the data provided to health care authorities and slows down the decision-making process. To develop a mobile system (GeoHealth) that should address and overcome the aforementioned problems and deploy the proposed solution in a wide underprivileged metropolitan area of a major city in Brazil. The proposed solution comprises three main components: (a) an Application Server, with a database containing family health conditions; and two clients, (b) a Web Browser running visualization tools for management tasks, and (c) a data-gathering device (smartphone) to register and to georeference the family health data. A data security framework was designed to ensure the security of data, which was stored locally and transmitted over public networks. The system was successfully deployed at six primary care units in the city of Sao Paulo, where a total of 28,324 families/96,061 inhabitants are regularly followed up by government health policies. The health conditions observed in the population covered were: diabetes in 3.40%, hypertension (age >40) in 23.87% and tuberculosis in 0.06%. This estimated prevalence has enabled FHTs to set clinical appointments proactively, with the aim of

  1. Usability of Low-Cost Android Data Collection System for Community-Based Participatory Research.

    Science.gov (United States)

    Salihu, Hamisu M; Salinas-Miranda, Abraham; Turner, DeAnne; King, Lindsey; Paothong, Arnut; Austin, Deborah; Berry, Estrellita Lo

    2016-01-01

    Android tablet computers can be valuable tools for data collection, but their usability has not been evaluated in community-based participatory research (CBPR). This article examines the usability of a low-cost bilingual touchscreen computerized survey system using Android tablets, piloted with a sample of 201 community residents in Tampa, Florida, from November 2013 to March 2014. Needs assessment questions were designed with the droidSURVEY software, and deployed using Android tablet computers. In addition, participants were asked questions about system usability. The mean system usability was 77.57 ± 17.66 (range, 0-100). The mean completion time for taking the 63 survey questions in the needs assessment was 23.11 ± 9.62 minutes. The survey completion rate was optimal (100%), with only 6.34% missingness per variable. We found no sociodemographic differences in usability scores. Our findings indicate that Android tablets could serve as useful tools in CBPR studies.
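
The 0-100 usability figure reported above is consistent with the standard System Usability Scale (SUS); if that is the instrument used, the scoring works as follows. This is a sketch of the generic SUS formula, not the study's code:

```python
def sus_score(responses):
    """Standard System Usability Scale score from ten 1-5 Likert responses.
    Odd-numbered items are positively worded (score r - 1), even-numbered
    items negatively worded (score 5 - r); the 0-40 sum is scaled to 0-100."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based
                for i, r in enumerate(responses))
    return total * 2.5

# "Strongly agree" on positive items, "strongly disagree" on negative ones:
best = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # -> 100.0
```

A mean around 77.6, as reported, sits above the commonly cited SUS average of 68, indicating good perceived usability.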

  2. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program ''CRITEST'', written in the PL/1 language for the EC computer and intended for automatic processing of the results of radioimmunological research, has been elaborated. The program runs under the operating system of the EC computer in a 60 kb partition. A modified Aitken algorithm was used in compiling the program. The program was clinically approved for determining a number of hormones: CTH, T4, T3 and TSH. Automatic processing of radioimmunological research data on a computer simplifies the labour-intensive analysis and raises its accuracy.
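
A classic way to automate RIA dose calculation, fitting a logit-log calibration line to the standards and inverting it for unknown samples, can be sketched as below. This is a generic textbook method for illustration; the paper's modified Aitken algorithm is not reproduced:

```python
import math

def fit_logit_log(doses, b_over_b0):
    """Least-squares line of logit(B/B0) against log10(dose) for RIA standards."""
    xs = [math.log10(d) for d in doses]
    ys = [math.log(y / (1 - y)) for y in b_over_b0]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def dose_from_counts(b_over_b0, slope, intercept):
    """Invert the calibration line to get the dose of an unknown sample."""
    y = math.log(b_over_b0 / (1 - b_over_b0))
    return 10 ** ((y - intercept) / slope)

# Synthetic standards lying exactly on the line logit(B/B0) = -log10(d) + 1.
doses = [1.0, 10.0, 100.0]
b_b0 = [1 / (1 + math.exp(-(-1.0 * math.log10(d) + 1.0))) for d in doses]
slope, intercept = fit_logit_log(doses, b_b0)
```

With the calibration fitted once per run, each unknown's bound-fraction measurement maps directly to a hormone concentration.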

  3. Data collection on component malfunctions and failures of JET ICRH system

    International Nuclear Information System (INIS)

    Pinna, T.; Cambi, G.

    2007-01-01

    The objective of the activity was to collect and analyse data from the operating experience gained with the Ion Cyclotron Resonance Heating (ICRH) system of the Joint European Torus (JET), in order to enrich the data collection on failures of components used in fusion facilities. Alarms, failures and malfunctions that occurred during the years of operation from March 1996 to November 2005 were identified, including information on failure modes and, where possible, causes of the failures. In addition to failure and alarm events, data related to crowbar events were also collected. About 3400 events classified as alarms or failures related to specific components or sub-systems were identified by analysing the 25 hand-written logbooks made available by the ICRH operation staff. Information about the JET pulses in which the ICRH system was operated was extracted from the tick sheets covering the whole considered time interval: 20 hand-written tick sheets cover the period from March 1996 to mid-May 2003, while tick sheets recorded as Excel files cover the period from May 2003 to November 2005. Analysis of the tick sheets shows that the ICRH system was operated during about 12000 plasma pulses. Main statistical values, such as rates of alarms/failures and the corresponding standard errors and confidence intervals, were estimated. Failure rates of systems and components were evaluated both with respect to ICRH operation pulses and to operating days (days on which at least one ICRH module was requested to operate). Failure probabilities on demand were evaluated with respect to the number of pulses operated. Some of the results are the following: - The highest number of alarms/failures (1243) relates to erratic/no output of the Instrumentation and Control (I and C) apparatus, followed by faults of the Tetrode circuits (829), of the High Voltage Power Supply system (466), and of the Tuning elements (428).
- The
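
Rates and confidence intervals of the kind estimated in this study follow from simple Poisson counting statistics. A minimal sketch, with the record's approximate totals used purely as illustrative inputs:

```python
import math

def failure_rate_ci(failures, exposure, z=1.96):
    """Point estimate and ~95% normal-approximation confidence interval for
    a Poisson failure rate (failures per unit of exposure, e.g. per pulse)."""
    rate = failures / exposure
    se = math.sqrt(failures) / exposure  # standard error of a Poisson count
    return rate, max(0.0, rate - z * se), rate + z * se

# E.g. ~3400 alarm/failure events over ~12000 ICRH plasma pulses:
rate, lo, hi = failure_rate_ci(3400, 12000)
```

The same helper applies per component class by substituting its event count, or per operating day by changing the exposure denominator.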

  4. URBAN DATA COLLECTION USING A BIKE MOBILE SYSTEM WITH A FOSS ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    A. Abdul Jabbar

    2017-07-01

    Full Text Available The European community is working to improve the quality of life in each European country, in particular to improve air quality and safety in each city. Air quality is monitored daily using several ground stations, which do not capture the variation of air quality during the day, evaluating only the average level. In this context, it would be interesting to have a “smart” system to acquire distributed data continuously, even involving the citizens. On the other hand, to improve the safety level in urban areas along cycle lanes, roads and pedestrian paths, many algorithms exist for visibility and safety analysis; the crucial aspect is the 3D model considered as “input” to these algorithms, which always needs to be updated. A bike was instrumented with two digital cameras (Raspberry PI-cam). Image acquisition was realized with a dedicated Python tool, which was implemented in the Raspberry PI system. Images were georeferenced using a u-blox 8T receiver connected to the Raspberry system. GNSS data were acquired using a specific tool developed in Python, based on the RTKLIB library. Time synchronization was obtained from the GNSS receiver. Additionally, a portable laser scanner, an air-quality system and a small inertial platform were installed and connected to the Raspberry system. The system was implemented and tested to acquire data (images and air-quality parameters) in a district of Turin, and a 3D model of the investigated site was produced. This contribution describes the assembly of the system, the dataset acquired and the results obtained using different low-cost sensors, in particular digital cameras and a laser scanner, to easily collect geospatial data in urban areas.
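
Georeferencing images from a u-blox receiver amounts to parsing NMEA sentences and pairing fixes with frame timestamps. The parsing step can be sketched with a minimal parser for the standard GGA sentence; this is a generic illustration (checksum not validated), not the actual tool described in the record:

```python
def parse_gga(sentence):
    """Extract (utc, lat, lon) in decimal degrees from a NMEA GGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_deg(value, hemi, deg_digits):
        # NMEA packs degrees and decimal minutes together: (d)ddmm.mmmm.
        d = float(value[:deg_digits]) + float(value[deg_digits:]) / 60.0
        return -d if hemi in ("S", "W") else d

    utc = fields[1]
    lat = to_deg(fields[2], fields[3], 2)   # ddmm.mmmm
    lon = to_deg(fields[4], fields[5], 3)   # dddmm.mmmm
    return utc, lat, lon

# Illustrative sentence with coordinates near Turin (checksum is fictitious).
utc, lat, lon = parse_gga(
    "$GNGGA,123519.00,4504.1234,N,00739.5678,E,1,08,0.9,240.0,M,47.0,M,,*5C")
```

Each image frame can then be tagged with the fix whose UTC time is nearest to the frame's capture time, using the GNSS receiver as the common clock.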

  5. VetCompass Australia: A National Big Data Collection System for Veterinary Science

    Science.gov (United States)

    McGreevy, Paul; Thomson, Peter; Dhand, Navneet K.; Raubenheimer, David; Masters, Sophie; Mansfield, Caroline S.; Baldwin, Timothy; Soares Magalhaes, Ricardo J.; Rand, Jacquie; Hill, Peter; Gilkerson, James; Combs, Martin; Raidal, Shane; Irwin, Peter; Irons, Peter; Squires, Richard; Brodbelt, David; Hammond, Jeremy

    2017-01-01

    Simple Summary: The VetCompass Australia program collects real-time clinical records from veterinary practices and aggregates them for researchers to interrogate. It delivers Australian researchers sustainable and cost-effective access to authoritative data from hundreds of veterinary practitioners across Australia and opens up major international collaborative opportunities with related projects in the United Kingdom and elsewhere. Abstract: VetCompass Australia is veterinary medical records-based research coordinated with the global VetCompass endeavor to maximize its quality and effectiveness for Australian companion animals (cats, dogs, and horses). Bringing together all seven Australian veterinary schools, it is the first nationwide surveillance system collating clinical records on companion-animal diseases and treatments. VetCompass data service collects and aggregates real-time, clinical records for researchers to interrogate, delivering sustainable and cost-effective access to data from hundreds of veterinary practitioners nationwide. Analysis of these clinical records will reveal geographical and temporal trends in the prevalence of inherited and acquired diseases, identify frequently prescribed treatments, revolutionize clinical auditing, help the veterinary profession to rank research priorities, and assure evidence-based companion-animal curricula in veterinary schools. VetCompass Australia will progress in three phases: (1) roll-out of the VetCompass platform to harvest Australian veterinary clinical record data; (2) development and enrichment of the coding (data-presentation) platform; and (3) creation of a world-first, real-time surveillance interface with natural language processing (NLP) technology. The first of these three phases is described in the current article. Advances in the collection and sharing of records from numerous practices will enable veterinary professionals to deliver a vastly improved level of care for companion animals that will

  6. VetCompass Australia: A National Big Data Collection System for Veterinary Science.

    Science.gov (United States)

    McGreevy, Paul; Thomson, Peter; Dhand, Navneet K; Raubenheimer, David; Masters, Sophie; Mansfield, Caroline S; Baldwin, Timothy; Soares Magalhaes, Ricardo J; Rand, Jacquie; Hill, Peter; Peaston, Anne; Gilkerson, James; Combs, Martin; Raidal, Shane; Irwin, Peter; Irons, Peter; Squires, Richard; Brodbelt, David; Hammond, Jeremy

    2017-09-26

    VetCompass Australia is veterinary medical records-based research coordinated with the global VetCompass endeavor to maximize its quality and effectiveness for Australian companion animals (cats, dogs, and horses). Bringing together all seven Australian veterinary schools, it is the first nationwide surveillance system collating clinical records on companion-animal diseases and treatments. VetCompass data service collects and aggregates real-time, clinical records for researchers to interrogate, delivering sustainable and cost-effective access to data from hundreds of veterinary practitioners nationwide. Analysis of these clinical records will reveal geographical and temporal trends in the prevalence of inherited and acquired diseases, identify frequently prescribed treatments, revolutionize clinical auditing, help the veterinary profession to rank research priorities, and assure evidence-based companion-animal curricula in veterinary schools. VetCompass Australia will progress in three phases: (1) roll-out of the VetCompass platform to harvest Australian veterinary clinical record data; (2) development and enrichment of the coding (data-presentation) platform; and (3) creation of a world-first, real-time surveillance interface with natural language processing (NLP) technology. The first of these three phases is described in the current article. Advances in the collection and sharing of records from numerous practices will enable veterinary professionals to deliver a vastly improved level of care for companion animals that will improve their quality of life.

  7. VetCompass Australia: A National Big Data Collection System for Veterinary Science

    Directory of Open Access Journals (Sweden)

    Paul McGreevy

    2017-09-01

    Full Text Available VetCompass Australia is veterinary medical records-based research coordinated with the global VetCompass endeavor to maximize its quality and effectiveness for Australian companion animals (cats, dogs, and horses). Bringing together all seven Australian veterinary schools, it is the first nationwide surveillance system collating clinical records on companion-animal diseases and treatments. VetCompass data service collects and aggregates real-time, clinical records for researchers to interrogate, delivering sustainable and cost-effective access to data from hundreds of veterinary practitioners nationwide. Analysis of these clinical records will reveal geographical and temporal trends in the prevalence of inherited and acquired diseases, identify frequently prescribed treatments, revolutionize clinical auditing, help the veterinary profession to rank research priorities, and assure evidence-based companion-animal curricula in veterinary schools. VetCompass Australia will progress in three phases: (1) roll-out of the VetCompass platform to harvest Australian veterinary clinical record data; (2) development and enrichment of the coding (data-presentation) platform; and (3) creation of a world-first, real-time surveillance interface with natural language processing (NLP) technology. The first of these three phases is described in the current article. Advances in the collection and sharing of records from numerous practices will enable veterinary professionals to deliver a vastly improved level of care for companion animals that will improve their quality of life.

  8. A cost-effective traffic data collection system based on the iDEN mobile telecommunication network.

    Science.gov (United States)

    2008-10-01

    This report describes a cost-effective data collection system for the Caltrans 170 traffic signal controller. The data collection system is based on TCP/IP communication over existing low-cost mobile communication networks and the Motorola iDEN mobile...

  9. Creating an iPhone Application for Collecting Continuous ABC Data

    Science.gov (United States)

    Whiting, Seth W.; Dixon, Mark R.

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…

  10. D3.2 Initial Specification of Data Collection and Analysis System

    DEFF Research Database (Denmark)

    Siksnys, Laurynas; Kaulakiene, Dalia; Pedersen, Torben Bach

    2010-01-01

    ...panels) cannot be planned, but can only be predicted. Thus, electrical power produced by RES usually does not match the energy consumption, and it must be discarded or given away for free. The MIRACLE (Micro-Request-Based Aggregation, Forecasting and Scheduling of Energy Demand, Supply and Distribution) project aims to invent and prototype key elements of an energy system that is better able to accommodate large volumes of electricity from RES. The approach is based on flexible offers that allow an individual consumer/producer to specify when and what amount of energy he or she wants to consume... with such technology and will be sending tens of flexible offers per day. In order to appropriately manage very large volumes of flexible offers, a reliable, distributed, and highly scalable computer system infrastructure is needed. Work Package 3 concerns data collection, aggregation and storage solutions...
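
A flexible offer, as described above, can be represented as an amount of energy plus a time window within which the consumer is indifferent; an aggregator can then place each offer where predicted RES supply is highest. A toy greedy sketch of that idea (not the MIRACLE system's actual scheduler):

```python
def schedule_offers(offers, supply):
    """offers: list of (energy_kwh, earliest_slot, latest_slot);
    supply: predicted RES surplus (kWh) per time slot. Greedily place each
    offer in the slot of its window with the most remaining surplus."""
    remaining = list(supply)
    plan = []
    for energy, lo, hi in offers:
        slot = max(range(lo, hi + 1), key=lambda s: remaining[s])
        remaining[slot] -= energy
        plan.append(slot)
    return plan

# Three flexible offers against a 6-slot renewable-surplus forecast.
supply = [1.0, 5.0, 2.0, 8.0, 3.0, 0.5]
offers = [(2.0, 0, 2),   # may run in slots 0-2
          (4.0, 2, 5),   # slots 2-5
          (3.0, 3, 5)]   # slots 3-5
plan = schedule_offers(offers, supply)
```

With tens of offers per consumer per day, the real system must aggregate offers before scheduling, which is exactly the scalable-infrastructure problem the work package addresses.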

  11. Resources monitoring and automatic management system for multi-VO distributed computing system

    Science.gov (United States)

    Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.

    2017-10-01

    Multi-VO support based on DIRAC has been set up to provide workload and data management for several high energy physics experiments at IHEP. To monitor and manage the heterogeneous resources belonging to different Virtual Organizations in a uniform way, a resources monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. The information collection gathers status from different sources in both active and passive ways and stores it in databases. The status decision and automatic control part evaluates resource status and takes control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal, from which both real-time and historical information can be obtained. All the implementations are based on the DIRAC framework. The information and control, including sites, policies and the web portal for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
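
The policy-driven decision step can be pictured as a small rule engine mapping collected status metrics to control actions. Policy names and thresholds below are invented for illustration; the real policies live inside DIRAC's RSS:

```python
# Each policy inspects one resource's collected metrics and proposes an
# action; the most severe proposed action wins.
SEVERITY = {"keep": 0, "probe": 1, "ban": 2}

def job_failure_policy(metrics):
    ratio = metrics.get("job_failure_ratio", 0.0)
    if ratio > 0.5:
        return "ban"
    return "probe" if ratio > 0.2 else "keep"

def heartbeat_policy(metrics):
    return "ban" if metrics.get("minutes_since_heartbeat", 0) > 60 else "keep"

POLICIES = [job_failure_policy, heartbeat_policy]

def decide(metrics):
    """Evaluate all policies and return the most severe action."""
    return max((p(metrics) for p in POLICIES), key=SEVERITY.get)

action = decide({"job_failure_ratio": 0.3, "minutes_since_heartbeat": 5})
```

Because each VO can register its own policy list, the same engine yields per-VO decisions over shared heterogeneous resources.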

  12. Data collection system for a wide range of gas-discharge proportional neutron counters

    Science.gov (United States)

    Oskomov, V.; Sedov, A.; Saduyev, N.; Kalikulov, O.; Kenzhina, I.; Tautaev, E.; Mukhamejanov, Y.; Dyachkov, V.; Utey, Sh

    2017-12-01

    This article describes the development of a universal data collection system for measuring the intensity of pulsed signals. After careful analysis of the timing and operating conditions of the software and hardware complex, circuit solutions were selected that meet the required specifications: the frequency response was optimized to obtain the maximum signal-to-noise ratio; microcontroller methods and operating modes were worked out to continuously measure the signal amplitude at the amplifier output and send the data to a computer; and control of the high-voltage source was implemented. A preliminary microcontroller program implementing the algorithm in its simplest form has also been developed.
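
The continuous amplitude measurement described above amounts to scanning the sampled amplifier output for pulses above a noise threshold and recording each pulse's peak. This is an illustrative sketch, not the authors' firmware; Python stands in for microcontroller code.

```python
# Illustrative pulse-peak extraction: a pulse starts when samples rise
# above the threshold and ends on the falling edge; the peak amplitude
# of each pulse is kept for transmission to a computer.

def extract_pulse_peaks(samples, threshold):
    peaks, current_peak, in_pulse = [], 0, False
    for s in samples:
        if s > threshold:
            in_pulse = True
            current_peak = max(current_peak, s)
        elif in_pulse:  # falling edge: the pulse has ended
            peaks.append(current_peak)
            current_peak, in_pulse = 0, False
    return peaks

signal = [0, 1, 0, 12, 30, 18, 2, 0, 7, 25, 9, 0]
print(extract_pulse_peaks(signal, threshold=5))  # [30, 25]
```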

  13. submitter Optimizing the data-collection time of a large-scale data-acquisition system through a simulation framework

    CERN Document Server

    Colombo, Tommaso; Garcìa, Pedro Javier; Vandelli, Wainer

    2016-01-01

    The ATLAS detector at CERN records particle collision “events” delivered by the Large Hadron Collider. Its data-acquisition system identifies, selects, and stores interesting events in near real-time, with an aggregate throughput of several tens of GB/s. It is a distributed software system executed on a farm of roughly 2000 commodity worker nodes communicating via TCP/IP on an Ethernet network. Event data fragments are received from the many detector readout channels and are buffered, collected together, analyzed, and either stored permanently or discarded. This system, and data-acquisition systems in general, are sensitive to the latency of the data transfer from the readout buffers to the worker nodes. Challenges affecting this transfer include the many-to-one communication pattern and the inherently bursty nature of the traffic. This paper addresses the main performance issues brought about by this workload, focusing in particular on the so-called TCP incast pathology. Since performing systematic stud...

  14. WP 2: "Data collection and processing systems (DCPS) for the conventional markets" and WP 3: "Data collection and processing systems for organic markets"

    NARCIS (Netherlands)

    Wolfert, J.; Kramer, K.J.; Richter, T.; Hempfling, G.; Lux, S.; Recke, G.

    2004-01-01

    The aim of the EU concerted action EISfOM (QLK5-2002-02400) (European Information System for Organic Markets) is to build up a framework for reporting valid and reliable data for relevant production and market sectors of the European organic sector in order to meet the needs of policy-makers,

  15. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  16. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  17. Automatic control variac system for electronic accelerator

    International Nuclear Information System (INIS)

    Zhang Shuocheng; Wang Dan; Jing Lan; Qiao Weimin; Ma Yunhai

    2006-01-01

    An automatic control variac system was designed to satisfy the control requirements of the electronic accelerator developed by the Institute. The design and operating principles, the structure of the system, and the software for the industrial PC and the microcontroller unit are described. The interfaces of the control module are RS232 and RS485. A fiber-optic interface (FOC) can be set up if an industrial FOC network is necessary, which extends the field of application and improves the system's communication. Practice has shown that the system adjusts the variac output voltage automatically and assures accurate, automatic control of the electronic accelerator. The system was designed in accordance with general design principles and offers merits such as easy operation and maintenance, good expansibility, and low cost, so it could also be used in other industrial fields. (authors)

  18. Creating an iPhone application for collecting continuous ABC data.

    Science.gov (United States)

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data-collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.
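
The continuous ABC (Antecedent-Behavior-Consequence) recording scheme described above has a simple data model: each observed event is a timestamped row of client, antecedent, behavior, and consequence, exported as text (e.g. an e-mail body) when the session ends. The paper builds this in Xcode for iPhone; the sketch below uses Python purely to illustrate the data model, and all field names are illustrative.

```python
# Minimal, platform-neutral sketch of continuous ABC event recording
# and CSV export; field names are assumptions for illustration.

import csv, io
from datetime import datetime

def record_event(log, client, antecedent, behavior, consequence, when=None):
    log.append({"time": (when or datetime.now()).isoformat(timespec="seconds"),
                "client": client, "antecedent": antecedent,
                "behavior": behavior, "consequence": consequence})

def export_csv(log):
    """Render the session log as CSV, e.g. for an e-mail body."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=log[0].keys())
    writer.writeheader()
    writer.writerows(log)
    return buf.getvalue()

session = []
record_event(session, "Client A", "demand placed", "aggression", "escape",
             when=datetime(2012, 1, 1, 9, 30, 0))
print(export_csv(session).splitlines()[0])  # header row
```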

  19. Research and Application of an Automatic Clam Collecting Device

    Directory of Open Access Journals (Sweden)

    Wang Bin

    2017-01-01

    To collect clams automatically and effectively on coastal beaches, an automatic clam collecting device was designed. The device consists of a connecting device, a shovelling device, a conveying device, and a filtering device. The mechanical design builds on existing devices, such as a sloped blade, a pipelined conveyor belt, planar linkage mechanisms, and a ski mechanism. The connecting device is attached to the device body by bolts. The shovelling device adopts a sloped blade, which reduces the resistance between the sandy soil and the device. The transmission device adopts a conveyor belt with a two-stage reducer, which effectively controls the transmission speed and avoids splashing mud. A mesh structure set at a slope is used for soil filtering, so that sandy soil and other impurities fall through the mesh under their own weight. The designed clam collecting device will improve efficiency and decrease cost effectively.

  20. Development of fully automatic pipe welding system

    International Nuclear Information System (INIS)

    Tanioka, Shin-ichi; Nakano, Mitsuhiro; Tejima, Akio; Yamada, Minoru; Saito, Tatsuo; Saito, Yoshiyuki; Abe, Rikio

    1985-01-01

    We have succeeded in developing a fully automatic TIG welding system, named CAPTIG, that enables unmanned welding operations from the initial layer to the final finishing layer continuously. This welding system is designed for continuous, multilayered welding of thick, large-diameter fixed pipes of nuclear power plants and large-size boiler plants, where high-quality welding is demanded. In tests conducted with this welding system, several hours of continuous unmanned welding corroborated that excellent beads are formed, good results are obtained in radiographic inspection, and quality welding is possible most reliably. The system incorporates a microcomputer for fully automatic control, featuring seam tracking, automatic control of the wire feed position, and self-checking of the inter-pass temperature, cooling-water temperature, and wire reserve. (author)

  1. Measuring Container Port Complementarity and Substitutability with Automatic Identification System (AIS) Data – Studying the Inter-port Relationships in the Oslo Fjord Multi-port Gateway Region

    Directory of Open Access Journals (Sweden)

    Halvor Schøyen

    2017-06-01

    This paper considers the degree of competition among small and medium-sized container ports located in a multi-port gateway region. The level of port competition is evaluated by means of an analysis of the revealed preferences in the port-calling pattern of container feeder vessels deployed on their various links and routes. The unit of analysis is feeder-vessel sailing legs and port stays at and between adjacent container ports. At these ports' terminals, ships are moored and containers are loaded and unloaded. The vessel movement data are provided by the Automatic Identification System (AIS). A study of the principal container ports in the Oslo Fjord area is performed, measuring the actual container feeder traffic during the year 2015. It is demonstrated to what extent ports in the Oslo Fjord region act as substitutes, and to what extent they function more as complements to each other.

  2. Steam System Balancing and Tuning for Multifamily Residential Buildings in Chicagoland - Second Year of Data Collection

    Energy Technology Data Exchange (ETDEWEB)

    Choi, J.; Ludwig, P.; Brand, L.

    2013-08-01

    Steam heated buildings often suffer from uneven heating as a result of poor control of the amount of steam entering each radiator. In order to satisfy the heating load to the coldest units, other units are overheated. As a result, some tenants complain of being too hot and open their windows in the middle of winter, while others complain of being too cold and are compelled to use supplemental heat sources. Building on previous research, CNT Energy identified 10 test buildings in Chicago and conducted a study to identify best practices for the methodology, typical costs, and energy savings associated with steam system balancing. A package of common steam balancing measures was assembled and data were collected on the buildings before and after these retrofits were installed to investigate the process, challenges, and the cost effectiveness of improving steam systems through improved venting and control systems. The test buildings that received venting upgrades and new control systems showed 10.2% savings on their natural gas heating load, with a simple payback of 5.1 years. The methodologies for and findings from this study are presented in detail in this report. This report has been updated from a version published in August 2012 to include natural gas usage information from the 2012 heating season and updated natural gas savings calculations.

  3. Get SMARTS (Sports Medicine Research Team System): A Computerized Outpatient Data Collection System for Epidemiologic Research

    National Research Council Canada - National Science Library

    Brodine, S

    1997-01-01

    .... This report describes features of the Sports Medicine Research Team System (SMARTS) and reviews results of a SMARTS supported prospective study of male Marine Corps recruits undergoing basic training...

  4. [Input data collection].

    Science.gov (United States)

    Duval, Julie

    2017-05-01

    The quality and safety of nursing care depends notably on the collection of data. In Quebec, there is a tool which aims to improve the organisation of care in accident and emergency departments. Nurses are on the frontline in the initial assessment of the patient. Unfortunately, their recognition is not on a par with their responsibility. Crown Copyright © 2017. Publié par Elsevier Masson SAS. All rights reserved.

  5. Automatic seismic support design of piping system by an object oriented expert system

    International Nuclear Information System (INIS)

    Nakatogawa, T.; Takayama, Y.; Hayashi, Y.; Fukuda, T.; Yamamoto, Y.; Haruna, T.

    1990-01-01

    The seismic support design of the piping systems of nuclear power plants requires many experienced engineers and many man-hours, because the seismic design conditions are very severe, the bulk volume of the piping systems is huge, and the design procedures are very complicated. We have therefore developed a piping seismic design expert system, which utilizes the piping design database of a 3-dimensional CAD system and automatically determines the piping support locations and support styles. The database of this system contains the maximum allowable seismic support span lengths for straight piping and the span-length reduction factors for bends, branches, concentrated masses in the piping, and so forth. The system automatically produces the support design according to design knowledge extracted and collected from expert design engineers, using design information such as piping specifications, which give diameters and thicknesses, and piping geometric configurations. The automatic seismic support design provided by this expert system achieves a reduction in design man-hours, improvement of design quality, verification of design results, optimization of support locations, and prevention of input duplication. In developing this system, we had to derive the design logic from expert design engineers, and it could not be expressed simply and descriptively; we also had to write programs for different kinds of design knowledge. For these reasons we adopted the object-oriented programming paradigm (Smalltalk-80), which is suitable for combining programs and carrying out the design work.

  6. Implementation and flight tests for the Digital Integrated Automatic Landing System (DIALS). Part 1: Flight software equations, flight test description and selected flight test data

    Science.gov (United States)

    Hueschen, R. M.

    1986-01-01

    Five flight tests of the Digital Integrated Automatic Landing System (DIALS) were conducted on the Advanced Transport Operating Systems (ATOPS) Transportation Research Vehicle (TSRV), a Boeing 737 aircraft modified for advanced controls and displays research. These flight tests were conducted at NASA's Wallops Flight Center using the microwave landing system (MLS) installation on runway 22. This report describes the flight software equations of the DIALS, which was designed using modern control theory direct-digital design methods and employed a constant-gain Kalman filter. Selected flight test performance data are presented for localizer (runway centerline) capture and track at various intercept angles, for glideslope capture and track of 3, 4.5, and 5 degree glideslopes, for the decrab maneuver, and for the flare maneuver. Data are also presented to illustrate the system performance in the presence of cross, gust, and shear winds. The mean and standard deviation of the peak position errors for localizer capture were, respectively, 24 feet and 26 feet. For mild wind conditions, glideslope and localizer tracking position errors did not exceed, respectively, 5 and 20 feet. For gusty wind conditions (8 to 10 knots), these errors were, respectively, 10 and 30 feet. Ten hands-off automatic landings were performed. The standard deviations of the touchdown position and velocity errors from the mean values were, respectively, 244 feet and 0.7 feet/sec.

  7. Using global positioning systems in health research: a practical approach to data collection and processing.

    Science.gov (United States)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-11-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. Standardizing data collection and processing procedures will improve data quality, allow more meaningful comparisons across studies and populations, and advance this field more rapidly. This paper aims to take researchers who are considering using GPS devices in their research through device-selection criteria, device settings, participant data collection, data cleaning, data processing, and integration of data into GIS. Recommendations are outlined for each stage of data collection and analysis, and challenges that should be considered are indicated. This paper highlights the benefits of collecting GPS data over traditional self-report or estimated exposure measures. The information presented here will allow researchers to make an informed decision about incorporating this readily available technology into their studies. This work reflects the state of the art in 2011. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
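
One common GPS-cleaning step consistent with the guidance above is dropping fixes that imply an implausible travel speed between consecutive points. The sketch below is illustrative only; the 55 m/s (~200 km/h) cut-off is an assumption, not a recommendation from the paper.

```python
# Illustrative GPS cleaning: remove points that would require an
# implausible speed to reach from the previously kept point.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def drop_implausible(points, max_speed_ms=55.0):
    """points: list of (epoch_seconds, lat, lon); keep plausible fixes."""
    kept = [points[0]]
    for t, lat, lon in points[1:]:
        t0, lat0, lon0 = kept[-1]
        dt = max(t - t0, 1e-9)  # guard against duplicate timestamps
        if haversine_m(lat0, lon0, lat, lon) / dt <= max_speed_ms:
            kept.append((t, lat, lon))
    return kept
```

In a real study this filter would be one stage in a longer pipeline (signal-loss handling, trip segmentation, GIS overlay), as the paper's recommendations lay out.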

  8. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.

    2004-01-01

    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  9. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  10. Automatic Water Sensor Window Opening System

    KAUST Repository

    Percher, Michael

    2013-12-05

    A system can automatically open at least one window of a vehicle when the vehicle is being submerged in water. The system can include a water collector and a water sensor, and when the water sensor detects water in the water collector, at least one window of the vehicle opens.

  11. An observing system for the collection of fishery and oceanographic data

    Directory of Open Access Journals (Sweden)

    P. Falco

    2007-05-01

    The Fishery Observing System (FOS) was developed as a first, basic step towards fish stock abundance nowcasting/forecasting within the framework of the EU research program Mediterranean Forecasting System: Toward an Environmental Prediction (MFSTEP). The study of the relationship between abundance and environmental parameters also represents a crucial step towards forecasting. Eight fishing vessels, belonging to different harbours of the Central and Northern Adriatic Sea, were progressively equipped with FOS instrumentation to collect fishery and oceanographic data. For this pilot application, anchovy (Engraulis encrasicolus, L.) was chosen as the target species. Geo-referenced catch data, associated with in-situ temperature and depth, were the FOS products, but other parameters were associated with the catch data as well. MFSTEP numerical circulation models provide many of these data; in particular, salinity was extracted from the re-analysis data of numerical circulation models. Satellite-derived sea surface temperature (SST) and chlorophyll were also used as independent variables. Catch and effort data were used to estimate an abundance index (CPUE – Catch per Unit of Effort). Considering that catch records were gathered by different fishing vessels with different technical characteristics operating on different fish densities, a standardized value of CPUE was calculated. A spatial and temporal average CPUE map was obtained, together with a monthly mean time series, in order to characterise the variability of anchovy abundance during the period of observation (October 2003–August 2005). To study the relationship between abundance and oceanographic parameters, Generalized Additive Models (GAM) were used. Preliminary results revealed a complex scenario: the southern sector of the domain is characterised by a stronger relationship than the central and northern sector where the interactions between the environment and the anchovy
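
The abundance index used above, CPUE (Catch per Unit of Effort), is in its simplest nominal form just total catch divided by total effort. The paper additionally standardizes CPUE across vessels and fits GAMs, which this sketch does not attempt; units and numbers below are invented for illustration.

```python
# Nominal CPUE from per-operation catch/effort records.

def nominal_cpue(records):
    """records: list of (catch_kg, effort_hours) per fishing operation."""
    total_catch = sum(c for c, _ in records)
    total_effort = sum(e for _, e in records)
    return total_catch / total_effort

hauls = [(120.0, 4.0), (80.0, 2.0), (40.0, 2.0)]
print(nominal_cpue(hauls))  # 30.0 kg per hour
```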

  12. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. Owing to the development of speech technology, there is now increased interest in expert speaker verification systems with high reliability and low labour intensiveness, achieved by automating the data processing for the expert analysis. System Description. We present a description of a novel system that analyzes the similarity or distinction of speaker voices based on comparing statistics of phone lengths, formant features, and melodic characteristics. The characteristic feature of the proposed fusion-based system is a weak correlation between the analyzed features, which leads to a decrease in the speaker recognition error rate. An advantage of the system is the possibility of rapid analysis of recordings, since data preprocessing and decision making are automated. We describe the individual methods as well as their fusion to combine their decisions. Main Results. We have tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech of male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also experimentally established that the formant method is the most reliable of the methods used. Practical Significance. Experimental results have shown that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.
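
In the spirit of the fusion described above, weakly correlated methods (formant, phone-length, melodic) can each produce a similarity score that a weighted sum then combines into one decision. The weights and threshold below are invented for the example; the paper's actual fusion rule is not specified here.

```python
# Illustrative score-level fusion of per-method similarity scores.

def fuse(scores, weights):
    """Weighted mean of per-method similarity scores in [0, 1]."""
    return sum(w * s for s, w in zip(scores, weights)) / sum(weights)

def same_speaker(scores, weights, threshold=0.5):
    return fuse(scores, weights) >= threshold

# formant method weighted highest, since the paper finds it most reliable
weights = (0.5, 0.25, 0.25)
print(same_speaker((0.9, 0.4, 0.6), weights))  # True
```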

  13. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    Science.gov (United States)

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time-stamp accuracy within ±10 milliseconds of the top-of-the-second Coordinated Universal Time. The U.S. Geological Survey Geomagnetism Program has therefore designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. The tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate such measurements further. Current measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
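
One common way to quantify such a channel lag (an assumption for illustration; the USGS test procedure may differ) is to inject a known reference signal, record the system's output, and take the argmax of the cross-correlation as the delay in samples.

```python
# Illustrative lag estimation by brute-force cross-correlation:
# the lag that maximizes the overlap score is the estimated delay.

def lag_samples(reference, recorded):
    n = len(reference)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n):
        score = sum(reference[i] * recorded[i + lag]
                    for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

ref = [0, 0, 1, 0, 0, 0, 0, 0]
rec = [0, 0, 0, 0, 1, 0, 0, 0]   # same pulse, delayed by 2 samples
print(lag_samples(ref, rec))     # 2
```

At a 1-second sample rate the lag would then be converted to milliseconds against a trusted UTC reference to check the ±10 ms requirement.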

  14. DataCollection Prototyping

    CERN Multimedia

    Beck, H.P.

    DataCollection is a subsystem of the Trigger, DAQ & DCS project responsible for the movement of event data from the ROS to the High Level Triggers. This includes data from Regions of Interest (RoIs) for Level 2, building complete events for the Event Filter and finally transferring accepted events to Mass Storage. It also handles passing the LVL1 RoI pointers and the allocation of Level 2 processors and load balancing of Event Building. During the last 18 months DataCollection has developed a common architecture for the hardware and software required. This involved a radical redesign integrating ideas from separate parts of earlier TDAQ work. An important milestone for this work, now achieved, has been to demonstrate this subsystem in the so-called Phase 2A Integrated Prototype. This prototype comprises the various TDAQ hardware and software components (ROSs, LVL2, etc.) under the control of the TDAQ Online software. The basic functionality has been demonstrated on small testbeds (~8-10 processing nodes)...

  15. Research on automatic control system of greenhouse

    Science.gov (United States)

    Liu, Yi; Qi, Guoyang; Li, Zeyu; Wu, Qiannan; Meng, Yupeng

    2017-03-01

    This paper introduces an automatic greenhouse control system based on a single-chip microcomputer and a temperature and humidity sensor, and describes the system's hardware structure, working principle, and process. A large number of experiments on the control system show that it controls temperature and humidity well; it can be used in indoor breeding and planting, and offers versatility and portability.

  16. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems for agricultural robots used in field work. Existing solutions in this area are analyzed. The article proposes an original solution which is easy to implement and is characterized by high-accuracy positioning.

  17. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850...

  18. Temperature Profile Data Collected by Participating Ships in NOAA's Shipboard Environmental Data Acquisition System Program from 17 June 2000 to 23 February 2001 (NODC Accession 0000417)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — XBT and other data were collected from the COLUMBUS COROMANDEL and other platforms participating in NOAA's Shipboard Environmental Data Acquisition System (SEAS)...

  19. Automatic Robot Safety Shutdown System

    Science.gov (United States)

    Lirette, M.

    1985-01-01

    Robot turned off if acceleration exceeds preset value. Signals from accelerometer on robot arm pass through filter and amplifier, eliminating high-frequency noise and hydraulic-pump pulsations. Data digitized and processed in computer. Unit controls other machines that perform repetitive movements, including rotary tables, tracked vehicles, conveyor lines, and elevators.

  20. USE OF MOBILE PHONES AS RESEARCH INSTRUMENT FOR DATA COLLECTION

    Directory of Open Access Journals (Sweden)

    AP Pakhare

    2013-08-01

    Data collection is a crucial step in any research design or program. In order to be analysed, the collected data need to be entered into a spreadsheet or statistical software. Transcribing paper-based data is time consuming and often associated with errors. Such errors may be due to an inability to read the data collector's handwriting, human mistakes during data entry, etc. A system wherein data are automatically transcribed and uploaded to a database during collection would be of immense use in this situation. A possible solution is mobile-phone-based data collection, a type of electronic data capture in which the processes of data collection and data entry are merged [1]. Initially, electronic data collection was done with hand-held devices such as Personal Digital Assistants (PDAs). However, with the entry of newer and more sophisticated smartphones into the market, there is a growing possibility of extending the success achieved on PDAs to a phone-based platform [2]. With newer software solutions, this process can even be done on a standard entry-level mobile phone. This paper discusses the use and advantages of mobile phones for data collection and also provides information about resources for mobile-based data collection.

  1. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    Science.gov (United States)

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable to a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow-efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
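
As an illustration of the kind of automatic clinical-score calculation the database performs, here is a minimal Child-Pugh calculator. The cutoffs follow the standard published scale; this is a sketch, not the authors' implementation.

```python
# Minimal Child-Pugh score calculator, illustrating automatic clinical-score
# computation. Cutoffs follow the standard published scale (not the paper's
# own code); ascites and encephalopathy are graded 1 (none) to 3 (severe).

def _points(value, low, high):
    """1 point below `low`, 2 points between, 3 points above `high`."""
    if value < low:
        return 1
    if value <= high:
        return 2
    return 3

def child_pugh(bilirubin_mg_dl, albumin_g_dl, inr, ascites, encephalopathy):
    """Return (score, grade); grade A = 5-6, B = 7-9, C = 10-15."""
    score = (
        _points(bilirubin_mg_dl, 2.0, 3.0)
        + (4 - _points(albumin_g_dl, 2.8, 3.5))  # albumin: higher is better
        + _points(inr, 1.7, 2.3)
        + ascites
        + encephalopathy
    )
    grade = "A" if score <= 6 else ("B" if score <= 9 else "C")
    return score, grade
```

Automating such scores removes the transcription and arithmetic errors that a spreadsheet workflow invites.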

  2. 49 CFR Appendix H to Part 40 - DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form

    Science.gov (United States)

    2010-10-01

    ..., App. H Appendix H to Part 40—DOT Drug and Alcohol Testing Management Information System (MIS) Data... 49 Transportation 1 2010-10-01 2010-10-01 false DOT Drug and Alcohol Testing Management Information System (MIS) Data Collection Form H Appendix H to Part 40 Transportation Office of the Secretary...

  3. Study on intermediate frequency power supply automatic monitor system

    International Nuclear Information System (INIS)

    Wang Yuntong; Xu Bin

    2007-06-01

    A new design for an automatic monitoring system for the intermediate frequency power supply, based on a communication server, is put forward, and its operating principle and key techniques are described in detail. The system uses the protocol-conversion function of the serial communication server and realizes data collection with dual-machine backup and redundancy. The new network adopts opto-isolated communication interfaces and diagnostic techniques, increasing its immunity to interference; alarm messages are transmitted immediately and then repeated cyclically, overcoming the slow response of the original monitoring network and strengthening both the speed of the monitoring system and the reliability of alarm reporting. Operation of the new monitoring system shows that its functions are more complete than those of the original system, that it is more convenient to use, has higher stability and reliability, reports alarms more quickly, and facilitates post-fault analysis; at the same time, the system retains a strong capacity for expansion. (authors)

  4. Work Zone Data Collection Trailer

    Data.gov (United States)

    Federal Laboratory Consortium — The Work Zone Data Collection Trailer was designed and constructed to enhance data collection and analysis capabilities for the "Evaluating Roadway Construction Work...

  5. Automatic Road Sign Inventory Using Mobile Mapping Systems

    Science.gov (United States)

    Soilán, M.; Riveiro, B.; Martínez-Sánchez, J.; Arias, P.

    2016-06-01

    The periodic inspection of certain infrastructure features plays a key role for road network safety and preservation, and for developing optimal maintenance planning that minimizes the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS) use laser scanner technology in order to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As the point cloud resolution is insufficient for this purpose, RGB imagery is used: the 3D points are projected onto the corresponding images, and the RGB data within the bounding box defined by the projected points are analysed. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95%, and F-scores greater than 90%. In this way, inventory data are obtained in a fast, reliable manner and can be applied to improve the maintenance planning of the road network, or to feed a Spatial Information System (SIS), so that road sign information is available for use in a Smart City context.
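
The projection step described above can be sketched with a standard pinhole camera model. This is an illustrative sketch under assumed intrinsics, not the LYNX system's calibration: 3D sign points already expressed in the camera frame are projected to pixels, and the bounding box of the projected pixels delimits the image region to analyse.

```python
import numpy as np

# Sketch of projecting point-cloud points into a synchronized image with a
# pinhole model u ~ K·[x y z]^T. The intrinsic matrix values (focal lengths,
# principal point) are illustrative, not from the LYNX Mobile Mapper.

K = np.array([[1000.0,    0.0, 640.0],   # fx,  0, cx
              [   0.0, 1000.0, 480.0],   #  0, fy, cy
              [   0.0,    0.0,   1.0]])

def project(points_cam):
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    pts = np.asarray(points_cam, dtype=float)
    uvw = (K @ pts.T).T                 # homogeneous pixel coordinates
    return uvw[:, :2] / uvw[:, 2:3]     # divide by depth

def bounding_box(pixels):
    """Axis-aligned (umin, vmin, umax, vmax) of the projected points."""
    p = np.asarray(pixels)
    return (p[:, 0].min(), p[:, 1].min(), p[:, 0].max(), p[:, 1].max())
```

The RGB pixels inside that bounding box are then classified to recover the sign's semantic content.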

  6. Two Systems for Automatic Music Genre Recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2012-01-01

    We re-implement and test two state-of-the-art systems for automatic music genre classification; but unlike past works in this area, we look closer than ever before at their behavior. First, we look at specific instances where each system consistently applies the same wrong label across multiple trials of cross-validation. Second, we test the robustness of each system to spectral equalization. Finally, we test how well human subjects recognize the genres of music excerpts composed by each system to be highly genre representative. Our results suggest that neither high-performing system has a capacity to recognize music genre.

  7. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulatory items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and a reduction of radiation exposure. The final objective of this project is to develop an automatic pipeline inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which guides the robot to navigate exactly along the inspection path. We expect our system can contribute to a reduction of inspection time, performance enhancement, and effective management of inspection results. The system developed by this project can be used practically for inspection work after field tests. (author)

  8. Multimedia data mining for automatic diabetic retinopathy screening.

    Science.gov (United States)

    Quellec, Gwénolé; Lamard, Mathieu; Cochener, Béatrice; Decencière, Etienne; Lay, Bruno; Chabouis, Agnès; Roux, Christian; Cazuguel, Guy

    2013-01-01

    This paper presents TeleOphta, an automatic system for screening diabetic retinopathy in teleophthalmology networks. Its goal is to reduce the burden on ophthalmologists by automatically detecting non-referable examination records, i.e. examination records presenting no image quality problems and no pathological signs related to diabetic retinopathy or any other retinal pathology. TeleOphta is an attempt to put into practice years of algorithmic developments from our groups. It combines image quality metrics, specific lesion detectors and a generic pathological pattern miner to process the visual content of eye fundus photographs. This visual information is further combined with contextual data in order to compute an abnormality risk for each examination record. The TeleOphta system was trained and tested on a large dataset of 25,702 examination records from the OPHDIAT screening network in Paris. It was able to automatically detect 68% of the non-referable examination records while achieving the same sensitivity as a second ophthalmologist. This suggests that it could safely reduce the burden on ophthalmologists by 56%.

  9. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.
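
The out-of-range flagging described above can be sketched simply. This is an illustrative reconstruction, not the original MS-DOS software: each analysis carries a user-defined reference interval, and results outside it are flagged for the operator to verify. Analysis names and intervals are invented.

```python
# Sketch of flagging results outside a predetermined range, as the Electra
# 800 management software does. The analyses and reference intervals below
# are illustrative placeholders, entirely 'user defined' in the real system.

REFERENCE_RANGES = {
    "PT_sec":   (11.0, 13.5),   # prothrombin time, seconds
    "APTT_sec": (25.0, 35.0),   # activated partial thromboplastin time
}

def flag_results(results):
    """Map {analysis: value} to {analysis: (value, flag)}; flag is 'L'/'H'/''."""
    flagged = {}
    for name, value in results.items():
        lo, hi = REFERENCE_RANGES[name]
        flag = "L" if value < lo else ("H" if value > hi else "")
        flagged[name] = (value, flag)
    return flagged
```

Flagged entries are what prompt the operator to delete, modify, or repeat a test on the electronic worksheet.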

  10. Automatic collection of the rare-earths with post chromatography column detection

    International Nuclear Information System (INIS)

    David, P.; Metzger, G.; Repellin, M.

    1987-01-01

    The complete separation of rare-earths (with the aim of radioisotope measurement) requires high-performance liquid chromatography with a ternary elution gradient. To automate their collection under satisfactory conditions, we have developed a non-polluting, reliable and easy-to-operate detection method. It is based on a post-column derivatization colorimetric system with arsenazo I (3-(2-arsonophenylazo)-4,5-dihydroxy-2,7-naphthalenedisulfonic acid).

  11. A Risk Assessment System with Automatic Extraction of Event Types

    Science.gov (United States)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general-purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
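
A template graph with event-type leaves and fuzzy rules can be sketched as a small recursive evaluator. This is only a sketch in the spirit of the description above (min for AND nodes, max for OR nodes, a common fuzzy-logic choice); the graph structure, event names and scores are invented, not ADAC's actual rules.

```python
# Illustrative fuzzy aggregation over a template graph whose leaves are
# event types. AND nodes take the minimum of their children, OR nodes the
# maximum; leaf scores are membership degrees in [0, 1] fed by extraction.
# Everything here (names, structure, scores) is a hypothetical example.

def risk(node, leaf_scores):
    """Evaluate a tree of ('and'|'or', children) tuples; leaves are names."""
    if isinstance(node, str):
        return leaf_scores.get(node, 0.0)
    op, children = node
    values = [risk(child, leaf_scores) for child in children]
    return min(values) if op == "and" else max(values)

# Example template: risk requires troop movement AND at least one of a
# border incident OR hostile rhetoric.
template = ("and", ["troop_movement",
                    ("or", ["border_incident", "rhetoric"])])
```

An event extractor like EventSpotter would populate `leaf_scores` from text, and the root value gives the synthetic risk level.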

  12. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  13. AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA

    Science.gov (United States)

    Cheeseman, P. C.

    1994-01-01

    The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5
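
The core idea above, computing each object's membership probability in each class rather than a hard partition, is just Bayes' rule over the class models. The following is a minimal one-dimensional two-class sketch with Gaussian class models; the parameters are invented for illustration and AUTOCLASS itself additionally searches over class numbers and parameters.

```python
import math

# Minimal illustration of soft class membership: given class priors and
# (here) 1-D Gaussian class models, Bayes' rule gives each object's
# membership probability in each class. Parameters are illustrative only;
# AUTOCLASS also searches for the maximally probable parameters itself.

def gaussian_pdf(x, mean, std):
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def memberships(x, classes):
    """classes: list of (prior, mean, std). Returns normalized posteriors."""
    joint = [prior * gaussian_pdf(x, mean, std) for prior, mean, std in classes]
    total = sum(joint)
    return [j / total for j in joint]
```

A point near one class mean gets a membership probability near 1 for that class, but ambiguous points are split between classes instead of being forced into a single partition.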

  14. A semi-automatic system for labelling seafood products and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-05-10

    May 10, 2010 ... This study is on the implementation of a semi-automatic labelling system (LS) of the Mediterranean Sea seafood harvest to address the increased need for seafood authentication and inherent difficulties of commonly used indirect techniques for estimating fisheries yield and fishing effort. Sensitive data.

  15. A semi-automatic system for labelling seafood products and ...

    African Journals Online (AJOL)

    A semi-automatic system for labelling seafood products and obtaining fishery management data: A case study of the bottom trawl fishery in the central ... policies, such as date and catch area, can be acquired and recorded on the label by user-friendly automated software that excludes any possible manipulation by the crew.

  16. Developing an intelligent control system of automatic window motor ...

    Indian Academy of Sciences (India)

    system software can manage diverse sensor data and provide the interface for remote monitoring. Keywords. Intelligent control; automatic window; wireless sensor network (WSN); micro control unit (MCU). 1. Introduction. Incorporating various advanced energy-related technology in the design and construction of new.

  17. 75 FR 27001 - Comment Request for Information Collection for the SCSEP Data Collection System, OMB Control No...

    Science.gov (United States)

    2010-05-13

    ... understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the... 2002 (Pub. L. 07-288); changes in overall burden for some forms based on actual usage statistics; and the requirement to publish changes to the Internet-based SCSEP Performance and Results QPR (SPARQ...

  18. Method of software development for tasks of automatic control systems for simulation and designing on the base of the technological systems design data

    International Nuclear Information System (INIS)

    Ajzatulin, A.I.

    2007-01-01

    The factors affecting the design of full-scale simulation facilities, the simulation of the design database, and the application of digital computerized process control systems are studied. Problems arising from errors in the process system design data, and methodological problems of algorithm simulation, are described. Drawing on the experience of designing the full-scale simulators of the Tianwan NPP and the Kudankulam NPP, a procedure is presented for developing new tools for simulation and for elaborating algorithms for computerized process control systems based on the process system design data. The basic components of the program system under development for simulation and design are listed and their functions are described. The results of its introduction are briefly described [ru]

  19. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  20. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP), (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different case studies regarding the analysis of

  1. A Context Dependent Automatic Target Recognition System

    Science.gov (United States)

    Kim, J. H.; Payton, D. W.; Olin, K. E.; Tseng, D. Y.

    1984-06-01

    This paper describes a new approach to automatic target recognizer (ATR) development utilizing artificial intelligence techniques. The ATR system exploits contextual information in its detection and classification processes to provide a high degree of robustness and adaptability. In the system, knowledge about domain objects and their contextual relationships is encoded in frames, separating it from low-level image processing algorithms. This knowledge-based system demonstrates an improvement over the conventional statistical approach through the exploitation of diverse forms of knowledge in its decision-making process.

  2. The automatic electromagnetic field generating system

    Science.gov (United States)

    Audone, B.; Gerbi, G.

    1982-07-01

    The technical study and the design approaches adopted for the definition of the automatic electromagnetic field generating system (AEFGS) dedicated to EMC susceptibility testing are presented. The AEFGS covers the frequency range 10 kHz to 40 GHz and operates successfully in the two EMC shielded chambers at ESTEC. The performance of the generator/amplifier subsystems, antenna selection, field amplitude and susceptibility feedback, and monitoring systems is described. System control modes which guarantee full AEFGS operability under different test conditions are discussed. Advantages of automating susceptibility testing include increased measurement accuracy and reduced testing cost.

  3. Automatic code generation for distributed robotic systems

    International Nuclear Information System (INIS)

    Jones, J.P.

    1993-01-01

    Hetero Helix is a software environment which supports relatively large robotic system development projects. The environment supports a heterogeneous set of message-passing LAN-connected common-bus multiprocessors, but the programming model seen by software developers is a simple shared memory. The conceptual simplicity of shared memory makes it an extremely attractive programming model, especially in large projects where coordinating a large number of people can itself become a significant source of complexity. We present results from three system development efforts conducted at Oak Ridge National Laboratory over the past several years. Each of these efforts used automatic software generation to create 10 to 20 percent of the system.

  4. Recent developments in the Los Alamos National Laboratory Plutonium Facility Waste Tracking System-automated data collection pilot project

    International Nuclear Information System (INIS)

    Martinez, B.; Montoya, A.; Klein, W.

    1999-01-01

    The waste management and environmental compliance group (NMT-7) at the Los Alamos National Laboratory has initiated a pilot project for demonstrating the feasibility and utility of automated data collection as a solution for tracking waste containers at the Los Alamos National Laboratory Plutonium Facility. This project, the Los Alamos Waste Tracking System (LAWTS), tracks waste containers during their lifecycle at the facility. LAWTS is a two-tiered system consisting of a server/workstation database and reporting engine and a hand-held data terminal-based client program for collecting data directly from tracked containers. New containers may be added to the system from either the client unit or from the server database. Once containers are in the system, they can be tracked through one of three primary transactions: Move, Inventory, and Shipment. Because LAWTS is a pilot project, it also serves as a learning experience for all parties involved. This paper will discuss many of the lessons learned in implementing a data collection system in the restricted environment. Specifically, the authors will discuss issues related to working with the PPT 4640 terminal system as the data collection unit. They will discuss problems with form factor (size, usability, etc.) as well as technical problems with wireless radio frequency functions. They will also discuss complications that arose from outdoor use of the terminal (barcode scanning failures, screen readability problems). The paper will conclude with a series of recommendations for proceeding with LAWTS based on experience to date.
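
The three transaction types named above (Move, Inventory, Shipment) can be sketched against a minimal in-memory container registry. The data model below is invented for illustration and is not the LAWTS schema.

```python
# Hypothetical sketch of the three LAWTS transaction types applied to an
# in-memory container registry. Container ids, locations and the record
# shape are illustrative placeholders, not the real LAWTS data model.

containers = {}   # container id -> {"location": str, "shipped": bool}

def add_container(cid, location):
    """New containers may be added from the client unit or the server."""
    containers[cid] = {"location": location, "shipped": False}

def move(cid, new_location):
    """Move transaction: record the container's new location."""
    containers[cid]["location"] = new_location

def inventory(location):
    """Inventory transaction: ids currently at a location, not yet shipped."""
    return sorted(cid for cid, rec in containers.items()
                  if rec["location"] == location and not rec["shipped"])

def ship(cid):
    """Shipment transaction: container leaves the facility's inventory."""
    containers[cid]["shipped"] = True
```

In the real system these updates would be captured on the hand-held terminal (e.g. by barcode scan) and synchronized with the server database.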

  5. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for the analysis of the gathered data.
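
The centralized aggregation step described above can be sketched as a simple tally of attack records per honeypot. The record format is invented for illustration; the real system parses honeypot logs delivered over the SSH tunnels.

```python
from collections import Counter

# Sketch of the centralized analysis step: records gathered from each
# honeypot node are tallied per attack type, yielding the per-network
# statistics shown in the web interface. The (node, attack_type) record
# format is a hypothetical simplification of the real log data.

def aggregate(records):
    """records: iterable of (honeypot_id, attack_type) pairs.
    Returns {honeypot_id: Counter of attack types}."""
    stats = {}
    for node, attack in records:
        stats.setdefault(node, Counter())[attack] += 1
    return stats
```

Counts like these are the statistical basis on which an attack-classification algorithm and cross-network comparisons can be built.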

  6. Water quality data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP), 1996 - 1998 (NODC Accession 0000789)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality data in 22 reserves in the United States and...

  7. Water quality, meteorological, and nutrient data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP), 1994 - 2005 (NODC Accession 0019215)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 25...

  8. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels, stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  9. An Automatic Indirect Immunofluorescence Cell Segmentation System

    Directory of Open Access Journals (Sweden)

    Yung-Kuan Chan

    2014-01-01

    Full Text Available Indirect immunofluorescence (IIF) with HEp-2 cells has been used for the detection of antinuclear autoantibodies (ANA) in systemic autoimmune diseases. ANA testing allows us to scan a broad range of autoantibody entities and to describe them by distinct fluorescence patterns. Automatic inspection of the fluorescence patterns in an IIF image can assist physicians without relevant experience in making a correct diagnosis. How to segment the cells from an IIF image is essential in developing an automatic inspection system for ANA testing. This paper focuses on cell detection and segmentation; an efficient method is proposed for automatically detecting the cells with fluorescence patterns in an IIF image. Cell culture is a process in which cells grow under control. Cell counting technology plays an important role in measuring the cell density in a culture tank. Moreover, assessing medium suitability, determining population doubling times, and monitoring cell growth in cultures all require a means of quantifying the cell population. The proposed method can also be used to count the cells in an image taken under a fluorescence microscope.
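
The counting idea above can be illustrated with a toy connected-component labeller: after thresholding a fluorescence image into a binary grid, each 4-connected foreground region is counted as one cell. Real IIF segmentation is far more involved; this only demonstrates the counting principle on a binary grid.

```python
from collections import deque

# Toy illustration of cell counting: a thresholded fluorescence image is
# represented as a 0/1 grid, and each 4-connected foreground component is
# counted as one cell via breadth-first flood fill. This is a teaching
# sketch, not the paper's segmentation method.

def count_cells(grid):
    """grid: list of lists of 0/1. Returns number of 4-connected components."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # new cell found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count
```

In practice, touching cells and uneven illumination make the segmentation step, separating the foreground into individual cells, the hard part of the problem.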

  10. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design is oriented towards reducing the stored information required to reproduce a drawing. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between consecutive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify the starting and end points of a line, so that connected lines in a drawing can be followed automatically. The advantage of the described method is that precision depends practically only on the plotter performance, the sensor resolution affecting only the thickness of strokes and the distance between two strokes. (author) [fr]

  11. Computerized ECT data analysis system

    International Nuclear Information System (INIS)

    Miyake, Y.; Fukui, S.; Iwahashi, Y.; Matsumoto, M.; Koyama, K.

    1988-01-01

    For the analysis of eddy current testing (ECT) data from steam generator tubes in nuclear power plants, the authors have developed a computerized ECT data analysis system using a large-scale computer with a high-resolution color graphic display. The system can store acquired ECT data for up to 15 steam generators, and ECT data can be analyzed immediately on the monitor in interactive dialogue with the computer. Analyzed ECT results are stored and registered in the database. The system enables an analyst to sort and collect data under various conditions, obtain the results automatically, and draw up a plan for tube repair work. The system has completed its test run and has been used for data analysis during the annual inspection of domestic plants. This paper describes an outline, features and examples of the computerized eddy current data analysis system for steam generator tubes in PWR nuclear power plants.

  12. Taxing the cloud: introducing a new taxation system on data collection?

    Directory of Open Access Journals (Sweden)

    Primavera De Filippi

    2013-05-01

    Full Text Available Cloud computing services are increasingly hosted on international servers and distributed amongst multiple data centres. Given their global scope, it is often easy for large multinational corporations to effectively circumvent old taxation schemes designed around the concepts of territorial jurisdiction and geographical settings. With a view to obtaining tax revenues from these online operators whose business is partially carried out in France, the French government recently issued a report emphasising the need for new taxation rules that would better reflect the way value is generated in the digital economy: at the international level, it is suggested that taxation should be calculated according to the place of interaction with end-users; at the national level, the report suggests introducing a transitory tax on data collection in order to promote innovation and encourage good online practices.

  13. Wireless data collection system for travel time estimation and traffic performance evaluation.

    Science.gov (United States)

    2010-09-01

    Having accurate and continually updated travel time and other performance data for the road and highway system has many benefits. From the perspective of the road users, having real-time updates on travel times will permit better travel and route pla...

  14. Gamma-ray spectrometry data collection and reduction by simple computing systems

    International Nuclear Information System (INIS)

    Op de Beeck, J.

    1975-01-01

    The review summarizes the present state of the involvement of relatively small computing devices in the collection and processing of gamma-ray spectrum data. An economic and utilitarian point of view has been chosen with regard to data collection, in order to arrive at practically valuable conclusions about the feasibility of possible configurations with respect to their eventual application. A unified point of view has been adopted with regard to data processing by developing an information-theoretical approach on a more or less intuitive level, in an attempt to remove the largest part of the apparent disparity between the several processing methods described in the literature. A synoptical introduction to the most important mathematical methods has been incorporated, together with a detailed theoretical description of the concept of a gamma-ray spectrum. In accordance with modern requirements, the discussions are mainly oriented towards high-resolution semiconductor detector spectra. The critical evaluation of the reviewed processing methods is done with respect to a set of predefined criteria. Smoothing, peak detection, peak intensity determination, overlapping-peak resolution, and detection and upper limits are discussed in great detail. A preferred spectrum analysis method combining powerful data reduction properties with extreme simplicity and speed of operation is suggested. The general discussion is heavily oriented towards activation analysis applications, but other disciplines making use of gamma-ray spectrometry will find the material presented equally useful. Final conclusions are given pointing to future developments, shifting the centre of gravity towards improving the quality of the measurements rather than expanding the use of tedious and sophisticated mathematical techniques that strain the limits of available computational power. (author)

  15. Fuzzy Logic Based Automatic Door Control System

    Directory of Open Access Journals (Sweden)

    Harun SUMBUL

    2017-12-01

    Full Text Available In this paper, a fuzzy logic based automatic door control system is designed to provide heat energy savings. Heat energy loss usually occurs where automatic doors are used. The input variables of the designed fuzzy logic system (WS: Walking Speed and DD: Distance to Door) and the output variable (DOS: Door Opening Speed) are determined. From these variables, a rule base of 25 rules is created; the rules are processed by fuzzy logic and applied to the control of an automatic door. An interface program is prepared using the Matlab Graphical User Interface (GUI) programming language, and sample results are checked in Matlab using the Fuzzy Logic Toolbox. The designed fuzzy logic controller is tested for different speed cases and the results are plotted. As a result, in this study we have obtained very good results in controlling an automatic door with fuzzy logic. The analyses indicate that control performed with fuzzy logic provides heat energy savings, less heat energy loss, and reliable, consistent control, and that it is feasible to implement in real applications.
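
A rough sketch of such a Mamdani-style controller is given below. The triangular membership functions and the reduced 9-rule base are hypothetical stand-ins; the record does not reproduce the paper's actual 25 rules or membership shapes:

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b (assumes a < b < c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical linguistic terms; the paper's actual memberships are not given.
WS = {"slow": (-0.5, 0.0, 1.0), "medium": (0.5, 1.25, 2.0), "fast": (1.5, 3.0, 4.5)}  # m/s
DD = {"near": (-1.0, 0.0, 2.0), "mid": (1.0, 3.0, 5.0), "far": (4.0, 8.0, 12.0)}      # m
DOS = {"slow": 0.2, "medium": 0.5, "fast": 0.9}   # normalized output centroids

RULES = {  # (walking speed, distance to door) -> door opening speed
    ("fast", "near"): "fast",   ("fast", "mid"): "fast",     ("fast", "far"): "medium",
    ("medium", "near"): "fast", ("medium", "mid"): "medium", ("medium", "far"): "slow",
    ("slow", "near"): "medium", ("slow", "mid"): "slow",     ("slow", "far"): "slow",
}

def door_opening_speed(ws, dd):
    """Mamdani-style inference with weighted-centroid defuzzification."""
    num = den = 0.0
    for (ws_t, dd_t), dos_t in RULES.items():
        w = min(tri(ws, *WS[ws_t]), tri(dd, *DD[dd_t]))  # AND = min
        num += w * DOS[dos_t]
        den += w
    return num / den if den else 0.0
```

A fast walker close to the door then yields a high normalized opening speed, while a slow walker far from the door yields a low one.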

  16. Travel time data collection handbook

    Science.gov (United States)

    1998-03-01

    This Travel Time Data Collection Handbook provides guidance to transportation : professionals and practitioners for the collection, reduction, and presentation : of travel time data. The handbook should be a useful reference for designing : travel ti...

  17. Real time psychrometric data collection

    International Nuclear Information System (INIS)

    McDaniel, K.H.

    1996-01-01

    Eight Mine Weather Stations (MWS) installed at the Waste Isolation Pilot Plant (WIPP) to monitor the underground ventilation system are helping to simulate real-time ventilation scenarios. Seasonal weather extremes can result in variations of Natural Ventilation Pressure (NVP) which can significantly affect the ventilation system. The eight MWSs (which previously collected and stored temperature, barometric pressure and relative humidity data for subsequent NVP calculations) were upgraded to provide continuous real-time data to the site-wide Central Monitoring System. This data can now be utilized by the ventilation engineer to create real-time ventilation simulations and trends which assist in the prediction and mitigation of NVP and psychrometric related events.

  18. Intelligent Management System of Power Network Information Collection Under Big Data Storage

    Directory of Open Access Journals (Sweden)

    Qin Yingying

    2017-01-01

    Full Text Available With the development of the economy and society, big data storage in enterprise management has become a problem that cannot be ignored. How to better manage and optimize the allocation of tasks is an important factor in the sustainable development of an enterprise. Intelligent management of enterprise information has become a hot topic in management practice in the information age, presenting information to business managers in a more efficient, lower-cost, and global form. The system uses the SG-UAP development tools, based on the Eclipse development environment and running on the Windows operating system, with Oracle as the database platform and Tomcat as the application server for network information services. The system uses an SOA (service-oriented architecture), provides RESTful services, and uses HTTP(S) as the communication protocol and JSON as the data format. The system is divided into two parts, the front-end and the back-end, and implements functions such as user login, registration, password retrieval, internal personnel information management and internal data display.

  19. An automatic drawing system for a report radioactive contamination check

    International Nuclear Information System (INIS)

    Saneyoshi, Keiji; Tomita, Satoru; Yoda, Isao

    2002-01-01

    An automatic drawing system for reports of surface contamination checks in a radiation controlled area has been developed. The system can print out the report in the format required by law from the raw data output by the measuring instruments. The worker only has to insert an FD (floppy disk) storing the data into a PC and push a button. The system also produces contamination maps that clearly indicate contamination points. With this system, the time needed to produce the report from the raw data was reduced from more than two hours to 4 minutes. (author)

  20. Design of USB/RS485 converter and its application in slow control data collection system of high energy physics

    International Nuclear Information System (INIS)

    Chen Xihui; Xie Song; Gao Cuishan; Xie Xiaoxi; Nie Zhendong; Zhang Yinhong; Gao Lu

    2005-01-01

    Most traditional data collection systems are based on RS232/485 converters. Such a system can only realize point-to-point connections, and its branches cannot work independently, which brings much inconvenience in debugging, installation and maintenance. On the other hand, the widely used Universal Serial Bus (USB) has many advantages, such as hot-plugging, easy extension, convenient installation and low consumption of system resources. If USB could be used in a data collection system, it would make the system much more convenient and its branches could work independently. The design of a USB-485 converter and its application are introduced in this paper. (authors)

  1. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    Due to great variation in events, conventional methods often fail to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  3. Estimating spatial travel times using automatic vehicle identification data

    Science.gov (United States)

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...

  4. Automatic Visualization of Software Requirements: Reactive Systems

    International Nuclear Information System (INIS)

    Castello, R.; Mili, R.; Tollis, I.G.; Winter, V.

    1999-01-01

    In this paper we present an approach that facilitates the validation of high-consequence system requirements. The approach consists of automatically generating a graphical representation from an informal document. Our chosen graphical notation is statecharts. We proceed in two steps: we first extract a hierarchical decomposition tree from a textual description, then we draw a graph that models the statechart in a hierarchical fashion. The resulting drawing is an effective requirements assessment tool that allows the end user to easily pinpoint inconsistencies and incompleteness.

  5. Automatic system for detecting pornographic images

    Science.gov (United States)

    Ho, Kevin I. C.; Chen, Tung-Shou; Ho, Jun-Der

    2002-09-01

    Due to the dramatic growth of network and multimedia technology, people can easily obtain a wide variety of information on the Internet. Unfortunately, this also makes the diffusion of illegal and harmful content much easier. It has therefore become an important topic for the Internet society to protect and safeguard Internet users, especially children, from content that may be encountered while surfing the Net. Among such content, pornographic images cause the most serious harm. In this study, we propose an automatic system to detect still colour pornographic images. Starting from this result, we plan to develop an automatic system to search for or to filter pornographic images. Almost all pornographic images share one common characteristic: the ratio of the size of the skin region to the non-skin region is high. Based on this characteristic, our system first converts the colour space from RGB to HSV so as to segment all possible skin-colour regions from the scene background. We also apply texture analysis to the selected skin-colour regions to separate skin regions from non-skin regions. Then we group the adjacent pixels located in skin regions. If the ratio is over a given threshold, the given image is classified as a possible pornographic image. In our experiments, less than 10% of non-pornographic images were classified as pornography, and over 80% of the most harmful pornographic images were classified correctly.
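
The skin-ratio idea can be sketched as follows. The HSV thresholds and the decision threshold are illustrative assumptions (the paper's exact ranges are not given), and the texture-analysis step is omitted:

```python
import colorsys

def skin_ratio(pixels):
    """Fraction of pixels whose HSV values fall in a (hypothetical) skin range.
    `pixels` is an iterable of (r, g, b) tuples with components in 0..255."""
    skin = total = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        # Illustrative skin-colour thresholds in HSV space.
        if h < 0.11 and 0.2 < s < 0.7 and v > 0.35:
            skin += 1
        total += 1
    return skin / total if total else 0.0

def looks_pornographic(pixels, threshold=0.5):
    """Flag an image when the skin-to-total pixel ratio exceeds a threshold."""
    return skin_ratio(pixels) > threshold
```

In the full system, the ratio would be computed only over grouped skin regions that survive texture analysis, rather than over raw pixels as here.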

  6. Ground-penetrating radar and differential global positioning system data collected from Long Beach Island, New Jersey, April 2015

    Science.gov (United States)

    Zaremba, Nicholas J.; Smith, Kathryn E.L.; Bishop, James M.; Smith, Christopher G.

    2016-08-04

    Scientists from the United States Geological Survey, St. Petersburg Coastal and Marine Science Center, U.S. Geological Survey Pacific Coastal and Marine Science Center, and students from the University of Hawaii at Manoa collected sediment cores, sediment surface grab samples, ground-penetrating radar (GPR) and Differential Global Positioning System (DGPS) data from within the Edwin B. Forsythe National Wildlife Refuge–Holgate Unit located on the southern end of Long Beach Island, New Jersey, in April 2015 (FAN 2015-611-FA). The study’s objective was to identify washover deposits in the stratigraphic record to aid in understanding barrier island evolution. This report is an archive of GPR and DGPS data collected from Long Beach Island in 2015. Data products, including raw GPR and processed DGPS data, elevation corrected GPR profiles, and accompanying Federal Geographic Data Committee metadata can be downloaded from the Data Downloads page.

  7. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enable students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
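
The first exercise type, calibration by linear regression plus quality parameters, can be sketched as below. The detection and quantification limits use the common 3.3·s/m and 10·s/m conventions, which may differ from the system's exact definitions:

```python
import math

def calibration_parameters(conc, signal):
    """Least-squares calibration line and common quality parameters.
    conc and signal are equal-length lists of standards and responses."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
    s_yx = math.sqrt(sum(r * r for r in residuals) / (n - 2))  # regression std error
    syy = sum((y - my) ** 2 for y in signal)
    r2 = 1.0 - sum(r * r for r in residuals) / syy             # determination coeff.
    return {"slope": slope, "intercept": intercept, "r2": r2,
            "LOD": 3.3 * s_yx / slope, "LOQ": 10.0 * s_yx / slope}
```

An automatic grader can then compare each student's reported slope, intercept, R² and detection limits against the values recomputed from that student's personalized data set.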

  8. Next Generation Data Collection System for Mobile Detection and Discrimination of Unexploded Ordnance

    Science.gov (United States)

    2008-09-10

    items have L11(t) ≥ L12(t) = L13(t), and aluminium items with cylindrical symmetry have L11(t) = L12(t) ≥ L13(t); 4) Time-decay characteristics of the...post-process the parallel data streams. The SPAN system time stamping is accurate to the 10s of nanoseconds, and the DAS is accurate to 10...from the SPAN disciplines the clock in the DAS, which allows the arrival of the SCS930 to be time-stamped with an accuracy of 10 μs (actually plus or

  9. Semi-automatic determination of dips and depths of geologic contacts from magnetic data with application to the Turi Fault System, Taranaki Basin, New Zealand

    Science.gov (United States)

    Caratori Tontini, Fabio; Blakely, Richard J.; Stagpoole, Vaughan; Seebeck, Hannu

    2018-03-01

    We show a simple and fast method for calculating geometric parameters of magnetic contacts from spatial gradients of magnetic field data. The method is based on well-established properties of the tangent of the tilt-angle of reduced-to-the-pole magnetic data, and extends the performance of existing methods by allowing direct estimation of the depths, locations and dips of magnetic contacts. It uses a semi-automatic approach in which the user interactively specifies points on magnetic maps where the calculation is to be performed. Some prior geologic knowledge and visual interpretation of magnetic anomalies are required to choose proper calculation points. We successfully tested the method on synthetic models of contacts at different depths and with different dip angles. We offer an example of the method applied to airborne magnetic data from the Taranaki Basin, located offshore of the North Island of New Zealand.
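
A sketch of the underlying tilt-angle relation: for an idealized vertical contact at x = 0 with its top at depth zc, the tilt of reduced-to-the-pole data satisfies tan(tilt) = x/zc (a Salem-et-al.-type relation assumed here), so the contact sits on the zero contour and the depth equals half the distance between the -45° and +45° contours. The dip estimation in the paper goes beyond this simple model:

```python
import numpy as np

def tilt_angle_deg(dT_dx, dT_dy, dT_dz):
    """Tilt angle from measured gradients: arctan of the vertical gradient
    over the total horizontal gradient, in degrees."""
    return np.degrees(np.arctan2(dT_dz, np.hypot(dT_dx, dT_dy)))

zc = 50.0                                # assumed contact depth, metres
x = np.linspace(-200.0, 200.0, 401)      # profile coordinate, 1 m spacing

# Modelled tilt along the profile from tan(tilt) = x / zc.
tilt = np.degrees(np.arctan2(x, zc))

# Contact location: zero contour of the tilt.
x0 = x[np.argmin(np.abs(tilt))]

# Depth estimate: half the distance between the -45 and +45 degree contours.
depth = 0.5 * (x[np.argmin(np.abs(tilt - 45.0))] - x[np.argmin(np.abs(tilt + 45.0))])
```

On real data, `tilt_angle_deg` would be evaluated from gridded field gradients and the same contour reading applied at user-selected points.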

  10. Making sense of the shadows: priorities for creating a learning healthcare system based on routinely collected data

    Science.gov (United States)

    Deeny, Sarah R; Steventon, Adam

    2015-01-01

    Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering ‘routinely collected data’—the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation. Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their ‘data shadow’. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data. PMID:26065466

  11. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    Science.gov (United States)

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.

  12. AUTOMATIC ROAD SIGN INVENTORY USING MOBILE MAPPING SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Soilán

    2016-06-01

    Full Text Available The periodic inspection of certain infrastructure features plays a key role for road network safety and preservation, and for developing optimal maintenance planning that minimize the life-cycle cost of the inspected features. Mobile Mapping Systems (MMS use laser scanner technology in order to collect dense and precise three-dimensional point clouds that gather both geometric and radiometric information of the road network. Furthermore, time-stamped RGB imagery that is synchronized with the MMS trajectory is also available. In this paper a methodology for the automatic detection and classification of road signs from point cloud and imagery data provided by a LYNX Mobile Mapper System is presented. First, road signs are detected in the point cloud. Subsequently, the inventory is enriched with geometrical and contextual data such as orientation or distance to the trajectory. Finally, semantic content is given to the detected road signs. As point cloud resolution is insufficient, RGB imagery is used projecting the 3D points in the corresponding images and analysing the RGB data within the bounding box defined by the projected points. The methodology was tested in urban and road environments in Spain, obtaining global recall results greater than 95%, and F-score greater than 90%. In this way, inventory data is obtained in a fast, reliable manner, and it can be applied to improve the maintenance planning of the road network, or to feed a Spatial Information System (SIS, thus, road sign information can be available to be used in a Smart City context.

  13. Modeling of a data exchange process in the Automatic Process Control System on the base of the universal SCADA-system

    Science.gov (United States)

    Topolskiy, D.; Topolskiy, N.; Solomin, E.; Topolskaya, I.

    2016-04-01

    In the present paper the authors discuss ways of solving energy saving problems in mechanical engineering. In the authors' opinion, one way of solving this problem is the integrated modernization of power engineering objects at mechanical engineering companies, intended to increase the efficiency of energy supply control and to improve commercial electric energy accounting. The authors propose the usage of digital current and voltage transformers for these purposes. To check the compliance of this equipment with the IEC 61850 International Standard, a mathematical model of the data exchange process between the measuring transformers and a universal SCADA-system was built. The results of modeling show that the discussed equipment meets the requirements of the Standard and that the usage of the universal SCADA-system for these purposes is preferable and economically reasonable. In the modeling the authors used the following software: MasterScada, Master OPC_DI_61850, OPNET.

  14. A new approach to configurable primary data collection.

    Science.gov (United States)

    Stanek, J; Babkin, E; Zubov, M

    2016-09-01

    The formats, semantics and operational rules of data processing tasks in genomics (and health in general) are highly divergent and can change rapidly. In such an environment, the problem of consistent transformation and loading of heterogeneous input data into various target repositories becomes a critical success factor. The objective of the project was to design a new conceptual approach to configurable data transformation, de-identification, and submission of health and genomic data sets. The main motivation was to facilitate automated or human-driven data uploading, as well as consolidation of heterogeneous sources in large genomic or health projects. Modern methods of on-demand specialization of generic software components were applied. For the specification of input-output data and required data collection activities, we propose a simple data model of flat tables as well as a domain-oriented graphical interface and a portable representation of transformations in XML. Using these methods, a prototype of the Configurable Data Collection System (CDCS) was implemented in the Java programming language with Swing graphical interfaces. The core logic of the transformations was implemented as a library of reusable plugins. The solution is implemented as a software prototype of a configurable service-oriented system for semi-automatic data collection, transformation, sanitization and safe uploading to heterogeneous data repositories (CDCS). To address the dynamic nature of data schemas and data collection processes, the CDCS prototype facilitates interactive, user-driven configuration of the data collection process and extends the basic functionality with a wide range of third-party plugins. Notably, our solution also allows for the reduction of manual data entry for data originally missing in the output data sets. First experiments and feedback from domain experts confirm that the prototype is flexible, configurable and extensible; runs well on data owners' systems; and is not dependent on

  15. A Machine Vision System for Automatically Grading Hardwood Lumber - (Industrial Metrology)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas T. Drayer; Philip A. Araman; Robert L. Brisbon

    1992-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  16. Microprocessor-controlled system for automatic acquisition of potentiometric data and their non-linear least-squares fit in equilibrium studies.

    Science.gov (United States)

    Gampp, H; Maeder, M; Zuberbühler, A D; Kaden, T A

    1980-06-01

    A microprocessor-controlled potentiometric titration apparatus for equilibrium studies is described. The microprocessor controls the stepwise addition of reagent, monitors the pH until it becomes constant and stores the constant value. The data are recorded on magnetic tape by a cassette recorder with an RS232 input-output interface. A non-linear least-squares program based on Marquardt's modification of the Newton-Gauss method is discussed and its performance in the calculation of equilibrium constants is exemplified. An HP 9821 desk-top computer accepts the data from the magnetic tape recorder. In addition to a fully automatic fitting procedure, the program allows manual adjustment of the parameters. Three examples are discussed with regard to performance and reproducibility.
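
A minimal sketch of a Marquardt-damped Newton-Gauss iteration of the kind described, applied to an illustrative exponential model rather than the paper's equilibrium-constant model:

```python
import numpy as np

def marquardt_fit(f, jac, x, y, p0, n_iter=100):
    """Minimal Marquardt-damped Newton-Gauss least-squares fit of f(x, p) to y."""
    p = np.asarray(p0, dtype=float)
    lam = 1e-3                                # damping parameter
    cost = np.sum((y - f(x, p)) ** 2)
    for _ in range(n_iter):
        r = y - f(x, p)                       # residuals
        J = jac(x, p)                         # Jacobian of the model
        JtJ = J.T @ J
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), J.T @ r)
        p_try = p + step
        cost_try = np.sum((y - f(x, p_try)) ** 2)
        if cost_try < cost:                   # accept: relax towards Newton-Gauss
            p, cost, lam = p_try, cost_try, lam * 0.3
        else:                                 # reject: increase damping
            lam *= 10.0
    return p

# Illustrative model y = p0 * exp(p1 * x) with noise-free synthetic data.
f = lambda x, p: p[0] * np.exp(p[1] * x)
jac = lambda x, p: np.column_stack([np.exp(p[1] * x), p[0] * x * np.exp(p[1] * x)])
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)
p = marquardt_fit(f, jac, x, y, p0=[1.0, -1.0])
```

The accept/reject rule on the damping factor is what lets the method interpolate between steepest descent (large lambda) and Newton-Gauss (small lambda), as in Marquardt's original modification.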

  17. Development of automatic techniques for GPS data management

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    It is necessary for a GPS center to establish automation for effective management of a GPS network, including data gathering, data transformation, data backup, data transmission to the IGS (International GPS Service for Geodynamics), and the gathering of precise ephemerides. The operating programs of GPS centers have been adopted at KCSC (Korea Cadastral Survey Corporation), NGI (National Geography Institute) and MOMAF (Ministry of Maritime Affairs and Fisheries) without in-house development of the core techniques. Automatic management of a GPS network consists of GPS data management and data processing. It is also a fundamental technique that should be mastered by every GPS center. Therefore, this study analyzed the Japanese GPS center, which has accomplished module-by-module automation, considering its applicability for domestic GPS centers.

  18. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Full Text Available Modern information and communication technologies, together with intelligent devices, provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data is not a challenge nowadays, but issues remain in storing and processing increasing amounts of data. In this paper we investigate the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to information resources of large volume, velocity and variety, far beyond the capabilities of commonly used software for storing, processing and managing data. In our case study, the Apache™ Hadoop® Big Data platform was used to process data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load calculated on the Big Data platform were presented in tables and graphs in Microsoft Office Excel. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in the monitoring of traffic management systems.
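
The counter aggregation itself is a classic map/reduce pattern. A toy Python equivalent of what a Hadoop job would do is sketched below; the counter IDs, dates and vehicle counts are invented for illustration:

```python
from collections import defaultdict

# (counter_id, date, vehicle_count) records, as an automatic counter might emit.
records = [
    ("NS-01", "2015-06-01", 12400), ("NS-01", "2015-06-02", 11800),
    ("NS-01", "2015-06-03", 13100), ("NS-02", "2015-06-01", 8400),
    ("NS-02", "2015-06-02", 8900),
]

def map_phase(recs):
    """Map: emit (counter_id, count) key-value pairs."""
    for counter, _date, count in recs:
        yield counter, count

def reduce_phase(pairs):
    """Reduce: average daily traffic per counter (a simple load indicator)."""
    sums, days = defaultdict(int), defaultdict(int)
    for counter, count in pairs:
        sums[counter] += count
        days[counter] += 1
    return {c: sums[c] / days[c] for c in sums}

adt = reduce_phase(map_phase(records))
```

On the Hadoop platform the same two phases would run distributed over HDFS blocks; the resulting indicator table is what gets exported to Excel for visualization.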

  19. Automatic Door Access System Using Face Recognition

    Directory of Open Access Journals (Sweden)

    Hteik Htar Lwin

    2015-06-01

    Full Text Available Abstract Most doors are controlled by persons using keys, security cards, passwords or patterns to open the door. The aim of this paper is to help users improve the door security of sensitive locations by using face detection and recognition. The face is a complex multidimensional structure and needs good computing techniques for detection and recognition. This paper comprises three main subsystems: face detection, face recognition and automatic door access control. Face detection is the process of detecting the region of the face in an image; the face is detected using the Viola-Jones method. Face recognition is implemented using Principal Component Analysis (PCA); face recognition based on PCA is generally referred to as the use of eigenfaces. If a face is recognized, it is known; otherwise it is unknown. The door opens automatically for a known person on command of the microcontroller, while an alarm rings for an unknown person. Since PCA reduces the dimensions of face images without losing important features, facial images for many persons can be stored in the database, and even when many training images are used, computational efficiency does not decrease significantly. Therefore, face recognition using PCA can be more useful for a door security system than other face recognition schemes.
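
The eigenfaces pipeline can be sketched as follows. The SVD-based formulation and the distance threshold for declaring a face "unknown" are standard choices, not necessarily the paper's exact ones:

```python
import numpy as np

def train_eigenfaces(X, k):
    """X: (n_images, n_pixels) matrix of flattened face images.
    Returns the mean face, the top-k eigenfaces, and the training projections."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Rows of Vt from the SVD of the centered data are the principal components.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k]                       # k eigenfaces
    return mean, W, Xc @ W.T         # projections of the training images

def recognize(face, mean, W, proj, labels, threshold=np.inf):
    """Project a face into eigenface space and return the label of the nearest
    training projection, or None ('unknown' -> ring the alarm) if the nearest
    distance exceeds the threshold."""
    q = (face - mean) @ W.T
    d = np.linalg.norm(proj - q, axis=1)
    i = int(np.argmin(d))
    return labels[i] if d[i] <= threshold else None
```

In the door system, a recognized label would trigger the microcontroller's open command, while a `None` result would trigger the alarm.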

  20. (a,k)-Anonymous Scheme for Privacy-Preserving Data Collection in IoT-based Healthcare Services Systems.

    Science.gov (United States)

    Li, Hongtao; Guo, Feng; Zhang, Wenyin; Wang, Jie; Xing, Jinsheng

    2018-02-14

    The wide use of IoT technologies in healthcare services has raised the medical intelligence level of those services. However, it also brings a potential privacy threat to data collection. In healthcare services systems, health and medical data that contain private information are often transmitted across networks, and such private information should be protected. Therefore, there is a need for a privacy-preserving data collection (PPDC) scheme to protect clients' (patients') data. We adopt the (a,k)-anonymity model as the privacy-protection scheme for data collection, and propose a novel anonymity-based PPDC method for healthcare services in this paper. The threat model is analyzed in the client-server-to-user (CS2U) model. On the client side, we utilize the (a,k)-anonymity notion to generate anonymous tuples that can resist possible attacks, and adopt a bottom-up clustering method to create clusters that satisfy a base privacy level of (a1,k1)-anonymity. On the server side, we reduce the communication cost through generalization technology, and compress the (a1,k1)-anonymous data through a UPGMA-based cluster combination method to make the data meet the deeper privacy level of (a2,k2)-anonymity (a1 ≥ a2, k2 ≥ k1). Theoretical analysis and experimental results prove that our scheme is effective in privacy preservation and data quality.
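
    The (a,k)-anonymity property itself is easy to check directly: every group of records sharing the same quasi-identifier values must contain at least k records, and no sensitive value may exceed a fraction a of any such group. The checker below is a plain-Python sketch of that condition; the table, column names and thresholds are illustrative, not from the paper.

```python
from collections import Counter, defaultdict

def satisfies_ak_anonymity(records, qi_cols, sensitive_col, a, k):
    """Return True if every quasi-identifier group has >= k records
    and no sensitive value makes up more than fraction `a` of it."""
    groups = defaultdict(list)
    for rec in records:
        key = tuple(rec[c] for c in qi_cols)
        groups[key].append(rec[sensitive_col])
    for sensitive_values in groups.values():
        if len(sensitive_values) < k:
            return False
        counts = Counter(sensitive_values)
        if max(counts.values()) / len(sensitive_values) > a:
            return False
    return True

table = [
    {"age": "30-40", "zip": "481**", "disease": "flu"},
    {"age": "30-40", "zip": "481**", "disease": "flu"},
    {"age": "30-40", "zip": "481**", "disease": "asthma"},
]
print(satisfies_ak_anonymity(table, ["age", "zip"], "disease", a=0.7, k=3))  # True
```

    The paper's clustering and UPGMA combination steps are ways of grouping and coarsening tuples until a check like this passes at the desired (a,k) level.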

  1. An automatic injection system for rapid radiochemistry

    International Nuclear Information System (INIS)

    Nurmia, M.J.; Kreek, S.A.; Kadkhodayan, B.; Gregorich, K.E.; Lee, D.M.; Hoffman, D.C.

    1992-01-01

    A description is given of the Automated Injection System (AIS), a pneumatically actuated device for automated collection of nuclear reaction products from a He/KCl gas jet transport system. The AIS is used with the Automated Chemical Chromatographic Element Separation System; together these two devices facilitate completely automated separation procedures with improved speed and reproducibility

  2. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

    A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a comput...

  3. The TS 600: automatic control system for eddy currents

    International Nuclear Information System (INIS)

    Poulet, J.P.

    1986-10-01

    As part of fabrication and in-service inspection of PWR steam generator tube bundles, FRAMATOME developed an automatic eddy-current testing system, the TS600. Based on a mini-computer, the TS600 digitizes, stores and processes data in various ways, making it possible to perform several kinds of inspection: conventional in-service inspection, roll-area profilometry... The TS600 can also be used to develop new methods of examination [fr

  4. Human-system Interfaces for Automatic Systems

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Higgins, J. (BNL); Fleger, S.; Barnes, V. (NRC)

    2010-11-07

    Automation is ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Automation is applied to a wide range of functions including monitoring and detection, situation assessment, response planning, and response implementation. Automation has become a 'team player' supporting personnel in nearly all aspects of system operation. In light of its increasing use and importance in new and future plants, guidance is needed to conduct safety reviews of the operator's interface with automation. The objective of this research was to develop such guidance. We first characterized the important HFE aspects of automation, including six dimensions: levels, functions, processes, modes, flexibility, and reliability. Next, we reviewed literature on the effects of all of these aspects of automation on human performance, and on the design of human-system interfaces (HSIs). Then, we used this technical basis established from the literature to identify general principles for human-automation interaction and to develop review guidelines. The guidelines consist of the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, our study identified several topics for additional research.

  5. Human-system Interfaces for Automatic Systems

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Higgins, J.; Fleger, S.; Barnes, V.

    2010-01-01

    Automation is ubiquitous in modern complex systems, and commercial nuclear power plants are no exception. Automation is applied to a wide range of functions including monitoring and detection, situation assessment, response planning, and response implementation. Automation has become a 'team player' supporting personnel in nearly all aspects of system operation. In light of its increasing use and importance in new and future plants, guidance is needed to conduct safety reviews of the operator's interface with automation. The objective of this research was to develop such guidance. We first characterized the important HFE aspects of automation, including six dimensions: levels, functions, processes, modes, flexibility, and reliability. Next, we reviewed literature on the effects of all of these aspects of automation on human performance, and on the design of human-system interfaces (HSIs). Then, we used this technical basis established from the literature to identify general principles for human-automation interaction and to develop review guidelines. The guidelines consist of the following seven topics: automation displays, interaction and control, automation modes, automation levels, adaptive automation, error tolerance and failure management, and HSI integration. In addition, our study identified several topics for additional research.

  6. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    Full Text Available The watershed, as the basic territorial unit for planning and management of water resources, requires proper delimitation of its catchment or drainage area. Given the lack of geographic information on the micro-watersheds of the Casacay river hydrographic unit, this research was aimed at the automatic delimitation of micro-watersheds using Geographic Information System (GIS) techniques and data from the Shuttle Radar Topography Mission (SRTM) at 30 meters spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro-watersheds were obtained with their respective codification, allowing continuation of the watershed standardization adopted by Ecuador's Water Secretariat. With the results of this investigation, watershed information will be updated in more detail, promoting the execution of tasks and activities related to the integrated management of the hydrographic unit studied
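
    The article does not detail the delineation algorithm itself, but a standard first step when delimiting watersheds from an SRTM elevation grid is computing D8 flow directions: each cell drains toward its steepest-descent neighbor. The sketch below illustrates that rule on a tiny synthetic DEM; it is an assumption about the usual GIS workflow, not the paper's exact procedure.

```python
import numpy as np

def d8_flow_direction(dem):
    """For each interior cell of a DEM grid, return the (drow, dcol)
    step toward the steepest-descent neighbor (the D8 rule used as a
    first step when delineating watersheds from elevation data)."""
    rows, cols = dem.shape
    direction = np.zeros((rows, cols, 2), dtype=int)
    neighbors = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = []
            for dr, dc in neighbors:
                dist = np.hypot(dr, dc)          # diagonal steps are longer
                drops.append(((dem[r, c] - dem[r + dr, c + dc]) / dist, (dr, dc)))
            best_drop, step = max(drops)
            if best_drop > 0:                    # only record true descents
                direction[r, c] = step
    return direction

dem = np.array([[5, 5, 5],
                [5, 4, 5],
                [5, 3, 5]], dtype=float)
print(tuple(int(v) for v in d8_flow_direction(dem)[1, 1]))   # (1, 0): center drains south
```

    Once every cell has a flow direction, tracing the directions upstream from an outlet yields the contributing cells, i.e. the delimited micro-watershed.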

  7. Feasibility and desirability study of implementing a duty-cycle data-collection system as part of NPRDS

    International Nuclear Information System (INIS)

    Van Howe, K.R.; Koppe, R.H.; Voegtle, R.B.; Kline, S.C.; Olson, E.A.J.

    The objective of this project was to investigate cost-effective ways to improve the Nuclear Plant Reliability Data System (NPRDS) failure statistics as they are affected by the component-usage question. The nominal way to improve these statistics is to record, and to some extent describe, each operation of a safety component as well as each failure. Failures per demand or failures per operating hour could then be determined. It was recognized that such component-usage data collection for even a small number of key components could be costly or otherwise impractical. The reporting requirements for nuclear units are already burdensome and frequently redundant or overlapping. Therefore, the desirability of these information retrieval plans, in terms of the expected burdens and benefits, was also a consideration to be addressed, in addition to the feasibility of actually implementing such data collection

  8. National Collegiate Athletic Association Injury Surveillance System: Review of Methods for 2004–2005 Through 2013–2014 Data Collection

    Science.gov (United States)

    Kerr, Zachary Y.; Dompier, Thomas P.; Snook, Erin M.; Marshall, Stephen W.; Klossner, David; Hainline, Brian; Corlette, Jill

    2014-01-01

    Background: Since 1982, the National Collegiate Athletic Association has used the Injury Surveillance System (ISS) to collect injury and athlete-exposure data from a representative sample of collegiate institutions and sports. At the start of the 2004–2005 academic year, a Web-based ISS replaced the paper-based platform previously used for reporting injuries and exposures. Objective: To describe the methods of the Web-based National Collegiate Athletic Association ISS for data collection as implemented from the 2004–2005 to 2013–2014 academic years. Description: The Web-based ISS monitored National Collegiate Athletic Association–sanctioned practices and competitions, the number of participating student–athletes, and time-loss injuries during the preseason, regular season, and postseason in 25 collegiate sports. Starting in the 2009–2010 academic year, non–time-loss injuries were also tracked. Efforts were made to better integrate ISS data collection into the workflow of collegiate athletic trainers. Data for the 2004–2005 to 2013–2014 academic years are available to researchers through a standardized application process available at the Datalys Center Web site. Conclusions: As of February 2014, more than 1 dozen data sets have been provided to researchers. The Datalys Center encourages applications for access to the data. PMID:24870292

  9. National collegiate athletic association injury surveillance system: review of methods for 2004-2005 through 2013-2014 data collection.

    Science.gov (United States)

    Kerr, Zachary Y; Dompier, Thomas P; Snook, Erin M; Marshall, Stephen W; Klossner, David; Hainline, Brian; Corlette, Jill

    2014-01-01

    Since 1982, the National Collegiate Athletic Association has used the Injury Surveillance System (ISS) to collect injury and athlete-exposure data from a representative sample of collegiate institutions and sports. At the start of the 2004-2005 academic year, a Web-based ISS replaced the paper-based platform previously used for reporting injuries and exposures. To describe the methods of the Web-based National Collegiate Athletic Association ISS for data collection as implemented from the 2004-2005 to 2013-2014 academic years. The Web-based ISS monitored National Collegiate Athletic Association-sanctioned practices and competitions, the number of participating student-athletes, and time-loss injuries during the preseason, regular season, and postseason in 25 collegiate sports. Starting in the 2009-2010 academic year, non-time-loss injuries were also tracked. Efforts were made to better integrate ISS data collection into the workflow of collegiate athletic trainers. Data for the 2004-2005 to 2013-2014 academic years are available to researchers through a standardized application process available at the Datalys Center Web site. As of February 2014, more than 1 dozen data sets have been provided to researchers. The Datalys Center encourages applications for access to the data.

  10. A computer-assisted data collection system for use in a multicenter study of American Indians and Alaska Natives: SCAPES.

    Science.gov (United States)

    Edwards, Roger L; Edwards, Sandra L; Bryner, James; Cunningham, Kelly; Rogers, Amy; Slattery, Martha L

    2008-04-01

    We describe a computer-assisted data collection system developed for a multicenter cohort study of American Indian and Alaska Native people. The Study Computer-Assisted Participant Evaluation System, or SCAPES, is built around a central database server that controls a small private network with touch screen workstations. SCAPES encompasses the self-administered questionnaires, the keyboard-based stations for interviewer-administered questionnaires, a system for inputting medical measurements, and administrative tasks such as data exporting, backup and management. Elements of SCAPES hardware/network design, data storage, programming language, software choices, questionnaire programming including the programming of questionnaires administered using audio computer-assisted self-interviewing (ACASI), and participant identification/data security system are presented. Unique features of SCAPES are that data are promptly made available to participants in the form of health feedback; data can be quickly summarized for tribes for health monitoring and planning at the community level; and data are available to study investigators for analyses and scientific evaluation.

  11. A computer-assisted data collection system for use in a multicenter study of American Indians and Alaska Natives: SCAPES

    Science.gov (United States)

    Edwards, Roger L; Bryner, James; Cunningham, Kelly; Rogers, Amy; Slattery, Martha L.

    2008-01-01

    We describe a computer-assisted data collection system developed for a multicenter cohort study of American Indian and Alaska Natives. The Study Computer-Assisted Participant Evaluation System or SCAPES is built around a central database server that controls a small private network with touch screen workstations. SCAPES encompasses the self-administered questionnaires, the keyboard-based stations for interviewer-administered questionnaires, a system for inputting medical measurements, and administrative tasks such as data exporting, backup and management. Elements of SCAPES hardware/network design, data storage, programming language, software choices, questionnaire programming including the programming of questionnaires administered using audio computer-assisted self interviewing (ACASI), and participant identification/data security system are presented. Unique features of SCAPES are that data are promptly made available to participants in the form of health feedback; data can be quickly summarized for tribes for health monitoring and planning at the community level; and data are available to study investigators for analyses and scientific evaluation. PMID:18207604

  12. An automatic monitoring system of leak current for testing TGC detectors based on LabVIEW

    International Nuclear Information System (INIS)

    Feng Cunfeng; Lu Taiguo; Yan Zhen; Wang Suojie; Zhu Chengguang; Sun Yansheng; He Mao

    2005-01-01

    An automatic monitoring system of leak current for testing TGC detectors under high voltage was set up using the graphical LabVIEW platform and an NI 4351 data acquisition card. The leak current was automatically monitored and recorded with this system, and the time and value of the leak current were displayed instantly. Good monitoring efficiency and precision were obtained. (authors)

  13. An overview of future EU health systems. An insight into governance, primary care, data collection and citizens' participation.

    Science.gov (United States)

    Quaglio, Gianluca; Figueras, Josep; Mantoan, Domenico; Dawood, Amr; Karapiperis, Theodoros; Costongs, Caroline; Bernal-Delgado, Enrique

    2018-03-26

    Health systems in the European Union (EU) are being questioned over their effectiveness and sustainability. In pursuing both goals, they have to reconcile coexisting, not always aligned, realities. This paper originated from a workshop entitled 'Health systems for the future' held at the European Parliament. Experts and decision makers were asked to discuss measures that may increase the effectiveness and sustainability of health systems, namely: (i) increasing citizens' participation; (ii) strengthening the role of primary care in providing integrated services; (iii) improving governance; and (iv) fostering better data collection and information channels to support decision making. The parliamentary debate discussed the view that, in the near future, the effectiveness and sustainability of health systems will largely depend on effective access to integrated services in which primary care is pivotal, a clearer shift from care-oriented systems to health promotion and prevention, a profound commitment to good governance, particularly to stakeholder participation, and a systematic reuse of data to build health data-driven learning systems. Many health issues, such as the future of health systems in the EU, are potentially transformative and hence intensely political. It is policy-making leadership that will mostly determine how well EU health systems are prepared to face future challenges.

  14. Using a participatory evaluation design to create an online data collection and monitoring system for New Mexico's Community Health Councils.

    Science.gov (United States)

    Andrews, M L; Sánchez, V; Carrillo, C; Allen-Ananins, B; Cruz, Y B

    2014-02-01

    We present the collaborative development of a web-based data collection and monitoring plan for thirty-two county councils within New Mexico's health council system. The monitoring plan, a key component in our multiyear participatory statewide evaluation process, was co-developed with the end users: representatives of the health councils. Guided by the Institute of Medicine's Community Health Improvement Process framework, we first developed a logic model that delineated processes and intermediate systems-level outcomes in council development, planning, and community action. Through the online system, health councils reported data on intermediate outcomes, including policy changes and funds leveraged. The system captured data that were common across the health council system, yet was also flexible enough that councils could report their unique accomplishments at the county level. A main benefit of the online system was the ability to assess intermediate outcomes across the health council system. Developing the system was not without challenges, including creating processes to ensure participation across a large rural state, creating shared understanding of intermediate outcomes and indicators, and overcoming technological issues. Even with these challenges, however, the benefits of committing to participatory processes far outweighed the difficulties. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. 2013 International Conference on Mechatronics and Automatic Control Systems

    CERN Document Server

    2014-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanical manufacturing systems and automation. The book presents papers from the 2013 International Conference on Mechatronics and Automatic Control Systems held in Hangzhou, China on August 10-11, 2013.

  16. System for Automatic Generation of Examination Papers in Discrete Mathematics

    Science.gov (United States)

    Fridenfalk, Mikael

    2013-01-01

    A system was developed for automatic generation of problems and solutions for examinations in a university distance course in discrete mathematics and tested in a pilot experiment involving 200 students. Considering the success of such systems in the past, particularly including automatic assessment, it should not take long before such systems are…

  17. Collective Analysis of Qualitative Data

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Friberg, Karin

    2014-01-01

    What. Many students and practitioners do not know how to systematically process qualitative data once it is gathered—at least not as a collective effort. This chapter presents two workshop techniques, affinity diagramming and diagnostic mapping, that support collective analysis of large amounts... of qualitative data. Affinity diagramming is used to make collective analysis and interpretations of qualitative data to identify core problems that need to be addressed in the design process. Diagnostic mapping supports collective interpretation and description of these problems and how to intervene in them. We... In particular, collective analysis can be used to identify, understand, and act on complex design problems that emerge, for example, after the introduction of new technologies. Such problems might be hard to clarify, and the basis for the analysis often involves large amounts of unstructured qualitative data...

  18. Data Collection and New Technology

    OpenAIRE

    Olubunmi Philip Aborisade

    2013-01-01

    Interview has become a popular method of data collection in qualitative research. This article examines the different interview methods for collecting data (e.g., structured interviews, group interviews, unstructured, etc.), as well as the various methods for analyzing interview data (e.g., interpretivism, social anthropology, collaborative social research). It also evaluates the interview types and analysis methods in qualitative research and the new technology for conducting interviews such...

  19. Procurement procedures and specifications for performance measure capable traffic infrastructure data collection systems.

    Science.gov (United States)

    2012-01-01

    Traffic signal systems represent a substantial component of the highway transportation network in the United : States. It is challenging for most agencies to find engineering resources to properly update signal policies and : timing plans to accommod...

  20. Procurement procedures and specifications for performance measure capable traffic infrastructure data collection systems : [technical summary].

    Science.gov (United States)

    2011-01-01

    Traffic signal systems represent a substantial component of the highway transportation network in the United States. Unfortunately, most agencies struggle to meet the challenge of finding enough resources to properly update signal policies and timing...

  1. Collection of Operating and Support Data from Weapon System Support Contracts

    Science.gov (United States)

    2008-08-01

    Major Associate Contractors or Subcontractors: General Electric (engines); Honeywell (auxiliary power unit); Smiths (stores management system...; Aircraft Industries; Moto Guzzi (engine). Scope of Effort: Northrop Grumman provides comprehensive contractor logistics support for the Hunter UAV, which

  2. Travel time data collection for measurement of advanced traveler information systems accuracy

    Science.gov (United States)

    2003-06-01

    The objective of this white paper is to recommend an approach to measuring ATIS travel time accuracy so that ITS planners might have the data they need to make cost effective decisions regarding deployment of surveillance technologies to support ATIS...

  3. Data collection system. Volume 1, Overview and operators manual; Volume 2, Maintenance manual; Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, R.B.; Bauder, M.E.; Boyer, W.B.; French, R.E.; Isidoro, R.J.; Kaestner, P.C.; Perkins, W.G.

    1993-09-01

    Sandia National Laboratories (SNL) Instrumentation Development Department was tasked by the Defense Nuclear Agency (DNA) to record data on Tektronix RTD720 digitizers for the HUNTERS TROPHY field test conducted at the Nevada Test Site (NTS) on September 18, 1992. This report contains an overview and description of the computer hardware and software that were used to acquire, reduce, and display the data. The document is divided into two volumes: an overview and operators manual (Volume 1) and a maintenance manual (Volume 2).

  4. Automatic fiber Bragg grating fabrication system for mass production

    Science.gov (United States)

    Wang, Yunmiao; Gong, Jianmin; Dong, Bo; Wang, Dorothy Y.; Wang, Anbo

    2011-06-01

    The large multiplexing capability of FBGs creates a need for an effective and repeatable fabrication method. In this paper we report the development of an automatic FBG fabrication system that meets the requirements of mass production. There are four major functional parts in the system: a fiber feeding system, a CO2 laser coating removal system, an FBG writing system and a fiber collecting system. The fiber feeding system uses motors and gears to accurately move an optical fiber to where the FBGs will be made. The coating removal system is based on the heating effect of a CO2 laser, which decomposes and evaporates the selected coating of the optical fiber. The FBG writing system is based on the UV photosensitivity of the fiber. A phase mask is placed between the UV light and the optical fiber to produce a periodic interference pattern, which in turn modulates the refractive index along the fiber periodically. The fiber collecting system is driven by a linear motor, and the fiber can be wound around a spool tightly and smoothly at a moderate speed. The whole FBG fabrication system is controlled and synchronized by a computer via interface circuits and a Graphical User Interface (GUI). With this system, it takes 48 seconds to fabricate one FBG, and up to 500 FBGs can be made continuously, a limit set by the leakage of the gas inside the excimer laser. This mass production line not only improves fabrication efficiency but also contributes to multiplexing capability by reducing splicing loss.

  5. Automatic Battery Swap System for Home Robots

    Directory of Open Access Journals (Sweden)

    Juan Wu

    2012-12-01

    Full Text Available This paper presents the design and implementation of an automatic battery swap system for the prolonged activities of home robots. A battery swap station is proposed to implement battery off-line recharging and on-line exchanging functions. It consists of a loading and unloading mechanism, a shifting mechanism, a locking device and a shell. The home robot is a palm-sized wheeled robot with an onboard camera and a removable battery case in the front. It communicates with the battery swap station wirelessly through ZigBee. The influences of battery case deflection and robot docking deflection on the battery swap operations have been investigated. The experimental results show that it takes an average time of 84.2s to complete the battery swap operations. The home robot does not have to wait several hours for the batteries to be fully charged. The proposed battery swap system is proved to be efficient in home robot applications that need the robots to work continuously over a long period.

  6. Ground penetrating radar and differential global positioning system data collected in April 2016 from Fire Island, New York

    Science.gov (United States)

    Forde, Arnell S.; Bernier, Julie C.; Miselis, Jennifer L.

    2018-02-22

    Researchers from the U.S. Geological Survey (USGS) conducted a long-term coastal morphologic-change study at Fire Island, New York, prior to and after Hurricane Sandy impacted the area in October 2012. The Fire Island Coastal Change project objectives include understanding the morphologic evolution of the barrier island system on a variety of time scales (months to centuries) and resolving storm-related impacts, post-storm beach response, and recovery. In April 2016, scientists from the USGS St. Petersburg Coastal and Marine Science Center conducted geophysical and sediment sampling surveys on Fire Island to characterize and quantify spatial variability in the subaerial geology with the goal of subsequently integrating onshore geology with other surf zone and nearshore datasets.  This report, along with the associated USGS data release, serves as an archive of ground penetrating radar (GPR) and post-processed differential global positioning system (DGPS) data collected from beach and back-barrier environments on Fire Island, April 6–13, 2016 (USGS Field Activity Number 2016-322-FA). Data products, including unprocessed GPR trace data, processed DGPS data, elevation-corrected subsurface profile images, geographic information system files, and accompanying Federal Geographic Data Committee metadata are available for download.

  7. Distributed privacy preserving data collection

    KAUST Repository

    Xue, Mingqiang

    2011-01-01

    We study the distributed privacy preserving data collection problem: an untrusted data collector (e.g., a medical research institute) wishes to collect data (e.g., medical records) from a group of respondents (e.g., patients). Each respondent owns a multi-attributed record which contains both non-sensitive (e.g., quasi-identifiers) and sensitive information (e.g., a particular disease), and submits it to the data collector. Assuming T is the table formed by all the respondent data records, we say that the data collection process is privacy preserving if it allows the data collector to obtain a k-anonymized or l-diversified version of T without revealing the original records to the adversary. We propose a distributed data collection protocol that outputs an anonymized table by generalization of quasi-identifier attributes. The protocol employs cryptographic techniques such as homomorphic encryption, private information retrieval and secure multiparty computation to ensure the privacy goal in the process of data collection. Meanwhile, the protocol is designed to leak limited but non-critical information to achieve practicability and efficiency. Experiments show that the utility of the anonymized table derived by our protocol is on par with the utility achieved by traditional anonymization techniques. © 2011 Springer-Verlag.
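
    The cryptographic machinery (homomorphic encryption, private information retrieval, secure multiparty computation) is beyond a short sketch, but the generalization step the protocol's output relies on can be illustrated in plaintext: coarsen a quasi-identifier until every value occurs at least k times. The single-attribute zip-code version below, with made-up zip codes and a simple suffix-suppression hierarchy, is only an illustration of that idea, not the protocol itself.

```python
from collections import Counter

def generalize_zip(zipcode, level):
    # Replace the last `level` digits with '*' (a simple hierarchy)
    return zipcode[:len(zipcode) - level] + "*" * level

def generalize_until_k_anonymous(zips, k):
    """Coarsen the zip-code quasi-identifier until every value occurs
    at least k times (a one-attribute version of the generalization
    step; the real protocol does this under cryptographic cover)."""
    for level in range(len(zips[0]) + 1):
        coarse = [generalize_zip(z, level) for z in zips]
        if min(Counter(coarse).values()) >= k:
            return coarse, level
    return ["*" * len(zips[0])] * len(zips), len(zips[0])

zips = ["48109", "48104", "48105", "48220"]
coarse, level = generalize_until_k_anonymous(zips, k=2)
print(level, coarse)
# 3 ['48***', '48***', '48***', '48***']
```

    The protocol's contribution is performing an equivalent coarsening without any party ever seeing another respondent's original record.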

  8. EPA Linked Open Data (Collection)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This is a collection item referencing the following EPA Linked Data resources: - EPA Facility Registry Service (FRS) - EPA Substance Registry Service (SRS) -...

  9. Water Column Sonar Data Collection

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The collection and analysis of water column sonar data is a relatively new avenue of research into the marine environment. Primary uses include assessing biological...

  10. An automated digital data collection and analysis system for the Charpy Impact Tester

    Science.gov (United States)

    Kohne, Glenn S.; Spiegel, F. Xavier

    1994-04-01

    The standard Charpy Impact Tester has been modified by the addition of a system of hardware and software to improve the accuracy and consistency of measurements made during specimen fracturing experiments. An optical disc, light source, and detector generate signals that indicate the pendulum position as a function of time. These signals are used by a computer to calculate the velocity and kinetic energy of the pendulum as a function of its position.
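
    The pendulum-position signals described above let the software compute velocity and energy directly from geometry: the pendulum's kinetic energy at the bottom of a free swing equals the potential energy released, and the energy absorbed by the specimen follows from the difference between the drop angle and the rise angle after fracture. The sketch below uses hypothetical pendulum parameters (mass, arm length), not those of the instrument in the article.

```python
import math

# Hypothetical pendulum parameters (not from the article)
MASS = 20.0      # kg, effective pendulum mass
LENGTH = 0.8     # m, pivot to center of strike
G = 9.81         # m/s^2

def absorbed_energy(theta_drop, theta_rise):
    """Energy absorbed by the specimen, from the height difference
    between the release angle and the follow-through rise angle
    (angles in radians, measured from the bottom of the swing)."""
    return MASS * G * LENGTH * (math.cos(theta_rise) - math.cos(theta_drop))

def kinetic_energy(theta_prev, theta_next, dt):
    """Kinetic energy at a sampled position, with angular velocity
    estimated by a central difference over two encoder readings."""
    omega = (theta_next - theta_prev) / (2 * dt)
    inertia = MASS * LENGTH ** 2        # point-mass approximation
    return 0.5 * inertia * omega ** 2

# A free swing from 120 degrees arrives at the bottom with kinetic
# energy equal to the potential energy it released.
e_pot = MASS * G * LENGTH * (1 - math.cos(math.radians(120)))
print(round(e_pot, 1))   # 235.4 J available at impact
```

    With an optical disc providing angle-versus-time samples, the central-difference estimate above is the natural way to recover velocity, and hence kinetic energy, as a function of pendulum position.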

  11. Processing IMS data automatically: A case study of the Chelyabinsk bolide

    Science.gov (United States)

    Arrowsmith, S.; Marcillo, O. E.; Blom, P. S.; Whitaker, R. W.; Randall, G. E.

    2013-12-01

    We present automatic algorithms for detection, association, and location of infrasound events using the International Monitoring System (IMS) infrasound network. Each algorithm is based on probabilistic considerations that formally account for uncertainties at both the station and network levels. Our method is applied to two days of data that include infrasound signals from the Chelyabinsk bolide. We summarize the automatic detections, global association and localization of the bolide and discuss steps we are taking to improve the methodology based on these results.
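
    The probabilistic detector in the article is not specified in the abstract; a classic building block for triggering on transient infrasound arrivals, often used as a baseline, is the short-term/long-term average (STA/LTA) energy ratio. The sketch below is that generic technique on synthetic data, not the authors' algorithm; the window lengths are arbitrary.

```python
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    """Short-term / long-term average power ratio: the ratio spikes
    when a transient raises the short window's energy faster than
    the long window's background estimate."""
    power = signal ** 2
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(lta > 0, sta / lta, 0.0)
    return ratio

rng = np.random.default_rng(1)
noise = 0.1 * rng.standard_normal(2000)
noise[1000:1050] += 2.0            # injected transient "arrival"
ratio = sta_lta(noise, n_sta=20, n_lta=400)
print(int(np.argmax(ratio)))       # peak index falls inside the injected transient
```

    A network-level associator then asks whether triggers like this at several IMS stations are consistent with a single source, which is where the probabilistic treatment of station and network uncertainties enters.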

  12. Automatic control of biomass gasifiers using fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Sagues, C. [Universidad de Zaragoza (Spain). Dpto. de Informatica e Ingenieria de Sistemas]; Garcia-Bacaicoa, P.; Serrano, S. [Universidad de Zaragoza (Spain). Dpto. de Ingenieria Quimica y Medio Ambiente]

    2007-03-15

    A fuzzy controller for biomass gasifiers is proposed. Although fuzzy inference systems do not need a model to be tuned, a plant model is also proposed, and it has proved very useful for testing different combinations of membership functions and rules in the proposed fuzzy controller. The global control scheme is shown, including the elements that generate the set points for the process variables automatically. The type of biomass and its moisture content are the only data that need to be introduced by a human operator at the beginning of operation for the controller to work autonomously. The advantages and good performance of the fuzzy controller with automatic set-point generation, compared with controllers using fixed parameters, are demonstrated. (author)
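
A fuzzy rule base of the general kind described can be sketched in a few lines. The membership ranges, rule outputs, and the sign convention (temperature below setpoint means more air) are invented for illustration; this is not the paper's tuned controller.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_air_flow(temp_error):
    """Toy fuzzy controller: map gasifier temperature error (measured minus
    setpoint, in deg C) to an air-flow correction (%) via three rules and a
    weighted average (Sugeno-style defuzzification). Ranges are invented."""
    mu_cold = tri(temp_error, -100, -50, 0)
    mu_ok = tri(temp_error, -50, 0, 50)
    mu_hot = tri(temp_error, 0, 50, 100)
    # Rules: too cold -> more air (+10%), on target -> 0, too hot -> less air (-10%)
    num = mu_cold * 10.0 + mu_ok * 0.0 + mu_hot * (-10.0)
    den = mu_cold + mu_ok + mu_hot
    return num / den if den else 0.0
```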

  13. Automation of plasma-process fulltext bibliography databases. An on-line data-collection, data-mining and data-input system

    International Nuclear Information System (INIS)

    Suzuki, Manabu; Pichl, Lukas; Murakami, Izumi; Kato, Takako; Sasaki, Akira

    2006-01-01

    Searching for relevant data, information retrieval, data extraction and data input are time- and resource-consuming activities in most data centers. Here we develop a Linux system automating the process in case of bibliography, abstract and fulltext databases. The present system is an open-source free-software low-cost solution that connects the target and provider databases in cyberspace through various web publishing formats. The abstract/fulltext relevance assessment is interfaced to external software modules. (author)

  14. B-1 Systems Approach to Training. Volume 3. Appendix B. Bibliography and Data Collection Trips

    Science.gov (United States)

    1975-07-01

    Feasibility of Using an Adaptive Technique in Flight Simulator Training, Ergonomics, 1971, 14, 3, 381-390. Bllli, W.H.B., and Allan, R.M., Pilot’s...Report FAA-ADS-61. January 1966. (AD 653 729). Poe, A.C., and Lyon, V.W., The Effectiveness of the Cycloramic Link Trainer in the U.S. Naval...system. Boeing Company, Seattle, Washington. Report D229-10346-1, Preliminary. Shriver, Edgar L.; Determining training requirements for

  15. Automatic Extraction of Road Markings from Mobile Laser Scanning Data

    Science.gov (United States)

    Ma, H.; Pei, Z.; Wei, Z.; Zhong, R.

    2017-09-01

    Road markings, a critical feature of the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, play an important role in providing guidance and information to moving vehicles. Mobile laser scanning (MLS) systems are an effective way to obtain 3D information about the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point cloud using a neighbourhood elevation consistency method, under the basic assumption that the road surface is smooth: points with small elevation differences from their neighbourhood are considered ground points. The ground points are then partitioned into a set of profiles according to trajectory data. An intensity histogram is generated for each profile to find intensity jumps, with a threshold that varies inversely with laser distance. The separated points are used as seeds for intensity-based region growing, so that complete road markings are obtained. A point-cloud template-matching step refines the road-marking candidates by removing noise clusters with low correlation coefficients. In an experiment with an MLS point set covering about 2 kilometres of a city center, the method provided a promising solution for extracting road markings from MLS data.
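
The seed-and-grow step described above can be illustrated with a toy hysteresis-style region grower over (x, y, intensity) ground points. The two thresholds and the search radius are invented, and the paper's distance-dependent histogram thresholding is not reproduced.

```python
from collections import deque

def grow_markings(points, hi=200, lo=150, radius=0.3):
    """Hysteresis-style region growing: points with intensity >= hi seed
    regions, which grow into neighbours (within `radius` metres) whose
    intensity >= lo. Returns indices of marking points, sorted.
    A toy stand-in for the paper's seeding + region-growing step."""
    marked = set()
    queue = deque(i for i, p in enumerate(points) if p[2] >= hi)
    marked.update(queue)
    while queue:
        i = queue.popleft()
        xi, yi, _ = points[i]
        for j, (xj, yj, inten) in enumerate(points):
            if (j not in marked and inten >= lo
                    and (xj - xi) ** 2 + (yj - yi) ** 2 <= radius ** 2):
                marked.add(j)
                queue.append(j)
    return sorted(marked)
```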

  16. AUTOMATIC EXTRACTION OF ROAD MARKINGS FROM MOBILE LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    H. Ma

    2017-09-01

    Full Text Available Road markings, a critical feature of the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, play an important role in providing guidance and information to moving vehicles. Mobile laser scanning (MLS) systems are an effective way to obtain 3D information about the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point cloud using a neighbourhood elevation consistency method, under the basic assumption that the road surface is smooth: points with small elevation differences from their neighbourhood are considered ground points. The ground points are then partitioned into a set of profiles according to trajectory data. An intensity histogram is generated for each profile to find intensity jumps, with a threshold that varies inversely with laser distance. The separated points are used as seeds for intensity-based region growing, so that complete road markings are obtained. A point-cloud template-matching step refines the road-marking candidates by removing noise clusters with low correlation coefficients. In an experiment with an MLS point set covering about 2 kilometres of a city center, the method provided a promising solution for extracting road markings from MLS data.

  17. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis
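
For event-count data of the kind this checklist targets, the basic statistical quantity is an event rate with an uncertainty. A minimal sketch, not prescribed by the report itself, using the Poisson point estimate and a normal-approximation interval:

```python
import math

def poisson_rate(events, exposure_hours):
    """Point estimate and rough 95% normal-approximation interval for an
    event rate (events per hour) from a count and an exposure time.
    For small counts an exact Poisson interval would be preferable."""
    lam = events / exposure_hours                   # MLE of the rate
    half = 1.96 * math.sqrt(events) / exposure_hours
    return lam, max(lam - half, 0.0), lam + half
```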

  18. Software for the high-throughput collection of SAXS data using an enhanced Blu-Ice/DCS control system

    International Nuclear Information System (INIS)

    Classen, Scott; Rodic, Ivan; Holton, James; Hura, Greg L.; Hammel, Michal; Tainer, John A.

    2010-01-01

    Biological small-angle X-ray scattering (SAXS) provides powerful data complementary to macromolecular crystallography (MX) by defining shape, conformation and assembly in solution. Although SAXS is in principle the highest-throughput technique for structural biology, data collection is limited in practice by current data-collection software. Here, the adaptation of beamline control software, historically developed for MX beamlines, for efficient operation and high-throughput data collection at synchrotron SAXS beamlines is reported. The Blu-Ice GUI and Distributed Control System (DCS) developed in the Macromolecular Crystallography Group at the Stanford Synchrotron Radiation Laboratory have been optimized, extended and enhanced to suit the specific needs of the biological SAXS endstation at the SIBYLS beamline at the Advanced Light Source. The customizations reported here provide a potential route for other SAXS beamlines in need of robust and efficient beamline control software. As a great deal of effort and optimization has gone into crystallographic software, its adaptation and extension may prove to be a general strategy for providing advanced SAXS software to the synchrotron community. In this way, effort can be put into optimizing features for SAXS rather than reproducing those already successfully implemented for the crystallographic community.

  19. The collection and interpretation of domestic accident data. A discussion on some aspects of the British consumer safety system.

    Science.gov (United States)

    Wilson, J R

    1979-06-01

    This article discusses the workings of a Consumer Safety System and identifies the problems of collecting and using domestic accident data. In the light of proposed changes in the civil law on product liability, particular consideration is given to the difficulties of providing evidence of consumer product involvement in domestic accidents. The paper is based on one read at the CBI Conference "Product Liability in Perspective", held at the Hilton Hotel, London, 30-31 March, 1977. The views expressed are those of the Institute for Consumer Ergonomics and are not necessarily those of the Department of Prices and Consumer Protection.

  20. The comparison of milk production and quality in cows from conventional and automatic milking systems

    Directory of Open Access Journals (Sweden)

    Renata Toušová

    2014-12-01

    Full Text Available The objective of this study was to evaluate the effects of two different types of milking systems (conventional parlour vs. automatic milking system) and the season of the year on the composition and hygienic quality of milk from Czech Fleckvieh cows. A total of 500 cows were involved: 200 in the conventional and 300 in the automatic milking system. Bulk milk samples were collected for 12 months, from July 2010 to June 2011. The following milk components and quality indicators were determined: % of fat, % of protein, % of lactose, % of fat-free dry matter (FFDM), % of casein, urea content, somatic cell count (SCC), total germ count (TGC) and milk freezing point (FP). The data were processed and evaluated with MS Excel and the statistical software SAS 9.1. Significantly higher (P<0.05 to P<0.01) contents of fat, protein, FFDM and casein and an increased TGC were observed in the automatic milking system, whereas SCC and FP were significantly lower (P<0.01). The highest contents of fat, protein and casein, and the lowest lactose content, were found in the winter season. The highest contents of FFDM, urea and SCC were observed in autumn, whereas TGC was highest in summer (P<0.05 to P<0.01). Only FP was not influenced by the season.

  1. 34 CFR 303.540 - Data collection.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Data collection. 303.540 Section 303.540 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION AND... DISABILITIES State Administration Reporting Requirements § 303.540 Data collection. (a) Each system must...

  2. Global synthesis and critical evaluation of pharmaceutical data sets collected from river systems.

    Science.gov (United States)

    Hughes, Stephen R; Kay, Paul; Brown, Lee E

    2013-01-15

    Pharmaceuticals have emerged as a major group of environmental contaminants over the past decade but relatively little is known about their occurrence in freshwaters compared to other pollutants. We present a global-scale analysis of the presence of 203 pharmaceuticals across 41 countries and show that contamination is extensive due to widespread consumption and subsequent disposal to rivers. There are clear regional biases in current understanding with little work outside North America, Europe, and China, and no work within Africa. Within individual countries, research is biased around a small number of populated provinces/states and the majority of research effort has focused upon just 14 compounds. Most research has adopted sampling techniques that are unlikely to provide reliable and representative data. This analysis highlights locations where concentrations of antibiotics, cardiovascular drugs, painkillers, contrast media, and antiepileptic drugs have been recorded well above thresholds known to cause toxic effects in aquatic biota. Studies of pharmaceutical occurrence and effects need to be seen as a global research priority due to increasing consumption, particularly among societies with aging populations. Researchers in all fields of environmental management need to work together more effectively to identify high risk compounds, improve the reliability and coverage of future monitoring studies, and develop new mitigation measures.

  3. QuaDoSta - a freely configurable system which facilitates multi-centric data collection for healthcare and medical research

    Directory of Open Access Journals (Sweden)

    Albrecht, Ulrike

    2007-07-01

    Full Text Available This article describes QuaDoSta (quality assurance, documentation and statistics, a flexible documentation system as well as a data collection and networking platform for medical facilities. The user can freely define the required documentation masks which are easily expandable and can be adapted to individual requirements without the need for additional programming. To avoid duplication, data transfer interfaces can be configured flexibly to external sources such as patient management systems used in surgeries or hospital information systems. The projects EvaMed (Evaluation Anthroposophical Medicine and the Network Oncology are two scientific research projects which have been successfully established as nationally active networks on the basis of QuaDoSta. The EvaMed-Network serves as a modern pharmacovigilance project for the documentation of adverse drug events. All prescription data are electronically recorded to assess the relative risk of drugs. The Network Oncology was set up as a documentation system in four hospitals and seven specialist oncology practices where a complete record of all oncological therapies is being carried out to uniform standards on the basis of the ‘basic documentation for tumour patients’ (BDT developed by the German Cancer Society. The QuaDoSta solution system made it possible to cater for the specific requirements of the presented projects. The following features of the system proved to be highly advantageous: flexible setup of catalogues and user friendly customisation and extensions, complete dissociation of system setup and documentation content, multi-centre networkability, and configurable data transfer interfaces.

  4. Adding Data Accessibility and Rule-Based Targeted Data Collection to the California Cancer Reporting System for Breast Cases

    National Research Council Canada - National Science Library

    Gordon, Barry

    1998-01-01

    ...% of national cancer cases 1973-1993 (N 2 million) We also have completed our Clinical Trials Matching System, allowing the general public and healthcare professionals to obtain consumer-oriented summaries of current IRB-approved clinical trials...

  5. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  6. New insights into health financing: First results of the international data collection under the System of Health Accounts 2011 framework.

    Science.gov (United States)

    Mueller, Michael; Morgan, David

    2017-07-01

    International comparisons of health spending and financing are most frequently carried out using datasets of international organisations based on the System of Health Accounts (SHA). This accounting framework has recently been updated and 2016 saw the first international data collection under the new SHA 2011 guidelines. In addition to reaching better comparability of health spending figures and greater country coverage, the updated framework has seen changes in the dimension of health financing leading to important consequences when analysing health financing data. This article presents the first results of health spending and financing data collected under this new framework and highlights the areas where SHA 2011 has become a more useful tool for policy analysis, by complementing data on expenditure of health financing schemes with information about their revenue streams. It describes the major conceptual changes in the scope of health financing and highlights why comprehensive analyses based on SHA 2011 can provide for a more complete description and comparison of health financing across countries, facilitate a more meaningful discussion of fiscal sustainability of health spending by also analysing the revenues of compulsory public schemes and help to clarify the role of governments in financing health care - which is generally much bigger than previously documented. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Data Collection and New Technology

    Directory of Open Access Journals (Sweden)

    Olubunmi Philip Aborisade

    2013-05-01

    Full Text Available The interview has become a popular method of data collection in qualitative research. This article examines the different interview methods for collecting data (e.g., structured interviews, group interviews, unstructured interviews), as well as the various methods for analyzing interview data (e.g., interpretivism, social anthropology, collaborative social research). It also evaluates interview types and analysis methods in qualitative research, and the new technologies for conducting interviews, such as e-mail, telephone, Skype, webcam and Facebook chat, to ascertain how they limit interviewees from giving a full picture of moral and ethical issues.

  8. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Directory of Open Access Journals (Sweden)

    A. Bellakaout

    2016-06-01

    Full Text Available Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data to extract ground, vegetation and buildings is a very important step in numerous applications such as 3D city modelling, extraction of derived data for geographical information systems (GIS), mapping and navigation. Regardless of what the scan data will be used for, an automatic process is required to handle the large amount of data collected, because manual processing is time-consuming and very expensive. This paper presents an approach for the automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height-variation analysis are used to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used, on the one hand, to identify the upper and lower parts of each building in an urban scene, needed to model building façades; and on the other hand, to extract the point cloud of uniform surfaces, which contains the roofs, roads and ground used in the second phase of classification. A second algorithm, also based on topological relationships and height-variation analysis, segments the uniform surfaces into building roofs, roads and ground. The proposed approach has been tested on two areas: a housing complex and a primary school. It led to successful classification of the building, vegetation and road classes.
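
The ground-filtering idea, keeping points whose elevation is consistent with their neighbourhood, can be sketched with a simple grid. The cell size and tolerance are invented values, and this is only a stand-in for the paper's full segmentation.

```python
import math
from collections import defaultdict

def filter_ground(points, cell=1.0, dz=0.2):
    """Grid-based sketch of neighbourhood elevation consistency:
    keep a point as ground if its height is within dz metres of the
    lowest point in its grid cell. cell/dz values are illustrative."""
    lowest = defaultdict(lambda: math.inf)
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        lowest[key] = min(lowest[key], z)
    return [(x, y, z) for x, y, z in points
            if z - lowest[(int(x // cell), int(y // cell))] <= dz]
```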

  9. Automatic 3D Extraction of Buildings, Vegetation and Roads from LIDAR Data

    Science.gov (United States)

    Bellakaout, A.; Cherkaoui, M.; Ettarid, M.; Touzani, A.

    2016-06-01

    Aerial topographic surveys using Light Detection and Ranging (LiDAR) technology collect dense and accurate information from the surface or terrain; LiDAR is becoming one of the important tools in the geosciences for studying objects and the earth's surface. Classification of LiDAR data to extract ground, vegetation and buildings is a very important step in numerous applications such as 3D city modelling, extraction of derived data for geographical information systems (GIS), mapping and navigation. Regardless of what the scan data will be used for, an automatic process is required to handle the large amount of data collected, because manual processing is time-consuming and very expensive. This paper presents an approach for the automatic classification of aerial LiDAR data into five groups of items: buildings, trees, roads, linear objects and soil, using single-return LiDAR and processing the point cloud without generating a DEM. Topological relationships and height-variation analysis are used to preliminarily segment the entire point cloud into upper and lower contours, uniform surfaces, non-uniform surfaces, linear objects, and others. This primary classification is used, on the one hand, to identify the upper and lower parts of each building in an urban scene, needed to model building façades; and on the other hand, to extract the point cloud of uniform surfaces, which contains the roofs, roads and ground used in the second phase of classification. A second algorithm, also based on topological relationships and height-variation analysis, segments the uniform surfaces into building roofs, roads and ground. The proposed approach has been tested on two areas: a housing complex and a primary school. It led to successful classification of the building, vegetation and road classes.

  10. Design of a real-time tax-data monitoring intelligent card system

    Science.gov (United States)

    Gu, Yajun; Bi, Guotang; Chen, Liwei; Wang, Zhiyuan

    2009-07-01

    To address the low efficiency of domestic oil station information management, a real-time tax-data monitoring system has been developed that automatically accesses the tax data of oil pumping machines, realizing real-time automatic data collection, display and storage. The monitoring system uses a non-contact intelligent card or the network to collect data directly; because this data cannot be modified manually, the system closes loopholes and raises the level of automation in tax collection. It performs real-time collection and management of oil station information, detects problems promptly, and achieves automatic management of the entire process, covering oil sales accounting and reporting. It also supports remote queries of an oil station's operating data. The system has broad application prospects and economic value.

  11. Automatic Method for Building Indoor Boundary Models from Dense Point Clouds Collected by Laser Scanners

    Directory of Open Access Journals (Sweden)

    Enrique Valero

    2012-11-01

    Full Text Available In this paper we present a method that automatically yields Boundary Representation Models (B-rep for indoors after processing dense point clouds collected by laser scanners from key locations through an existing facility. Our objective is particularly focused on providing single models which contain the shape, location and relationship of primitive structural elements of inhabited scenarios such as walls, ceilings and floors. We propose a discretization of the space in order to accurately segment the 3D data and generate complete B-rep models of indoors in which faces, edges and vertices are coherently connected. The approach has been tested in real scenarios with data coming from laser scanners yielding promising results. We have deeply evaluated the results by analyzing how reliably these elements can be detected and how accurately they are modeled.

  12. Automatic Method for Building Indoor Boundary Models from Dense Point Clouds Collected by Laser Scanners

    Science.gov (United States)

    Valero, Enrique; Adán, Antonio; Cerrada, Carlos

    2012-01-01

    In this paper we present a method that automatically yields Boundary Representation Models (B-rep) for indoors after processing dense point clouds collected by laser scanners from key locations through an existing facility. Our objective is particularly focused on providing single models which contain the shape, location and relationship of primitive structural elements of inhabited scenarios such as walls, ceilings and floors. We propose a discretization of the space in order to accurately segment the 3D data and generate complete B-rep models of indoors in which faces, edges and vertices are coherently connected. The approach has been tested in real scenarios with data coming from laser scanners yielding promising results. We have deeply evaluated the results by analyzing how reliably these elements can be detected and how accurately they are modeled. PMID:23443369

  13. Measuring Service Reliability Using Automatic Vehicle Location Data

    Directory of Open Access Journals (Sweden)

    Zhenliang Ma

    2014-01-01

    Full Text Available Bus service reliability has become a major concern for both operators and passengers. Buffer time measures are believed to be appropriate to approximate passengers' experienced reliability in the context of departure planning. Two issues with regard to buffer time estimation are addressed, namely, performance disaggregation and capturing passengers’ perspectives on reliability. A Gaussian mixture models based method is applied to disaggregate the performance data. Based on the mixture models distribution, a reliability buffer time (RBT measure is proposed from passengers’ perspective. A set of expected reliability buffer time measures is developed for operators by using different spatial-temporal levels combinations of RBTs. The average and the latest trip duration measures are proposed for passengers that can be used to choose a service mode and determine the departure time. Using empirical data from the automatic vehicle location system in Brisbane, Australia, the existence of mixture service states is verified and the advantage of mixture distribution model in fitting travel time profile is demonstrated. Numerical experiments validate that the proposed reliability measure is capable of quantifying service reliability consistently, while the conventional ones may provide inconsistent results. Potential applications for operators and passengers are also illustrated, including reliability improvement and trip planning.

  14. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    Science.gov (United States)

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  15. Research on Automatic Positioning System of Ultrasonic Testing of Wind Turbine Blade Flaws

    Science.gov (United States)

    Liu, Q. X.; Wang, Z. H.; Long, S. G.; Cai, M.; Cai, M.; Wang, X.; Chen, X. Y.; Bu, J. L.

    2017-11-01

    Ultrasonic testing technology has become essential in the non-destructive testing of wind turbine blades. In practice, however, ultrasonic flaw detection has been employed inefficiently in recent years, because test results show small deviations due to human, environmental and technical factors. There is therefore an urgent demand from engineers for a way to test the various flaws efficiently and quickly. An automatic positioning system has been designed in this paper to record the moving coordinates and the target distance in real time, while launching and acquiring the ultrasonic wave automatically. The system uses the ADNS-3080 optoelectronic chip manufactured by Agilent Technologies Inc. In combination with a power conversion module and a USB transmission module, the collected data can be transmitted to the host computer, where it is processed and controlled through software. An experiment was designed to verify the reliability of the automatic positioning system. By comparing the results collected from LabVIEW with actual plots on a Perspex plane, it is concluded that the system possesses high accuracy and is of significant value in practical engineering.

  16. Automatic Registration and Mosaicking System for Remotely Sensed Imagery

    Directory of Open Access Journals (Sweden)

    Emiliano Castejon

    2006-04-01

    Full Text Available Image registration is an important operation in many remote sensing applications and involves, among other tasks, the identification of corresponding control points in the images. As manual identification of control points may be time-consuming and tiring, several automatic techniques have been developed. This paper describes a system for automatic registration and mosaicking of remote sensing images under development at the National Institute for Space Research (INPE) and at the University of California, Santa Barbara (UCSB). The user can provide information to the system to speed up the registration process and to avoid mismatched control points. Based on a statistical procedure, the system gives an indication of the registration quality, allowing users to stop the processing, modify the registration parameters, or continue. Extensive tests have been performed with different types of data (optical, radar, multi-sensor, high-resolution images and video sequences) to check the system performance. An online demo system, which contains several examples that can be carried out using a web browser, is available on the internet.

  17. Automatic early warning systems for the environment

    Directory of Open Access Journals (Sweden)

    Lesjak Martin

    2003-01-01

    Full Text Available Computerized, continuously monitoring environmental early-warning systems are complex networks that merge measurement with information technology. Accuracy, consistency, reliability and data quality are their most important features. Several effects may degrade these characteristics: a hostile environment, unreliable communications, poor-quality equipment, and unqualified users or service personnel. According to our experience, a number of measures should be taken to enhance system performance and maintain it at the desired level. In this paper we present an analysis of system requirements, possible disturbances and corrective measures, which gives the main directives for the design, construction and operation of environmental early-warning systems. Procedures that ensure data integrity and quality are described. Finally, a contemporary system approach based on a LAN/WAN network topology with Intranet/Internet software is proposed, together with descriptions of two systems already in operation that are built on this networked principle.

  18. Evaluated data collections from ENSDF

    International Nuclear Information System (INIS)

    Ewbank, W.B.

    1979-01-01

    For several years the Nuclear Data Project has been maintaining an Evaluated Nuclear Structure Data File (ENSDF), which is designed to include critically evaluated values for most nuclear spectroscopic quantities. The information in ENSDF is the same as in the Nuclear Data Sheets, which illustrate two particular output formats (drawings and tables). Spectroscopic information for nuclei with A < 45 is put into ENSDF from the evaluations of Ajzenberg-Selove and of Endt and van der Leun. An international network was organized to provide regular revisions of the data file. Computer facilities were developed to retrieve collections of evaluated data for special calculations or detailed examination.

  19. Automatic control system in the reactor Peggy

    International Nuclear Information System (INIS)

    Bertrand, J.; Mourchon, R.; Da Costa, D.; Desandre-Navarre, Ch.

    1967-01-01

    The equipment enables the reactor to attain a given power automatically and to maintain the power around this level. Its operating principle is to change from one power to another, at constant period, by means of a programmer that transforms a power-step request into a voltage variation which is linear with time and which represents the logarithm of the required power. The real power is compared continuously with the required power. Stabilization occurs automatically as soon as the difference between the reactor power and the required power diminishes to a few per cent. (authors) [fr
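
    The constant-period ramp described above can be sketched numerically: the setpoint's logarithm grows linearly in time until it lies within a few per cent of the requested power. This is a minimal illustration; the function name, the 2% stabilization band and the restriction to upward power steps are assumptions, not details from the paper.

    ```python
    import math

    def log_power_ramp(p_start, p_target, period, dt=1.0):
        """Power setpoints whose logarithm is linear in time (constant period).

        Stabilization is declared once the setpoint lies within a few per
        cent of the requested power. Assumes an upward power step; the 2%
        band is an illustrative assumption.
        """
        setpoints = [p_start]
        t = 0.0
        while math.log(p_target / setpoints[-1]) > 0.02:
            t += dt
            setpoints.append(p_start * math.exp(t / period))
        return setpoints

    ramp = log_power_ramp(p_start=1.0, p_target=100.0, period=30.0)
    ```

    With a 30 s period, the ramp takes about ln(100) ≈ 4.6 periods of log growth to reach the stabilization band around the requested power.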

  20. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    Science.gov (United States)

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  1. Towards automation of data quality system for CERN CMS experiment

    Science.gov (United States)

    Borisyak, M.; Ratnikov, F.; Derkach, D.; Ustyuzhanin, A.

    2017-10-01

    Daily operation of a large-scale experiment is a challenging task, particularly from the perspective of routine monitoring of the quality of the data being taken. We describe an approach that uses machine learning to automate data-quality monitoring, based on partial use of data qualified manually by detector experts. The system automatically classifies marginal cases, both of good and of bad data, and uses human expert decisions to classify the remaining “grey area” cases. This study uses collision data collected by the CMS experiment at the LHC in 2010. We demonstrate that the proposed workflow is able to automatically process at least 20% of samples without noticeable degradation of the result.
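
    The split between automatically classified cases and the "grey area" deferred to experts can be sketched as a simple score-threshold triage; the thresholds and the 0-to-1 quality-score interface are illustrative assumptions, not the experiment's actual model.

    ```python
    def triage(scores, accept_above=0.9, reject_below=0.1):
        """Partition per-sample quality scores into auto-good, auto-bad
        and a 'grey area' deferred to human experts."""
        auto_good = [s for s in scores if s >= accept_above]
        auto_bad = [s for s in scores if s <= reject_below]
        grey = [s for s in scores if reject_below < s < accept_above]
        return auto_good, auto_bad, grey

    good, bad, grey = triage([0.98, 0.95, 0.50, 0.05, 0.70, 0.99, 0.02])
    # 5 of 7 samples are handled automatically; 2 go to the experts
    ```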

  2. User Metrics in NASA Earth Science Data Systems

    Science.gov (United States)

    Lynnes, Chris

    2018-01-01

    This presentation describes the collection and use of user metrics in NASA's Earth Science data systems. A variety of collection methods is discussed, with particular emphasis given to the American Customer Satisfaction Index (ACSI). User sentiment on the potential use of cloud computing is presented, with generally positive responses. The presentation also discusses various forms of automatically collected metrics, including an example of the relative usage of different functions within the Giovanni analysis system.

  3. Computer control in nondestructive testing illustrated by an automatic ultrasonic tube inspection system

    International Nuclear Information System (INIS)

    Gundtoft, H.E.; Nielsen, N.

    1976-06-01

    In Risoe's automatic tube inspection system, data (more than half a million per tube) from ultrasonic dimension measurements and defect inspections are fed into a computer that simultaneously calculates and evaluates the results. (author)

  4. Automatic digital photo-book making system

    Science.gov (United States)

    Wang, Wiley; Teo, Patrick; Muzzolini, Russ

    2010-02-01

    The diversity of photo products has grown more than ever before. A group of photos is not only printed individually, but can also be arranged in a specific order to tell a story, such as in a photo book, a calendar or a poster collage. As with making a traditional scrapbook, digital photo book tools allow the user to choose a book style/theme, page layouts, backgrounds and the way the pictures are arranged. This process is often time-consuming for users, given the number of images and the choices of layout/background combinations. In this paper, we develop a system to automatically generate photo books with only a few initial selections required. The system utilizes time stamps, color indices, orientations and other image properties to best fit pictures into a final photo book. The common way of telling a story is to lay the pictures out in chronological order. If the pictures are proximate in time, they coincide with each other and are often logically related, so the pictures naturally cluster along a time line. Breaks between clusters can be used as a guide to separate pages or spreads; thus, pictures that are logically related stay close together on the same page or spread. When people are making a photo book, it is helpful to start with chronologically grouped images, but time alone won't be enough to complete the process. Each page is limited by the number of layouts available. Many aesthetic rules also apply, such as emphasis of preferred pictures, consistency of local image density throughout the whole book, matching a background to the content of the images, and variety among adjacent page layouts. We developed an algorithm to group images onto pages under the constraints of aesthetic rules. We also apply content analysis based on the color and blurriness of each picture, to match backgrounds and to adjust page layouts. Some of our aesthetic rules are fixed and given by designers. Other aesthetic rules are statistical models trained by using
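
    The chronological clustering described above can be sketched by splitting a time-sorted list of photo timestamps wherever the gap between consecutive shots exceeds a threshold; the 2-hour gap is an illustrative assumption, not a value from the paper.

    ```python
    from datetime import datetime, timedelta

    def cluster_by_time(timestamps, max_gap=timedelta(hours=2)):
        """Group sorted photo timestamps; a break larger than max_gap
        starts a new cluster (a candidate page or spread)."""
        clusters = [[timestamps[0]]]
        for prev, cur in zip(timestamps, timestamps[1:]):
            if cur - prev > max_gap:
                clusters.append([cur])
            else:
                clusters[-1].append(cur)
        return clusters

    shots = [datetime(2010, 2, 1, h, m) for h, m in
             [(10, 0), (10, 5), (10, 20), (15, 0), (15, 10)]]
    pages = cluster_by_time(shots)
    # two clusters: the morning burst and the afternoon burst
    ```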

  5. Discrete Model Reference Adaptive Control System for Automatic Profiling Machine

    Directory of Open Access Journals (Sweden)

    Peng Song

    2012-01-01

    Full Text Available An automatic profiling machine is a movement system that has a high degree of parameter variation and a high frequency of transient processes, and it requires accurate, timely control. In this paper, the discrete model reference adaptive control system of an automatic profiling machine is discussed. Firstly, the model of the automatic profiling machine is presented according to the parameters of the DC motor. Then the design of the discrete model reference adaptive control is proposed, and the control rules are proven. The results of simulation show that the adaptive control system has favorable dynamic performance.

  6. Automatic Multimedia Creation Enriched with Dynamic Conceptual Data

    Directory of Open Access Journals (Sweden)

    Angel Martín

    2012-12-01

    Full Text Available There is a growing gap between multimedia production and context-centric multimedia services. The main problem is the under-exploitation of the content creation design. The idea is to support dynamic content generation adapted to the user or display profile. Our work is an implementation of a web platform for the automatic generation of multimedia presentations based on the SMIL (Synchronized Multimedia Integration Language) standard. The system is able to produce rich media with dynamic multimedia content retrieved automatically from different content databases matching the semantic context. For this purpose, we extend the standard interpretation of SMIL tags in order to accomplish a semantic translation of multimedia objects into database queries. This permits services to benefit from the production process to create customized content enhanced with real-time information fed from databases. The described system has been successfully deployed to create advanced context-centric weather forecasts.

  7. Automatic shading effects on the energetic performance of building systems; Efeito do sombreamento automatico no desempenho de sistemas prediais

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Racine Tadeu Araujo

    1996-12-31

    This thesis develops a theoretical-experimental study dealing with the effects of an automatic shading device on the energetic performance of a dimmable lighting system and a cooling equipment. Some equations related to fenestration optical and thermal properties are rebuilt, while others are derived, under a theoretical approach. In order to collect field data, the energy demand - and other variables - was measured in two distinct stories, with the same fenestration features, of the Test Tower. New data were gathered after adding an automatic shading device to the window of one story. The comparison of the collected data allows the energetic performance evaluation of the shading device. (author) 136 refs., 55 figs., 6 tabs.

  8. Automatic latency equalization in VHDL-implemented complex pipelined systems

    Science.gov (United States)

    Zabołotny, Wojciech M.

    2016-09-01

    In pipelined data processing systems it is very important to ensure that parallel paths delay data by the same number of clock cycles. If that condition is not met, the processing blocks receive data not properly aligned in time and produce incorrect results. Manual equalization of latencies is a tedious and error-prone task. This paper presents an automatic method of latency equalization in systems described in VHDL. The proposed method uses simulation to measure latencies and to verify the introduced correction. The solution is portable between different simulation and synthesis tools. The method does not increase the complexity of the synthesized design compared to a solution based on manual latency adjustment. An example implementation of the proposed methodology, together with a simple design demonstrating its use, is available as an open-source project under the BSD license.
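
    Once per-path latencies have been measured in simulation, the equalization itself reduces to simple arithmetic: pad every parallel path up to the slowest one. A sketch of that bookkeeping step (the dict-based interface is an assumption; the actual tool emits VHDL delay registers):

    ```python
    def equalize_latencies(path_latencies):
        """Extra delay stages (registers) per parallel path so that all
        paths delay data by the same number of clock cycles."""
        target = max(path_latencies.values())
        return {name: target - lat for name, lat in path_latencies.items()}

    delays = equalize_latencies({"fir": 7, "bypass": 1, "cordic": 12})
    # the bypass path needs 11 extra registers, the CORDIC path none
    ```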

  9. Automatic protein structure solution from weak X-ray data

    Science.gov (United States)

    Skubák, Pavol; Pannu, Navraj S.

    2013-11-01

    Determining new protein structures from X-ray diffraction data at low resolution or with a weak anomalous signal is a difficult and often impossible task. Here we propose a multivariate algorithm that simultaneously combines the structure determination steps. In tests on over 140 real data sets from the Protein Data Bank, we show that this combined approach can automatically build models where current algorithms fail, including an anisotropically diffracting 3.88 Å RNA polymerase II data set. The method seamlessly automates the process, is ideal for non-specialists and provides a mathematical framework for successfully combining various sources of information in image processing.

  10. Simple Approaches to Improve the Automatic Inventory of ZEBRA Crossing from Mls Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    City management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling becomes highly relevant when the city has to be managed making use of geospatial databases or Geographic Information Systems. On the other hand, laser scanning technology has experienced significant growth in recent years; in particular, terrestrial mobile laser scanning platforms are used more and more for inventory purposes in both cities and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, these data are not directly exploitable by management systems, constraining the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossings. The algorithm is divided into three main steps: road segmentation (based on a PCA analysis of the points contained in each scan cycle collected by the mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough transform and logical constraints for line classification). After evaluating different datasets collected in three cities located in Northwest Spain (comprising 25 strips with 30 visible zebra crossings), a completeness of 83% was achieved.
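
    The rasterization step (point cloud to grey-level image) can be sketched by binning points into ground-plane cells and averaging their intensities; the 5 cm cell size and the (x, y, intensity) tuple format are illustrative assumptions, not parameters from the paper.

    ```python
    import math

    def rasterize(points, cell=0.05):
        """Average intensity per (x, y) grid cell, turning road points
        into a grey-level raster on which lines can later be detected."""
        sums, counts = {}, {}
        for x, y, intensity in points:
            key = (math.floor(x / cell), math.floor(y / cell))
            sums[key] = sums.get(key, 0.0) + intensity
            counts[key] = counts.get(key, 0) + 1
        return {k: sums[k] / counts[k] for k in sums}

    image = rasterize([(0.01, 0.01, 10.0), (0.02, 0.03, 20.0), (0.31, 0.00, 5.0)])
    ```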

  11. Weather station for scientific data collection

    Digital Repository Service at National Institute of Oceanography (India)

    Desai, R.G.P.; Mehra, P.; Desa, E.; Nagvekar, S.; Kumar, V.

    A state-of-the-art automatic weather station has been developed, primarily to support research programs on oceanographic and climatic studies. The basic system includes a suite of sensors for measurement of wind velocity, air temperature, barometric...

  12. Data Collection Analysis and Test

    Science.gov (United States)

    1980-12-01

    (The record's abstract is an OCR fragment of the report's table of contents; the recoverable headings concern a proposed data CAT system for signatures, a fingerprint data base with its characteristic features and their variations, a proposed data CAT system for fingerprints, a voice data base with its characteristic features, and a note on the occurrences of minutiae in the average rolled fingerprint.)

  13. Matrix sentence intelligibility prediction using an automatic speech recognition system.

    Science.gov (United States)

    Schädler, Marc René; Warzybok, Anna; Hochmuth, Sabine; Kollmeier, Birger

    2015-01-01

    The feasibility of predicting the outcome of the German matrix sentence test for different types of stationary background noise using an automatic speech recognition (ASR) system was studied. Speech reception thresholds (SRT) of 50% intelligibility were predicted in seven noise conditions. The ASR system used Mel-frequency cepstral coefficients as a front-end and employed whole-word Hidden Markov models on the back-end side. The ASR system was trained and tested with noisy matrix sentences on a broad range of signal-to-noise ratios. The ASR-based predictions were compared to data from the literature (Hochmuth et al., 2015) obtained with 10 native German listeners with normal hearing, and to predictions of the speech intelligibility index (SII). The ASR-based predictions showed a high and significant correlation (R² = 0.95) with the measured data, while requiring only the speech and noise signals. Minimal assumptions were made about human speech processing beyond what is already incorporated in a reference-free ordinary ASR system.

  14. Meteorological observatory for Antarctic data collection

    International Nuclear Information System (INIS)

    Grigioni, P.; De Silvestri, L.

    1996-01-01

    In recent years, a great number of automatic weather stations have been installed in Antarctica, with the aim of examining closely the weather and climate of this region and improving the coverage of measuring points on the Antarctic surface. In 1987 the Italian Antarctic Project started to set up a meteorological network in an area not completely covered by other countries. Some of the activities performed by the meteorological observatory are described, concerning technical functions such as maintenance of the AWSs and the execution of radio soundings, as well as scientific purposes such as validation and elaboration of the collected data. Finally, some climatological considerations on the thermal behaviour of the Antarctic troposphere, such as the 'coreless winter', and on the wind field, including katabatic flows in North Victoria Land, are presented.

  15. Automatic removal of outliers in hydrologic time series and quality control of rainfall data: processing a real-time database of the Local System for Flood Monitoring in Klodzko County, Poland

    Science.gov (United States)

    Mizinski, Bartlomiej; Niedzielski, Tomasz; Kryza, Maciej; Szymanowski, Mariusz

    2013-04-01

    Real-time hydrological forecasting requires the highest quality of both hydrologic and meteorological data collected in a given river basin. Large outliers may lead to inaccurate predictions, with substantial departures between observations and prognoses even in the short term. Although we need the correctness of both riverflow and rainfall data, they cannot be processed in the same way to produce a filtered output. Indeed, a hydrologic time series at a given gauge can be interpolated in the time domain after suspicious values have been detected, provided that no outlier has been detected at the upstream sites. In the case of rainfall data, interpolation is not suitable, as we cannot verify potential outliers at a given site against data from other sites, especially in complex terrain, where very local convective events may occur and lead to large rainfall peaks over a limited area. Hence, instead of interpolating the data, we perform a flagging procedure that only ranks outliers according to their likelihood of occurrence. Following the aforementioned assumptions, we have developed a few modules that serve the purpose of a fully automated correction of a database updated in real time every 15 minutes; the main objective of the work was to produce a high-quality database for the purpose of hydrologic rainfall-runoff modeling and ensemble prediction. The database in question is available courtesy of the County Office in Kłodzko (SW Poland), the institution which owns and maintains the Local System for Flood Monitoring in Kłodzko County. The dedicated prediction system, known as HydroProg, is now being built at the University of Wrocław (Poland). Like the entire prediction system, the correction modules work automatically in real time, are developed in the R language, and are plugged in to a larger IT infrastructure. Hydrologic time series, which are water levels recorded every 15 minutes at 22 gauges located in Kłodzko County, are
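
    The flag-don't-interpolate idea for rainfall can be sketched with a robust score: rank each value by its modified z-score (based on the median absolute deviation) and report, never overwrite, the suspicious ones. The 3.5 cutoff is a common convention assumed here, not a value taken from HydroProg (the actual modules are written in R, not Python).

    ```python
    def flag_outliers(series, threshold=3.5):
        """Return (score, index) pairs for suspicious values, ranked by
        likelihood of being an outlier; data are flagged, not modified."""
        s = sorted(series)
        median = s[len(s) // 2]
        mad = sorted(abs(x - median) for x in series)[len(series) // 2]
        if mad == 0:
            return []  # a constant series cannot be scored this way
        flags = [(0.6745 * abs(x - median) / mad, i)
                 for i, x in enumerate(series)]
        return sorted((f for f in flags if f[0] > threshold), reverse=True)

    flags = flag_outliers([1.2, 0.0, 0.8, 1.0, 55.0, 0.9, 1.1, 0.7])
    # only the 55.0 spike (index 4) exceeds the threshold
    ```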

  16. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand's Official Statistics System

    Science.gov (United States)

    Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.

    2013-01-01

    Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231

  17. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    Science.gov (United States)

    Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin

    2017-08-01

    Pavement markings provide an important foundation as they help to keep road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information. Mobile LiDAR systems can directly obtain the three-dimensional (3D) coordinates of an object, providing spatial data and the intensity of 3D objects in a fast and efficient way. The RGB attribute information of data points can be obtained from the panoramic camera in the system. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. The method utilizes a differential grayscale of the RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We utilized point cloud density to remove noise and used morphological operations to eliminate errors. In the application, we tested our method on different sections of roads in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method of this research can be applied to extract pavement markings from the huge point cloud data produced by mobile LiDAR.
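
    The reported correctness (p) and completeness (r) follow the usual precision/recall definitions; a minimal sketch, assuming true positives, false positives and false negatives are counted per extracted marking:

    ```python
    def correctness_completeness(tp, fp, fn):
        """Correctness p = TP / (TP + FP); completeness r = TP / (TP + FN)."""
        return tp / (tp + fp), tp / (tp + fn)

    p, r = correctness_completeness(tp=95, fp=5, fn=8)
    # p = 0.95 and r is about 0.92, both above the 90% level reported above
    ```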

  18. Design of an automatic production monitoring system on job shop manufacturing

    Science.gov (United States)

    Prasetyo, Hoedi; Sugiarto, Yohanes; Rosyidi, Cucuk Nur

    2018-02-01

    Every production process requires a monitoring system, so that the desired efficiency and productivity can be checked at any time. Such a system is also needed in the job shop type of manufacturing, which is mainly influenced by the manufacturing lead time. Processing time is one of the factors that affect the manufacturing lead time. In a conventional company, the recording of processing time is done manually by the operator on a sheet of paper, a method that is prone to errors. This paper aims to overcome this problem by creating a system which is able to record and monitor processing time automatically. The solution is realized by utilizing an electric current sensor, barcodes, RFID, a wireless network and a Windows-based application. An automatic monitoring device is attached to the production machine and is equipped with a touch-screen LCD so that the operator can use it easily. Operator identity is recorded through RFID embedded in the operator's ID card. The workpiece data are retrieved from the database by scanning the barcode listed on its monitoring sheet. A sensor is mounted on the machine to measure the actual machining time. The system's outputs are the actual processing time and machine capacity information. The system is connected wirelessly to a workshop planning application belonging to the firm. Test results indicated that all functions of the system run properly. The system successfully enables supervisors, PPIC staff, or higher-level management to monitor processing times quickly and with better accuracy.

  19. Automatic Control System Switching Roadway Lighting

    OpenAIRE

    Agus Trimuji Susilo; Lingga Hermanto Drs. MM

    2002-01-01

    Full Text Available Inattention by the officers responsible for street lighting means that the lights along the city's main roads are often not switched at the right moment. When it is already dark, the lamps may not yet be lit, which can endanger road users; conversely, when it is already light, the lamps are switched off late, so a great deal of electricity is wasted. Given the problems above, an automatic switching system is required that can control all of the existing l...

  20. Monitoring, analysis and classification of vegetation and soil data collected by a small and lightweight hyperspectral imaging system

    Science.gov (United States)

    Mönnig, Carsten

    2014-05-01

    The increasing precision of modern farming systems requires near-real-time monitoring of agricultural crops in order to estimate soil condition, plant health and potential crop yield. For large agricultural plots, satellite imagery or aerial surveys can be used, at considerable cost and with possible time delays of days or even weeks. For small to medium-sized plots, however, these monitoring approaches are cost-prohibitive and difficult to access. Therefore, within the INTERREG IV A project SMART INSPECTORS (Smart Aerial Test Rigs with Infrared Spectrometers and Radar), we propose a cost-effective, comparably simple approach to support farmers with a small and lightweight hyperspectral imaging system that collects remotely sensed data in spectral bands between 400 and 1700 nm. SMART INSPECTORS covers the whole processing chain of small-scale remote sensing, from sensor construction and data processing to ground truthing for analysis of the results. The sensors are mounted on a remotely controlled (RC) octocopter, a fixed-wing RC airplane, as well as on a two-seated autogyro for larger plots. The high-resolution images, down to 5 cm on the ground, include spectra of visible light, near and thermal infrared, as well as hyperspectral imagery. The data will be analyzed using remote sensing software and a Geographic Information System (GIS). The soil condition analysis includes soil humidity, temperature and roughness. Furthermore, a radar sensor is envisaged for geomorphological, drainage and soil-plant roughness investigations. Plant health control includes drought stress, vegetation health, pest control, growth condition and canopy temperature. Different vegetation and soil indices will help to determine and understand soil conditions and plant traits. Additional investigations might include crop yield estimation for certain crops like apples, strawberries, pasture land, etc. The quality of remotely sensed vegetation data will be tested with

  1. Roadway weather information system and automatic vehicle location (AVL) coordination.

    Science.gov (United States)

    2011-02-28

    Roadway Weather Information System and Automatic Vehicle Location Coordination involves the : development of an Inclement Weather Console that provides a new capability for the state of Oklahoma : to monitor weather-related roadway conditions. The go...

  2. NASA Scientific Data Purchase Project: From Collection to User

    Science.gov (United States)

    Nicholson, Lamar; Policelli, Fritz; Fletcher, Rose

    2002-01-01

    NASA's Scientific Data Purchase (SDP) project is currently a $70 million operation managed by the Earth Science Applications Directorate at Stennis Space Center. The SDP project was developed in 1997 to purchase scientific data from commercial sources for distribution to NASA Earth science researchers. Our current data holdings include 8 TB of remote sensing imagery consisting of 18 products from 4 companies. Our anticipated data volume is 60 TB by 2004, and we will be receiving new data products from several additional companies. Our current system capacity is 24 TB, expandable to 89 TB. Operations include tasking of new data collections, archive ordering, shipment verification, data validation, distribution, metrics, finances, customer feedback, and technical support. The program has been included in the Stennis Space Center Commercial Remote Sensing ISO 9001 registration since its inception. Our operational system includes automatic quality control checks on data received (with MatLab analysis); internally developed, custom Web-based interfaces that tie into commercial-off-the-shelf software; and an integrated relational database that links and tracks all data through operations. We've distributed nearly 1500 datasets, and almost 18,000 data files have been downloaded from our public web site; on a 10-point scale, our customer satisfaction index is 8.32 at a 23% response level. More information about the SDP is available on our Web site.

  3. The Diagnostic System of A – 604 Automatic Transmission

    Directory of Open Access Journals (Sweden)

    Czaban Jaroslaw

    2014-09-01

    Full Text Available The automatic gearbox is gaining popularity in Europe. The limited interest in diagnosing this type of transmission in Poland results from its small share of the market of operated cars, so special diagnostic devices are not readily available. These factors lead to expensive repairs, often involving the replacement of a subassembly with a new or aftermarket one. Only to a small extent are prophylactic diagnostic tests conducted that could eliminate future gearbox failures. In this paper, a proposed diagnostic system for the popular A-604 gearbox is presented. The authors are exploring the possibility of using this type of device for the functional evaluation of gearboxes after overhaul. The system built drives the object under test, connected to a simulated load, while a special controller, replacing the original one, is responsible for controlling gearbox operation. In this way the state of the mechanical and hydraulic parts is evaluated. Analysis of the signal traces registered during measurements allows conclusions about the correctness of operation, whereas comparison with stock data verifies the technical state of the automatic gearbox.

  4. On the use of resubmissions in automatic assessment systems

    Science.gov (United States)

    Karavirta, Ville; Korhonen, Ari; Malmi, Lauri

    2006-09-01

    Automatic assessment systems generally support immediate grading of and response to learners' submissions. They also allow learners to consider the feedback, revise, and resubmit their solutions. Several strategies exist to implement the resubmission policy. The ultimate goal, however, is to improve the learning outcomes, and thus the strategies should aim at preventing learners from using the resubmission feature irresponsibly. One of the key questions here is how to develop the system and its use in order to cut down reiteration that does not seem to be worthwhile. In this paper, we study data gathered from an automatic assessment system that supports resubmissions. We use a clustering technique to draw a distinction among learner groups that differ in their use of the resubmission feature and in the points achieved from the exercises. By comparing these groups with each other, we conclude that for a small minority of learners there is a risk of using resubmission inefficiently: some learners seem to resubmit their solution without thinking much between two consecutive submissions. In order to prevent such an aimless trial-and-error problem-solving method, one option is to limit the number of allowed resubmissions. However, not all resubmissions are bad, and there exist several ways of implementing the limitations so as to achieve a resubmission policy that best fits all students. These are discussed based on the evidence gathered during the research.

  5. Sodium component reliability data collection at CREDO

    International Nuclear Information System (INIS)

    Bott, T.F.; Haas, P.M.; Manning, J.J.

    1979-01-01

    The Centralized Reliability Data Organization (CREDO) has been established at Oak Ridge National Laboratory (ORNL) by the Department of Energy to provide a national center for collection, evaluation and dissemination of reliability data for advanced reactors. While the system is being developed and continuous data collection at the two U.S. reactor sites (EBR-II and FFTF) is being established, data on advanced reactor components which have been in use at U.S. test loops and experimental reactors have been collected and analyzed. Engineering, operating and event data on sodium valves, pumps, flow meters, rupture discs, heat exchangers and cold traps have been collected from more than a dozen sites. The results of analyses of the data performed to date are presented

  6. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    Pichara, Karim; Protopapas, Pavlos

    2013-01-01

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same
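
    The estimate-and-impute loop mentioned above (an EM-style algorithm over incomplete data) can be illustrated on a toy problem. This sketch uses a single linear dependency between two features rather than the paper's Bayesian network; the data are synthetic and the dependency structure is invented for the example.

```python
import random
random.seed(1)

# Synthetic data: feature y depends linearly on feature x; 30% of the
# x entries are then hidden (None) to simulate a catalog with missing data.
truth, rows = [], []
for _ in range(300):
    x = random.gauss(0.0, 1.0)
    y = 2.0 * x + random.gauss(0.0, 0.5)
    truth.append(x)
    rows.append([None if random.random() < 0.3 else x, y])

def em_impute(rows, iters=20):
    """Alternate parameter estimation (M-step) with conditional-expectation
    imputation of the missing entries (E-step)."""
    xs_obs = [r[0] for r in rows if r[0] is not None]
    fill = sum(xs_obs) / len(xs_obs)          # crude initial imputation
    data = [[r[0] if r[0] is not None else fill, r[1]] for r in rows]
    n = len(data)
    for _ in range(iters):
        mx = sum(r[0] for r in data) / n
        my = sum(r[1] for r in data) / n
        vy = sum((r[1] - my) ** 2 for r in data) / n
        cxy = sum((r[0] - mx) * (r[1] - my) for r in data) / n
        for d, r in zip(data, rows):
            if r[0] is None:                  # E-step: E[x | y] under the current fit
                d[0] = mx + (cxy / vy) * (r[1] - my)
    return data

completed = em_impute(rows)
errs = [abs(d[0] - t) for d, r, t in zip(completed, rows, truth) if r[0] is None]
print(f"imputed {len(errs)} missing values, mean abs error = {sum(errs) / len(errs):.3f}")
```

    Each iteration re-estimates the moments from the completed data and then replaces every missing entry with its conditional expectation given the observed feature, which is the essence of the E- and M-steps.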

  7. Automatic diagnostic methods of nuclear reactor collected signals

    International Nuclear Information System (INIS)

    Lavison, P.

    1978-03-01

    This work is the first phase of an overall study of diagnosis, limited here to the problem of monitoring the operating state; this makes it possible to show what pattern recognition methods contribute at the processing level. The present problem is the recognition of control operations. The analysis of the state of the reactor yields a decision which is compared with the history of the control operations; if there is no correspondence, the state subjected to the analysis is said to be 'abnormal'. The system subjected to the analysis is described and the problem to be solved is defined. We then deal with the Gaussian parametric approach and with methods to evaluate the error probability. Non-parametric methods follow, and an on-line detection scheme has been tested experimentally. Finally, a non-linear transformation has been studied to reduce the error probability previously obtained. All the methods presented have been tested and compared on a quality index: the error probability [fr

  8. ANALYSIS OF SOFTWARE THREATS TO THE AUTOMATIC IDENTIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Marijan Gržan

    2017-01-01

    Full Text Available The Automatic Identification System (AIS) represents an important improvement in the fields of maritime security and vessel tracking. It is used by the signatory countries to the SOLAS Convention and by private and public providers. Its main advantage is that it can be used as an additional navigation aid, especially in avoiding collisions at sea and in search and rescue operations. The present work analyses the functioning of the AIS system and the ways of exchanging data among its users. We also study one of the vulnerabilities of the system that can be abused by malicious users. The threat itself is analysed in detail in order to provide insight into the whole process, from the creation of a program to its implementation.

  9. Characteristics and design improvement of AP1000 automatic depressurization system

    International Nuclear Information System (INIS)

    Jin Fei

    2012-01-01

    The automatic depressurization system, a distinctive feature of the AP1000 design, enhances the plant's capability to mitigate design basis accidents. The advancement of the system is discussed by comparison with traditional PWR designs and by analysis of system functions such as depressurization and venting. System design improvements made during the China project are also described. Finally, suggestions for the system in the China project are listed. (author)

  10. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas.

    Science.gov (United States)

    Chang, Hsien-Tsung; Chang, Yi-Ming; Tsai, Meng-Tze

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning.

  11. Automatic measurement system for light element isotope analysis

    International Nuclear Information System (INIS)

    Satake, Hiroshi; Ikegami, Kouichi.

    1990-01-01

    The automatic measurement system for light element isotope analysis was developed by installing a specially designed inlet system controlled by a computer. The microcomputer system contains specific interface boards for the inlet system and the mass spectrometer, a Micromass 602 E. All the components of the inlet and computer systems installed are easily available in Japan. A maximum of ten samples can be measured automatically. About 160 minutes are required for 10 measurements of δ¹⁸O values of CO₂; thus four samples can be measured per hour using this system, compared with about three per hour by manual operation. The automated analysis system clearly has an advantage over the conventional method. This paper describes the details of this automated system, such as the apparatus used, the control procedure, and the corrections applied for reliable measurement. (author)

  12. Automatic lesion tracking for a PET/CT based computer aided cancer therapy monitoring system

    Science.gov (United States)

    Opfer, Roland; Brenner, Winfried; Carlsen, Ingwer; Renisch, Steffen; Sabczynski, Jörg; Wiemker, Rafael

    2008-03-01

    Response assessment of cancer therapy is a crucial component of a more effective and patient-individualized cancer therapy. Integrated PET/CT systems provide the opportunity to combine morphologic with functional information. However, dealing simultaneously with several PET/CT scans poses a serious workflow problem. It can be a difficult and tedious task to extract response criteria based upon an integrated analysis of PET and CT images and to track these criteria over time. In order to improve the workflow for serial analysis of PET/CT scans we introduce in this paper a fast lesion tracking algorithm. We combine a global multi-resolution rigid registration algorithm with a local block matching and a local region growing algorithm. Whenever the user clicks on a lesion in the baseline PET scan, the course of standardized uptake values (SUV) is automatically identified and shown to the user as a graph plot. We validated our method on data collected from 7 patients. Each patient underwent two or three PET/CT scans during the course of a cancer therapy. An experienced nuclear medicine physician manually measured the courses of the maximum SUVs for altogether 18 lesions. The automatic detection of the corresponding lesions resulted in SUV measurements nearly identical to the manually measured SUVs. Across the 38 measured maximum SUVs derived from manually and automatically detected lesions, we observed a correlation of 0.9994 and an average error of 0.4 SUV units.
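
    The agreement statistics quoted above (a correlation of 0.9994 and an average error of 0.4 SUV units) are a Pearson correlation and a mean absolute error between paired measurements. A minimal sketch with invented manual/automatic SUV pairs:

```python
# Invented measurement pairs (maximum SUV per lesion); the study's actual
# 38 values are not given in the record.
manual    = [2.1, 3.5, 7.8, 4.2,  9.9, 5.5, 6.3, 8.1]
automatic = [2.0, 3.6, 7.9, 4.1, 10.1, 5.4, 6.5, 8.0]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(manual, automatic)
mae = sum(abs(a - m) for a, m in zip(automatic, manual)) / len(manual)
print(f"r = {r:.4f}, mean absolute error = {mae:.2f} SUV")
```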

  13. Data collection and processing for the ACES

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, J.L.; Miller, D.R.

    1981-08-01

    The Annual Cycle Energy System demonstration house furnishes information that is collected, processed, and analyzed on a weekly schedule. The computer codes used for processing and analysis were designed to display collected data; to summarize the mechanical performance of the house for each week; to give a representation of external influences such as temperature, humidity ratio, and wind speed; and to aid in the dissemination of data to other users. Revisions and adjustments have been made to the codes to accommodate improvements made at the demonstration facility. The codes are written in either the FORTRAN IV or PL/I programming language. All programs in the system run on IBM 360 systems.

  14. Methodology of the Norwegian Surveillance System for Healthcare-Associated Infections: the value of a mandatory system, automated data collection, and active postdischarge surveillance.

    Science.gov (United States)

    Løwer, Hege Line; Eriksen, Hanne-Merete; Aavitsland, Preben; Skjeldestad, Finn Egil

    2013-07-01

    Surveillance is a primary component of systems for the prevention of health care-associated infections (HCAI). Feedback to surgeons from these surveillance systems may reduce rates of surgical site infections (SSIs) by approximately 20%. Our objective was to describe the Norwegian Surveillance System for Healthcare-Associated Infections' (NOIS) module for SSI (NOIS-SSI) and to evaluate the completeness of hospital participation, the effectiveness of automated data collection, and the added value of follow-up after hospital discharge during 2005 to 2009. NOIS was introduced by regulation in 2005. Hospital participation is described through adherence to the mandatory requirements and participation in the voluntary aspects of the system. Automated data collection is evaluated through the completeness of reporting of explanatory and administrative variables. The impact of active postdischarge surveillance is assessed through the completeness of follow-up and the proportion of infections detected after hospital discharge. The system has achieved 95% (52/55) hospital participation, with 65% (34/52) of the hospitals submitting more data than the required minimum. The completeness of patient and procedure-related background data is satisfactory, with 23.3% (5,079/21,772) of the records having at least 1 missing value. The completeness of 30-day follow-up of patients is 90.7% (19,747/21,772), and 81% (765/948) of the infections were detected after discharge from hospital. Implementation of a new surveillance system for SSI has been successful, as evaluated through hospital participation, the completeness of reporting of explanatory and administrative variables, and the completeness of postdischarge follow-up. Important success factors are a mandatory system, automated data-harvesting systems in hospitals, and active postdischarge surveillance. Copyright © 2013 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
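
    The proportions quoted above follow directly from the reported counts; a quick recomputation (percentages as reported, before rounding):

```python
# Counts taken from the abstract; labels paraphrase the reported measures.
ratios = {
    "hospital participation": (52, 55),
    "hospitals submitting more than the minimum": (34, 52),
    "records with at least one missing value": (5079, 21772),
    "30-day follow-up completeness": (19747, 21772),
    "infections detected after discharge": (765, 948),
}
for name, (num, den) in ratios.items():
    print(f"{name}: {100 * num / den:.1f}% ({num}/{den})")
```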

  15. Automatic Emergence Detection in Complex Systems

    Directory of Open Access Journals (Sweden)

    Eugene Santos

    2017-01-01

    Full Text Available Complex systems consist of multiple interacting subsystems, whose nonlinear interactions can result in unanticipated (emergent) system events. Extant systems analysis approaches fail to detect such emergent properties, since they analyze each subsystem separately and arrive at decisions typically through linear aggregations of individual analysis results. In this paper, we propose a quantitative definition of emergence for complex systems. We also propose a framework to detect emergent properties given observations of a system's subsystems. This framework, based on a probabilistic graphical model called Bayesian Knowledge Bases (BKBs), learns individual subsystem dynamics from data, probabilistically and structurally fuses said dynamics into a single complex-system dynamics, and detects emergent properties. Fusion is the central element of our approach, accounting for situations when a common variable may have different probabilistic distributions in different subsystems. We evaluate our detection performance against a baseline approach (a Bayesian network ensemble) on synthetic testbeds from UCI datasets. To do so, we also introduce a method to simulate, and a metric to measure, discrepancies that occur with shared/common variables. Experiments demonstrate that our framework outperforms the baseline. In addition, we demonstrate that this framework has uniform polynomial time complexity across all three learning, fusion, and reasoning procedures.

  16. An Intelligent Tool for Activity Data Collection

    Directory of Open Access Journals (Sweden)

    A. M. Jehad Sarkar

    2011-04-01

    Full Text Available Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user’s activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool’s performance in producing reliable datasets.

  17. An intelligent tool for activity data collection.

    Science.gov (United States)

    Sarkar, A M Jehad

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to setup an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collections. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive and alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.

  18. Adaptive intrusion data system

    International Nuclear Information System (INIS)

    Johnson, C.S.

    1976-01-01

    An Adaptive Intrusion Data System (AIDS) was developed to collect data from intrusion alarm sensors as part of an evaluation system to improve sensor performance. AIDS is a unique digital data compression, storage, and formatting system. It also incorporates a capability for video selection and recording for assessment of the sensors monitored by the system. The system is software-reprogrammable to numerous configurations that may be utilized for the collection of environmental, bi-level, analog, and video data. The output of the system is digital tapes formatted for direct data reduction on a CDC 6400 computer, and video tapes containing time-tagged information that can be correlated with the digital data

  19. SABER-School Finance: Data Collection Instrument

    Science.gov (United States)

    King, Elizabeth; Patrinos, Harry; Rogers, Halsey

    2015-01-01

    The aim of the SABER-school finance initiative is to collect, analyze and disseminate comparable data about education finance systems across countries. SABER-school finance assesses education finance systems along six policy goals: (i) ensuring basic conditions for learning; (ii) monitoring learning conditions and outcomes; (iii) overseeing…

  20. 24 CFR 902.60 - Data collection.

    Science.gov (United States)

    2010-04-01

    ... PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.60 Data collection. (a) Fiscal Year reporting period—limitation on changes after PHAS effectiveness. An assessed fiscal year for purposes of the PHAS corresponds... transmission of the data. (c) Financial condition information. Year-end financial information to conduct the...

  1. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with ²³⁸Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  2. Automatic radiation measuring system connected with GPS

    International Nuclear Information System (INIS)

    Tanigaki, Minoru

    2014-01-01

    The most serious nuclear disaster in Japan broke out at the Fukushima Daiichi Nuclear Power Plant as a result of the Great East Japan Earthquake. Prompt and exact mapping of the contamination is of great importance for radiation protection and for environmental restoration. We have developed the radiation survey systems KURAMA and KURAMA-2 for rapid and exact measurement of radiation dose distribution. The system is composed of a mobile radiation monitor and a computer in the office for the storage and visualization of the data. They are connected via the internet and operated for continuous radiation measurement while the monitor is moving. The mobile part consists of a survey meter, an interface to transform the output of the survey meter for the computer, a global positioning system, a computer to process the data for connecting to the network, and a mobile router. Thus the systems are effective for rapid mapping of surface contamination. The operation and the performance of the equipment at the site are presented. (J.P.N.)
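
    The core data-fusion step of such a GPS-linked mobile survey, tagging each dose-rate reading with the nearest-in-time GPS fix, can be sketched in a few lines. This is a hypothetical illustration; all timestamps, positions, and dose rates are invented, and the real KURAMA pipeline is not described at this level of detail in the record.

```python
# Invented sample streams from the two on-board devices.
gps_fixes = [  # (unix time, latitude, longitude)
    (100.0, 37.4215, 141.0330),
    (105.0, 37.4218, 141.0335),
    (110.0, 37.4221, 141.0340),
]
doses = [  # (unix time, dose rate in µSv/h)
    (101.2, 0.31), (104.8, 0.35), (109.9, 0.40),
]

def nearest_fix(t, fixes):
    """Return the GPS fix whose timestamp is closest to t."""
    return min(fixes, key=lambda f: abs(f[0] - t))

# Build the geotagged track sent to the office-side server.
track = [(t, *nearest_fix(t, gps_fixes)[1:], rate) for t, rate in doses]
for t, lat, lon, rate in track:
    print(f"t={t}: ({lat:.4f}, {lon:.4f}) {rate} µSv/h")
```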

  3. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC-lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously
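
    Frequency analysis of cutting forces, the detection principle named above, can be caricatured as follows: regular chip breaking imposes a periodic component on the force signal, which shows up as a dominant spectral peak. Everything here is invented for illustration (sampling rate, the 93.75 Hz component, amplitudes, threshold); the paper's actual criteria are not given in the record.

```python
import math
import random
random.seed(2)

FS = 1000.0  # sampling rate in Hz (assumed)
N = 256      # samples per analysis window

def cutting_force(breaking):
    """Toy cutting-force signal; chip breaking adds a periodic component.

    93.75 Hz falls exactly on a DFT bin of this window, so the peak is clean.
    """
    sig = []
    for i in range(N):
        t = i / FS
        v = 100.0 + random.gauss(0.0, 1.0)  # mean force plus noise
        if breaking:
            v += 8.0 * math.sin(2 * math.pi * 93.75 * t)
        sig.append(v)
    return sig

def dominant_peak(sig):
    """Largest non-DC DFT magnitude, normalised by the window length."""
    n = len(sig)
    mean = sum(sig) / n
    best = 0.0
    for k in range(1, n // 2):
        re = sum((s - mean) * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(sig))
        im = sum((s - mean) * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(sig))
        best = max(best, math.hypot(re, im) / n)
    return best

def breaking_detected(sig, threshold=1.0):
    return dominant_peak(sig) > threshold

print(breaking_detected(cutting_force(True)), breaking_detected(cutting_force(False)))
```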

  4. A Flexible Dynamic System for Automatic Grading of Programming Exercises

    OpenAIRE

    Fonte, Daniela; Cruz, Daniela da; Gançarski, Alda Lopes; Henriques, Pedro Rangel

    2013-01-01

    The research on programs capable to automatically grade source code has been a subject of great interest to many researchers. Automatic Grading Systems (AGS) were born to support programming courses and gained popularity due to their ability to assess, evaluate, grade and manage the students' programming exercises, saving teachers from this manual task. This paper discusses semantic analysis techniques, and how they can be applied to improve the validation and assessment pr...

  5. An automatic system for acidity determination based on sequential injection titration and the monosegmented flow approach.

    Science.gov (United States)

    Kozak, Joanna; Wójtowicz, Marzena; Gawenda, Nadzieja; Kościelniak, Paweł

    2011-06-15

    An automatic sequential injection system, combining monosegmented flow analysis, sequential injection analysis and sequential injection titration is proposed for acidity determination. The system enables controllable sample dilution and generation of standards of required concentration in a monosegmented sequential injection manner, sequential injection titration of the prepared solutions, data collecting, and handling. It has been tested on spectrophotometric determination of acetic, citric and phosphoric acids with sodium hydroxide used as a titrant and phenolphthalein or thymolphthalein (in the case of phosphoric acid determination) as indicators. Accuracy better than |4.4|% (RE) and repeatability better than 2.9% (RSD) have been obtained. It has been applied to the determination of total acidity in vinegars and various soft drinks. The system provides low sample (less than 0.3 mL) consumption. On average, analysis of a sample takes several minutes. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Automatic feed system for ultrasonic machining

    Science.gov (United States)

    Calkins, Noel C.

    1994-01-01

    Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.

  7. A System for Automatically Generating Scheduling Heuristics

    Science.gov (United States)

    Morris, Robert

    1996-01-01

    The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application chosen for applying this method solves the problem of scheduling telescope observations, and is called the Associate Principal Astronomer (APA). The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints expressed as an objective function established by an astronomer-user.
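
    The greedy heuristic search described above can be sketched as follows: each request carries a visibility window (hard constraint) and a priority feeding the objective (soft constraint), and the scheduler repeatedly starts the feasible request with the best heuristic score. The requests, windows, and the specific heuristic are invented for this sketch; they are not the APA's actual rules.

```python
# Invented observation requests:
# (name, earliest start, latest start, duration, priority)
requests = [
    ("M31",    0, 40, 10, 5),
    ("Vega",  10, 30,  5, 9),
    ("M13",    0, 15,  8, 7),
    ("Deneb", 20, 60, 12, 3),
]

def greedy_schedule(requests):
    """Repeatedly start the feasible request with the best heuristic score."""
    plan, t = [], 0
    pending = list(requests)
    while pending:
        # hard constraint: the request must still be able to start in its window
        feasible = [r for r in pending if max(t, r[1]) <= r[2]]
        if not feasible:
            break
        # heuristic: prefer high priority, break ties toward tighter deadlines
        best = max(feasible, key=lambda r: (r[4], -r[2]))
        start = max(t, best[1])
        plan.append((best[0], start, start + best[3]))
        t = start + best[3]
        pending.remove(best)
    return plan

plan = greedy_schedule(requests)
for name, s, e in plan:
    print(f"{name}: start={s}, end={e}")
```

    Being greedy, the scheduler is myopic: here it idles until the highest-priority window opens instead of fitting another observation first, which is exactly the kind of behaviour a better generated heuristic would trade off.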

  8. MAD data collection - current trends

    International Nuclear Information System (INIS)

    Dementieva, I.; Evans, G.; Joachimiak, A.; Sanishvili, R.; Walsh, M. A.

    1999-01-01

    The multi-wavelength anomalous diffraction, or MAD, method of determining protein structure is becoming routine in protein crystallography. An increase in the number of tuneable synchrotron beamlines, coupled with the widespread availability of position-sensitive X-ray detectors based on charge-coupled devices with fast readout, has raised MAD structure determination to a new and exciting level. Ultra-fast MAD data collection is now possible. Recognition of the value of selenium for phasing protein structures and improvement of methods for incorporating selenium into proteins in the form of selenomethionine have attracted greater interest in the MAD method. Recent developments in crystallographic software are complementing the above advances, paving the way for rapid protein structure determination. An overview of a typical MAD experiment is described here, with emphasis on the rates and quality of data acquisition now achievable at beamlines developed at third-generation synchrotron sources

  9. Power amplifier automatic test system based on LXI bus technology

    Science.gov (United States)

    Li, Yushuang; Chen, Libing; Men, Tao; Yang, Qingfeng; Li, Ning; Nie, Tao

    2017-10-01

    The power amplifier is an important part of the high-power digital transceiver module. Because of the great demand for power amplifiers and their diverse measurement indicators, manual testing cannot meet the demand for consistency, so an automatic test system was designed to meet the production requirements. This paper puts forward the plan of an automatic test system based on LXI bus technology and introduces the hardware and software architecture of the system. The test system has been used for debugging and testing power amplifiers stably and efficiently, which greatly saves labor and effectively improves productivity.

  10. Human visual system automatically encodes sequential regularities of discrete events.

    Science.gov (United States)

    Kimura, Motohiro; Schröger, Erich; Czigler, István; Ohira, Hideki

    2010-06-01

    For our adaptive behavior in a dynamically changing environment, an essential task of the brain is to automatically encode sequential regularities inherent in the environment into a memory representation. Recent studies in neuroscience have suggested that sequential regularities embedded in discrete sensory events are automatically encoded into a memory representation at the level of the sensory system. This notion is largely supported by evidence from investigations using auditory mismatch negativity (auditory MMN), an event-related brain potential (ERP) correlate of an automatic memory-mismatch process in the auditory sensory system. However, it is still largely unclear whether or not this notion can be generalized to other sensory modalities. The purpose of the present study was to investigate the contribution of the visual sensory system to the automatic encoding of sequential regularities using visual mismatch negativity (visual MMN), an ERP correlate of an automatic memory-mismatch process in the visual sensory system. To this end, we conducted a sequential analysis of visual MMN in an oddball sequence consisting of infrequent deviant and frequent standard stimuli, and tested whether the underlying memory representation of visual MMN generation contains only a sensory memory trace of standard stimuli (trace-mismatch hypothesis) or whether it also contains sequential regularities extracted from the repetitive standard sequence (regularity-violation hypothesis). The results showed that visual MMN was elicited by first deviant (deviant stimuli following at least one standard stimulus), second deviant (deviant stimuli immediately following first deviant), and first standard (standard stimuli immediately following first deviant), but not by second standard (standard stimuli immediately following first standard). These results are consistent with the regularity-violation hypothesis, suggesting that the visual sensory system automatically encodes sequential

  11. Automatic Management of Parallel and Distributed System Resources

    Science.gov (United States)

    Yan, Jerry; Ngai, Tin Fook; Lundstrom, Stephen F.

    1990-01-01

    Viewgraphs on automatic management of parallel and distributed system resources are presented. Topics covered include: parallel applications; intelligent management of multiprocessing systems; performance evaluation of parallel architecture; dynamic concurrent programs; compiler-directed system approach; lattice gaseous cellular automata; and sparse matrix Cholesky factorization.

  12. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed that is simple and quick for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is the aim. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification applies hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. 
The obtained results of this research on study area verified that automatic 3D
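
    The hierarchical, point-based rule classification at the core of such approaches can be caricatured in a few lines: estimate a local ground level first, then label every point by its height above it. This is only a toy; the grid size, thresholds, and three-class rule set are invented, and a production workflow (e.g. TerraScan macros) uses far richer criteria such as planarity and echo information.

```python
# Invented sample points (x, y, z in metres): a ground patch, two shrub
# returns, and a flat roof.
points = [
    (0.0, 0.0, 10.2), (1.0, 0.0, 10.3), (2.0, 0.0, 10.1),
    (0.5, 0.5, 13.0), (0.6, 0.5, 12.5),
    (5.0, 5.0, 18.0), (5.5, 5.0, 18.1), (6.0, 5.0, 18.0),
]

GRID = 10.0          # cell size (m) for the local ground estimate
GROUND_MAX = 0.5     # height above ground (m) still counted as ground
BUILDING_MIN = 4.0   # height above ground (m) from which we call it building

def classify(pts):
    """Hierarchical rules: estimate ground first, then split by height."""
    # Rule 1: the lowest return in each grid cell approximates the ground level.
    ground = {}
    for x, y, z in pts:
        cell = (int(x // GRID), int(y // GRID))
        ground[cell] = min(ground.get(cell, z), z)
    # Rules 2-4: label every point by its height above the local ground.
    labels = []
    for x, y, z in pts:
        h = z - ground[(int(x // GRID), int(y // GRID))]
        if h < GROUND_MAX:
            labels.append("ground")
        elif h < BUILDING_MIN:
            labels.append("vegetation")
        else:
            labels.append("building")
    return labels

labels = classify(points)
print(labels)
```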

  13. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication, mobile navigation, etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed that is simple and quick for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is the aim. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification applies hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. 
The obtained results of this research on study area verified
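As a rough illustration of the point-based hierarchical classification described above, the following sketch applies a fixed cascade of rules to per-point attributes. The attributes, thresholds and class names are invented for illustration; they are not the actual TerraScan rules used in the study.

```python
# Hypothetical cascade of hierarchical rules for labelling LiDAR points
# (ground / vegetation / building); all thresholds are illustrative.

def classify_point(height_above_ground, planarity, echo_count):
    """Assign a class label by applying rules in a fixed hierarchy."""
    if height_above_ground < 0.2:   # rule 1: point lies near the terrain
        return "ground"
    if echo_count > 1:              # rule 2: multiple returns suggest foliage
        return "vegetation"
    if planarity > 0.8:             # rule 3: elevated planar patch -> roof
        return "building"
    return "unclassified"

points = [
    (0.1, 0.3, 1),   # low point
    (5.0, 0.2, 3),   # elevated, multiple echoes
    (8.0, 0.9, 1),   # elevated, planar, single echo
]
labels = [classify_point(*p) for p in points]
```

In a real workflow each rule would operate on neighbourhood statistics computed from the point cloud rather than precomputed scalars.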

  14. System of acquisition and analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Vaubert, Y.; Birac, A.M.; Saglio, R.

    1982-08-01

    An original system for the acquisition and analysis of ultrasonic data collected during examinations, named STADUS-PRODUS, has been developed by C.E.A. in Saclay. First developed for the needs of in-service inspection of PWR vessels, it is now used for various automatic ultrasonic inspections with different tools.

  15. Data collection architecture for big data - A framework for a research agenda

    NARCIS (Netherlands)

    Hofman, W.J.

    2015-01-01

    As big data is expected to contribute largely to economic growth, scalability of solutions becomes apparent for deployment by organisations. It requires automatic collection and processing of large, heterogeneous data sets of a variety of resources, dealing with various aspects like improving

  16. Approximate Sensory Data Collection: A Survey

    Directory of Open Access Journals (Sweden)

    Siyao Cheng

    2017-03-01

    Full Text Available With the rapid development of the Internet of Things (IoT), wireless sensor networks (WSNs) and related techniques, the amount of sensory data manifests an explosive growth. In some applications of IoT and WSNs, the size of sensory data has already exceeded several petabytes annually, which brings many troubles and challenges for data collection, a primary operation in IoT and WSNs. Since exact data collection is not affordable for many WSN and IoT systems due to the limitations on bandwidth and energy, many approximate data collection algorithms have been proposed in the last decade. This survey reviews the state of the art of approximate data collection algorithms. We classify them into three categories: the model-based ones, the compressive sensing based ones, and the query-driven ones. For each category of algorithms, the advantages and disadvantages are elaborated, some challenges and unsolved problems are pointed out, and the research prospects are forecasted.

  17. Approximate Sensory Data Collection: A Survey.

    Science.gov (United States)

    Cheng, Siyao; Cai, Zhipeng; Li, Jianzhong

    2017-03-10

    With the rapid development of the Internet of Things (IoTs), wireless sensor networks (WSNs) and related techniques, the amount of sensory data manifests an explosive growth. In some applications of IoTs and WSNs, the size of sensory data has already exceeded several petabytes annually, which brings too many troubles and challenges for the data collection, which is a primary operation in IoTs and WSNs. Since the exact data collection is not affordable for many WSN and IoT systems due to the limitations on bandwidth and energy, many approximate data collection algorithms have been proposed in the last decade. This survey reviews the state of the art of approximate data collection algorithms. We classify them into three categories: the model-based ones, the compressive sensing based ones, and the query-driven ones. For each category of algorithms, the advantages and disadvantages are elaborated, some challenges and unsolved problems are pointed out, and the research prospects are forecasted.
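The model-based category surveyed here can be illustrated with a minimal sketch: a sensor transmits a reading only when it deviates from its last report by more than an error bound eps, and the sink reconstructs the series from those reports, so the reconstruction error stays within eps. The function names and data below are invented for illustration.

```python
# Model-based approximate collection sketch: suppress readings close to the
# last reported value; the sink's reconstruction error is bounded by eps.

def collect(readings, eps):
    """Return the (index, value) pairs a sensor would actually transmit."""
    reports, last = [], None
    for i, v in enumerate(readings):
        if last is None or abs(v - last) > eps:
            reports.append((i, v))
            last = v
    return reports

def reconstruct(reports, n):
    """Sink side: fill the gaps with the last reported value."""
    series, last = [], None
    it = iter(reports)
    nxt = next(it, None)
    for i in range(n):
        if nxt is not None and nxt[0] == i:
            last = nxt[1]
            nxt = next(it, None)
        series.append(last)
    return series

readings = [20.0, 20.1, 20.2, 21.0, 21.1, 25.0]   # e.g. temperature samples
reports = collect(readings, eps=0.5)               # 3 of 6 samples transmitted
approx = reconstruct(reports, len(readings))
```

Here half the samples are suppressed while every reconstructed value stays within 0.5 of the truth, which is the bandwidth/accuracy trade-off these algorithms exploit.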

  18. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published to date. The method developed at Aalborg University uses the existing topographic database...... by means of spatial resection. This paper describes in detail the mentioned procedure as it was used and implemented during tests with two data sets from Denmark. Moreover, the results from a test made with a data set from the Czech Republic are added. It brought a different view to this complex...... of problems with respect to a different landscape and quality of input data. Finally, some ideas for improving and generalising the method are suggested....

  19. Formal Specification Based Automatic Test Generation for Embedded Network Systems

    Directory of Open Access Journals (Sweden)

    Eun Hye Choi

    2014-01-01

    Full Text Available Embedded systems have become increasingly connected and communicate with each other, forming large-scale and complicated network systems. To make their design and testing more reliable and robust, this paper proposes a formal specification language called SENS and a SENS-based automatic test generation tool called TGSENS. Our approach is summarized as follows: (1) A user describes requirements of target embedded network systems by logical property-based constraints using SENS. (2) Given SENS specifications, test cases are automatically generated using a SAT-based solver. Filtering mechanisms to select efficient test cases are also available in our tool. (3) In addition, given a testing goal by the user, test sequences are automatically extracted from exhaustive test cases. We’ve implemented our approach and conducted several experiments on practical case studies. Through the experiments, we confirmed the efficiency of our approach in design and test generation of real embedded air-conditioning network systems.
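A toy stand-in for the SAT-based generation step: enumerate boolean assignments over a few input signals and keep those that satisfy a logical property constraint. A real tool such as TGSENS would hand the formula to a SAT solver; exhaustive enumeration here only keeps the sketch self-contained, and the property itself is invented.

```python
# Brute-force sketch of constraint-satisfying test-case generation.
from itertools import product

def generate_tests(variables, constraint):
    """Return every assignment of the variables that satisfies the constraint."""
    tests = []
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if constraint(assignment):
            tests.append(assignment)
    return tests

# Hypothetical property for an air-conditioning controller: heater and cooler
# must never run together, and the fan must run whenever either is on.
prop = lambda a: not (a["heat"] and a["cool"]) and ((a["heat"] or a["cool"]) <= a["fan"])
tests = generate_tests(["heat", "cool", "fan"], prop)
```

Filtering, as mentioned in the abstract, would then prune this exhaustive set down to an efficient subset.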

  20. Temporally rendered automatic cloud extraction (TRACE) system

    Science.gov (United States)

    Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.

    1999-10-01

    Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with manual method is included in this paper.

  1. Automatic gear sorting system based on monocular vision

    Directory of Open Access Journals (Sweden)

    Wenqi Wu

    2015-11-01

    Full Text Available An automatic gear sorting system based on monocular vision is proposed in this paper. A CCD camera fixed on the top of the sorting system is used to obtain images of the gears on the conveyor belt. The gears' features, including number of holes, number of teeth and color, are extracted and used to categorize the gears. Photoelectric sensors are used to locate the gears' position and produce the trigger signals for pneumatic cylinders. The automatic gear sorting is achieved by using pneumatic actuators to push different gears into their corresponding storage boxes. The experimental results verify the validity and reliability of the proposed method and system.
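The categorisation step can be pictured as a lookup from the extracted image features (hole count, tooth count, colour) to a storage box; the feature combinations and box names below are invented, not taken from the paper.

```python
# Hypothetical feature-to-box mapping for the gear sorting stage.
GEAR_CATALOGUE = {
    # (holes, teeth, colour) -> storage box
    (4, 20, "silver"): "box_A",
    (6, 32, "black"):  "box_B",
    (3, 16, "brass"):  "box_C",
}

def sort_gear(holes, teeth, colour):
    """Return the target box for a gear, or 'reject' if unrecognised."""
    return GEAR_CATALOGUE.get((holes, teeth, colour), "reject")
```

On a match the corresponding pneumatic cylinder would be triggered when the photoelectric sensor reports the gear in position; unrecognised gears would pass to a reject bin.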

  2. Signal system data mining

    Science.gov (United States)

    2000-09-01

    Intelligent transportation systems (ITS) include large numbers of traffic sensors that collect enormous quantities of data. The data provided by ITS is necessary for advanced forms of control, however basic forms of control, primarily time-of-day (TO...

  3. Longline Observer Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — LODS, the Hawaii Longline Observer Data System, is a complete suite of tools designed to collect, process, and manage quality fisheries data and information. Guided...

  4. Toward Global Comparability of Sexual Orientation Data in Official Statistics: A Conceptual Framework of Sexual Orientation for Health Data Collection in New Zealand’s Official Statistics System

    Directory of Open Access Journals (Sweden)

    Frank Pega

    2013-01-01

    Full Text Available Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand’s Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens.

  5. Water sample-collection and distribution system

    Science.gov (United States)

    Brooks, R. R.

    1978-01-01

    The collection and distribution system samples water from six designated stations, filters it if desired, and delivers it to various analytical sensors. The system may be controlled by the Water Monitoring Data Acquisition System or operated manually.

  6. Developing an intelligent control system of automatic window motor ...

    Indian Academy of Sciences (India)

    This invention system involves hardware, firmware and software to develop an intelligent control system of automatic window motor with diverse wireless sensor network (WSN) devices for health and environmental monitoring. The parts of this invention are improved by implementing the WSN mote into environmental ...

  7. Full-automatic Special Drill Hydraulic System and PLC Control

    Directory of Open Access Journals (Sweden)

    Tian Xue Jun

    2016-01-01

    Full Text Available A hydraulic-driven, PLC-controlled fully automatic special drill is introduced. The working principles of its hydraulic system and PLC control system are analyzed and designed; this equipment has the advantages of high efficiency, superior quality, low cost, etc.

  8. Automatic generation control of interconnected power system with ...

    African Journals Online (AJOL)

    In this paper, automatic generation control (AGC) of two area interconnected power system having diverse sources of power generation is studied. A two area power system comprises power generations from hydro, thermal and gas sources in area-1 and power generations from hydro and thermal sources in area-2. All the ...

  9. Automatic design of optical systems by digital computer

    Science.gov (United States)

    Casad, T. A.; Schmidt, L. F.

    1967-01-01

    Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.

  10. Automatic Dialogue Scoring for a Second Language Learning System

    Science.gov (United States)

    Huang, Jin-Xia; Lee, Kyung-Soon; Kwon, Oh-Woog; Kim, Young-Kil

    2016-01-01

    This paper presents an automatic dialogue scoring approach for a Dialogue-Based Computer-Assisted Language Learning (DB-CALL) system, which helps users learn language via interactive conversations. The system produces overall feedback according to dialogue scoring to help the learner know which parts should be more focused on. The scoring measures…

  11. Evaluation of the SYSTRAN Automatic Translation System. Report No. 5.

    Science.gov (United States)

    Chaumier, Jacques; And Others

    The Commission of the European Communities has acquired an automatic translation system (SYSTRAN), which has been put into operation on an experimental basis. The system covers translation of English into French and comprises a dictionary for food science and technology containing 25,000 words or inflections and 4,500 expressions. This report…

  12. ClinData Express--a metadata driven clinical research data management system for secondary use of clinical data.

    Science.gov (United States)

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, a standalone software for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in the m-designer. The web interface for data collection and the specific database for data storage are then generated automatically. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease-data collections in Chinese and one form from dbGaP. The system's flexibility gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress.
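The metadata-driven idea can be sketched as follows: from a field-level metadata definition (roughly what the m-designer would produce), generate a data-entry form description and validate a submitted record against it. The field names and validation rules are hypothetical, not ClinData Express's actual schema.

```python
# Hypothetical metadata definition driving form generation and validation.
METADATA = [
    {"name": "patient_id", "type": "string", "required": True},
    {"name": "age",        "type": "int",    "required": True},
    {"name": "diagnosis",  "type": "string", "required": False},
]

def build_form(metadata):
    """Derive the data-entry form's field list from the metadata."""
    return [f"{f['name']} ({f['type']})" for f in metadata]

def validate(record, metadata):
    """Check a submitted record against the same metadata definition."""
    errors = []
    for f in metadata:
        if f["required"] and f["name"] not in record:
            errors.append(f"missing {f['name']}")
        elif f["name"] in record and f["type"] == "int" \
                and not isinstance(record[f["name"]], int):
            errors.append(f"{f['name']} must be int")
    return errors
```

Because both the form and the validator are derived from one metadata source, changing the study design means editing only the metadata, which is the system's central design choice.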

  13. Tracking and data collection of smart munitions

    Science.gov (United States)

    Stufflebeam, Joseph L.; Salvatti, Fred

    1995-05-01

    A VME based real-time control system has been developed for use in the testing of smart munition weapons systems. The testing of advanced multimunition systems requires a platform that has not only a robust and stable servo control loop, but also a data collection platform that is capable of acquiring and tagging a wide range of sensory data. The data collection scheme must be able to handle synchronous, asynchronous, and multiframe rate sensor inputs and be capable of handling changing modes of operation in real-time. To meet these requirements, a DSP platform was utilized for the servo control loops, while programmable hardware logic was utilized to allow deterministic strobing of the time and pointing information. Discussions of the imaging requirements for this application, and limitations and uncertainties involved with optical tracking measurements are presented.

  14. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of automatic operation after a reactor scram event is to bring the reactor to the hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to operate the necessary control equipment to bring the reactor to the hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As the result, it was shown that this system had sufficient performance to bring a reactor to the hot stand-by condition quickly and safely. (author)
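The post-scram sequence can be pictured as a simple state machine that verifies the safety actions, drives the plant toward hot stand-by and then energizes decay heat removal, halting for operator intervention if a step fails. The step names and checks are purely illustrative, not the system's actual logic.

```python
# Toy state machine mirroring the post-scram sequence described above.
SEQUENCE = [
    "verify_safety_actions",
    "approach_hot_standby",
    "energize_decay_heat_removal",
]

def run_post_scram(checks):
    """checks: dict step -> bool; stop at the first failing step.

    Returns (completed steps, failed step or None)."""
    completed = []
    for step in SEQUENCE:
        if not checks.get(step, False):
            return completed, step          # halted: operator intervention
        completed.append(step)
    return completed, None                  # hot stand-by reached
```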

  15. A semi-automatic parachute separation system for balloon payloads

    Science.gov (United States)

    Farman, M.

    survey was carried out to choose a suitable tilt sensor and three prototype systems were built for evaluation. These were installed in standard NSBF terminate units, and flown on routine operational flights throughout 2001 with the automatic pyrotechnic cutter active but off-line. A data logger was also installed to record system parameters during the descent phase. The results of these flights validated the system concept and it was found that the telemetry threshold monitor was also an asset to the operator in deciding when it was safe to send a manual parachute release command. However, the accumulated test experience indicated that the originally-chosen tilt sensor, which uses a liquid electrolyte and requires an in-flight microprocessor, was not sufficiently rugged or reliable. A solid-state accelerometer, with encapsulated analog signal processing, was therefore selected as a replacement and the threshold electronics redesigned to match this sensor. This system is currently being evaluated on NSBF operational flights during 2002. On completion of this phase, NASA will review the results and a decision will be made whether to use this design as the primary operational system on future flights. This paper discusses the requirements for such a system and describes the current design in detail. It reports on the evaluation flights of 2001 and 2002 and their results to date.

  16. Collecting data in real time with postcards

    DEFF Research Database (Denmark)

    Yee, Kwang Chien; Kanstrup, Anne Marie; Bertelsen, Pernille

    2013-01-01

    Systems. These methods often involve cross-sectional, retrospective data collection. This paper describes the postcard method for prospective real-time data collection, both in paper format and electronic format. This paper then describes the results obtained using postcard techniques in Denmark...... and Australia. The benefits of this technique are illustrated. There are limitations in using postcard techniques and this paper provides a detailed discussion of these limitations. Postcard techniques provide unique advantages in understanding the real-time healthcare context and are an important technique....

  17. Automatic Web Data Extraction Based on Genetic Algorithms and Regular Expressions

    Science.gov (United States)

    Barrero, David F.; Camacho, David; R-Moreno, María D.

    Data Extraction from the World Wide Web is a well known, unsolved, and critical problem when complex information systems are designed. These problems are related to the extraction, management and reuse of the huge amount of Web data available. These data usually have high heterogeneity, volatility and low quality (i.e. format and content mistakes), so it is quite hard to build reliable systems. This chapter proposes an Evolutionary Computation approach to the problem of automatically learning software entities based on Genetic Algorithms and regular expressions. These entities, also called wrappers, will be able to extract some kinds of Web data structures from examples.

  18. Automatic sociophonetics: Exploring corpora with a forensic accent recognition system.

    Science.gov (United States)

    Brown, Georgina; Wormald, Jessica

    2017-07-01

    This paper demonstrates how the Y-ACCDIST system, the York ACCDIST-based automatic accent recognition system [Brown (2015). Proceedings of the International Congress of Phonetic Sciences, Glasgow, UK], can be used to inspect sociophonetic corpora as a preliminary "screening" tool. Although Y-ACCDIST's intended application is to assist with forensic casework, the system can also be exploited in sociophonetic research to begin unpacking variation. Using a subset of the PEBL (Panjabi-English in Bradford and Leicester) corpus, the outputs of Y-ACCDIST are explored, which, it is argued, efficiently and objectively assess speaker similarities across different linguistic varieties. The ways these outputs corroborate with a phonetic analysis of the data are also discovered. First, Y-ACCDIST is used to classify speakers from the corpus based on language background and region. A Y-ACCDIST cluster analysis is then implemented, which groups speakers in ways consistent with more localised networks, providing a means of identifying potential communities of practice. Additionally, the results of a Y-ACCDIST feature selection task that indicates which specific phonemes are most valuable in distinguishing between speaker groups are presented. How Y-ACCDIST outputs can be used to reinforce more traditional sociophonetic analyses and support qualitative interpretations of the data is demonstrated.

  19. A versatile Czochralski crystal growth system with automatic diameter control

    Science.gov (United States)

    Aggarwal, M. D.; Metzl, R.; Wang, W. S.; Choi, J.

    1995-07-01

    A versatile Czochralski crystal pulling system with automatic diameter control for the growth of nonlinear optical oxide crystals is discussed. Pure and doped bulk single crystals of bismuth silicon oxide (Bi12SiO20) have been successfully grown using this system. The system consists of a regular Czochralski type pulling system with provision for continuous weighing of the growing crystal to provide feedback for power control.
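The weight-feedback loop can be reduced to a toy proportional controller: compare the measured weight gain per time step with the gain expected at the target diameter and trim heater power accordingly. The linear model and constants below are invented; a real automatic-diameter-control loop is considerably more involved.

```python
# Toy proportional controller for Czochralski diameter control via crystal
# weighing. Units are arbitrary; constants are illustrative assumptions.
TARGET_GAIN = 1.0   # expected weight gain per step at the target diameter
KP = 0.5            # proportional gain

def heater_correction(measured_gain):
    """Positive output raises heater power, which narrows the crystal.

    A crystal growing too fat gains weight faster than TARGET_GAIN, so the
    controller responds by heating the melt; too thin, and it cools it.
    """
    return KP * (measured_gain - TARGET_GAIN)
```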

  20. Automated system for data acquisition and monitoring

    Directory of Open Access Journals (Sweden)

    Borza Sorin

    2017-01-01

    Full Text Available Environmental management has become, with the development of human society, a very important issue. Multiple systems have been built to monitor the environment automatically. In this paper we propose a system that integrates GIS software and data acquisition software. In addition, the proposed system implements the AHP multicriteria method, which can provide an online answer on each pollutant's influence within the limited geographical area being monitored. Pollutant factors of the limited geographical area are acquired automatically by specific sensors through an acquisition board and LabVIEW software, with a virtual instrument transferring them into an Access database. From the Access database they are taken up by Geomedia Professional software and processed using the AHP multicriteria method, so that at any moment their influence on the environment can be plotted on the screen of the monitoring system and these influences classified. The system allows the automatic collection of data, their storage and the generation of GIS elements. The research presented in this paper was aimed at implementing multicriteria methods in GIS software.
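The AHP step can be sketched with the standard geometric-mean approximation of the principal eigenvector: priority weights for the pollutants are derived from a pairwise-comparison matrix. The 3x3 matrix below is an invented example, not data from the paper.

```python
# Geometric-mean approximation of the AHP priority vector.
from math import prod

def ahp_weights(matrix):
    """Return normalised priority weights for a pairwise-comparison matrix."""
    n = len(matrix)
    gmeans = [prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]

pairwise = [
    [1.0, 3.0, 5.0],   # pollutant A compared with A, B, C
    [1/3, 1.0, 2.0],   # pollutant B
    [1/5, 1/2, 1.0],   # pollutant C
]
weights = ahp_weights(pairwise)   # A dominates, then B, then C
```

A full AHP implementation would also compute the consistency ratio of the matrix before trusting the weights.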

  1. Crackscope : automatic pavement cracking inspection system.

    Science.gov (United States)

    2008-08-01

    The CrackScope system is an automated pavement crack rating system consisting of a : digital line scan camera, laser-line illuminator, and proprietary crack detection and classification : software. CrackScope is able to perform real-time pavement ins...

  2. Diagnosis - Using automatic test equipment and artificial intelligence expert systems

    Science.gov (United States)

    Ramsey, J. E., Jr.

    Three expert systems (ATEOPS, ATEFEXPERS, and ATEFATLAS), which were created to direct automatic test equipment (ATE), are reviewed. The purpose of the project was to develop an expert system to troubleshoot the converter-programmer power supply card for the F-15 aircraft and have that expert system direct the automatic test equipment. Each expert system uses a different knowledge base or inference engine, basing the testing on the circuit schematic, test requirements document, or ATLAS code. Implementing generalized modules allows the expert systems to be used for any different unit under test. Using converted ATLAS to LISP code allows the expert system to direct any ATE using ATLAS. The constraint propagated frame system allows for the expansion of control by creating the ATLAS code, checking the code for good software engineering techniques, directing the ATE, and changing the test sequence as needed (planning).

  3. THEMIS Data and Software Systems

    Science.gov (United States)

    Goethel, C.; Angelopoulos, V.

    2009-12-01

    THEMIS consists of five spacecraft and 31 ground observatories, including 10 education and public outreach sites. The spacecraft carry a comprehensive suite of particle and field instruments providing measurements with different sampling rates and modes, including survey and burst collection. The distributed array of ground-based observatories, equipped with 21 all-sky imagers and 31 ground magnetometers, provides continuous monitoring of aurora and magnetic field variations from Alaska to Greenland. Data are automatically processed within hours of receipt, stored in daily Common Data Format (CDF) files, plotted and distributed along with corresponding calibration files via a central site at SSL/UCB and several mirror sites worldwide. THEMIS software is an open-source, platform-independent, IDL-based library of utilities. The system enables downloads of calibrated (L2) or raw (L1) data, data analysis, ingestion of data from other missions and ground stations, and production of publication-quality plots. The combination of a user-friendly graphical user interface and a command-line interface supports a wide range of users. In addition, IDL scripts (crib sheets) are provided for manipulation of THEMIS and ancillary data sets. The system design philosophy will be described along with examples to demonstrate the software capabilities in streamlining data/software distribution and exchange, thereby further enhancing science productivity.

  4. System for automatic detection of lung nodules exhibiting growth

    Science.gov (United States)

    Novak, Carol L.; Shen, Hong; Odry, Benjamin L.; Ko, Jane P.; Naidich, David P.

    2004-05-01

    Lung nodules that exhibit growth over time are considered highly suspicious for malignancy. We present a completely automated system for detection of growing lung nodules, using initial and follow-up multi-slice CT studies. The system begins with automatic detection of lung nodules in the later CT study, generating a preliminary list of candidate nodules. Next an automatic system for registering locations in two studies matches each candidate in the later study to its corresponding position in the earlier study. Then a method for automatic segmentation of lung nodules is applied to each candidate and its matching location, and the computed volumes are compared. The output of the system is a list of nodule candidates that are new or have exhibited volumetric growth since the previous scan. In a preliminary test of 10 patients examined by two radiologists, the automatic system identified 18 candidates as growing nodules. 7 (39%) of these corresponded to validated nodules or other focal abnormalities that exhibited growth. 4 of the 7 true detections had not been identified by either of the radiologists during their initial examinations of the studies. This technique represents a powerful method of surveillance that may reduce the probability of missing subtle or early malignant disease.
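The final comparison stage of such a pipeline can be sketched as follows: given segmented volumes for each matched candidate in the earlier and later studies, flag candidates that are new or whose volume grew beyond a threshold. The threshold value and candidate data are invented for illustration.

```python
# Volume-comparison sketch for growth detection in matched nodule candidates.
GROWTH_THRESHOLD = 1.25   # flag if later volume exceeds 125% of earlier volume

def flag_candidates(matched):
    """matched: list of (name, earlier_volume_mm3 or None, later_volume_mm3)."""
    flagged = []
    for name, earlier, later in matched:
        if earlier is None:                       # no counterpart: new nodule
            flagged.append(name)
        elif later > GROWTH_THRESHOLD * earlier:  # grew beyond threshold
            flagged.append(name)
    return flagged

candidates = [("n1", 100.0, 140.0), ("n2", 100.0, 105.0), ("n3", None, 60.0)]
flagged = flag_candidates(candidates)   # n1 grew, n3 is new; n2 is stable
```

The hard parts of the actual system, detection, registration and segmentation, all happen before this step; the comparison itself is deliberately simple.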

  5. Automatic aeroponic irrigation system based on Arduino’s platform

    Science.gov (United States)

    Montoya, A. P.; Obando, F. A.; Morales, J. G.; Vargas, G.

    2017-06-01

    Recirculating hydroponic culture techniques, such as aeroponics, have several advantages over traditional agriculture, aimed at improving the efficiency and environmental impact of agriculture. These techniques require continuous monitoring and automation for proper operation. In this work an automatically monitored aeroponic irrigation system based on the Arduino free software platform was developed. Analog and digital sensors for measuring the temperature, flow and level of a nutrient solution in a real greenhouse were implemented. In addition, the pH and electric conductivity of the nutrient solutions are monitored using the Arduino differential configuration. The sensor network and the acquisition and automation system are managed by two Arduino modules in master-slave configuration, which communicate with each other wirelessly via Wi-Fi. Further, data are stored in micro SD memories and the information is loaded onto a web page in real time. The developed device provides important agronomic information when tested with an arugula culture (Eruca sativa Mill). The system could also be employed as an early-warning system to prevent irrigation malfunctions.
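The control logic such a system might run can be sketched as a per-cycle threshold check, keeping the nutrient-solution level and pH inside set bands and raising an alarm otherwise. All set-points below are invented; the paper does not give the actual thresholds.

```python
# Hypothetical per-cycle control step for a monitored aeroponic system.
LEVEL_MIN, LEVEL_MAX = 20.0, 80.0   # tank level, percent (assumed band)
PH_MIN, PH_MAX = 5.5, 6.5           # assumed pH band for the nutrient solution

def control_step(level, ph):
    """Return the list of actions to take for one sensor reading."""
    actions = []
    if level < LEVEL_MIN:
        actions.append("open_refill_valve")
    elif level > LEVEL_MAX:
        actions.append("close_refill_valve")
    if not (PH_MIN <= ph <= PH_MAX):
        actions.append("alarm_ph")       # early warning, as the abstract suggests
    return actions
```

In the described master-slave setup, the slave would take readings and the master would run logic like this and publish the result to the web page.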

  6. Microcontroller based automatic liquid poison addition control system

    International Nuclear Information System (INIS)

    Kapatral, R.S.; Ananthakrishnan, T.S.; Pansare, M.G.

    1989-01-01

    Microcontrollers are finding increasing applications in instrumentation where complex digital circuits can be substituted by a compact and simple circuit, thus enhancing the reliability. In addition to this, intelligence and flexibility can be incorporated. For applications not requiring a large amount of read/write memory (RAM), microcontrollers are ideally suited since they contain programmable memory (EPROM), parallel input/output lines, data memory, programmable timers and serial interface ports in one chip. This paper describes the design of an automatic liquid poison addition control system (ALPAS) using Intel's 8-bit microcontroller 8751, which is used to generate complex timing control sequence signals for liquid poison addition to the moderator in a nuclear reactor. ALPAS monitors digital inputs coming from the protection system and regulating system of a nuclear reactor and provides control signals for liquid poison addition for long-term safe shutdown of the reactor after a reactor trip, and helps the regulating system to reduce the power of the reactor during operation. Special hardware and software features have been incorporated to improve performance and fault detection. (author)

  7. Vital Recorder-a free research tool for automatic recording of high-resolution time-synchronised physiological data from multiple anaesthesia devices.

    Science.gov (United States)

    Lee, Hyung-Chul; Jung, Chul-Woo

    2018-01-24

    The current anaesthesia information management system (AIMS) has limited capability for the acquisition of high-quality vital signs data. We have developed a Vital Recorder program to overcome the disadvantages of AIMS and to support research. Physiological data of surgical patients were collected from 10 operating rooms using the Vital Recorder. The basic equipment used were a patient monitor, the anaesthesia machine, and the bispectral index (BIS) monitor. Infusion pumps, cardiac output monitors, regional oximeter, and rapid infusion device were added as required. The automatic recording option was used exclusively and the status of recording was frequently checked through web monitoring. Automatic recording was successful in 98.5% (4,272/4,335) cases during eight months of operation. The total recorded time was 13,489 h (3.2 ± 1.9 h/case). The Vital Recorder's automatic recording and remote monitoring capabilities enabled us to record physiological big data with minimal effort. The Vital Recorder also provided time-synchronised data captured from a variety of devices to facilitate an integrated analysis of vital signs data. The free distribution of the Vital Recorder is expected to improve data access for researchers attempting physiological data studies and to eliminate inequalities in research opportunities due to differences in data collection capabilities.
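The time-synchronisation idea can be illustrated with a sketch in which every device callback stamps its samples against one shared clock, so streams captured at different rates can later be merged on a common time axis. The class and device names are illustrative and not the Vital Recorder's actual interface.

```python
# Sketch of time-synchronised multi-device capture against one shared clock.
import itertools

class SyncRecorder:
    def __init__(self):
        self._clock = itertools.count()   # shared monotonic tick source
        self.records = []                 # (tick, device, sample)

    def capture(self, device, sample):
        """Stamp a sample from any device against the common clock."""
        self.records.append((next(self._clock), device, sample))

    def merged(self):
        """Return all samples as one time-ordered stream."""
        return sorted(self.records, key=lambda r: r[0])

rec = SyncRecorder()
rec.capture("monitor", {"hr": 72})   # patient monitor sample
rec.capture("bis", {"bis": 45})      # BIS monitor sample
rec.capture("monitor", {"hr": 71})
```

Because every record carries a stamp from the same clock, an integrated analysis can align, say, BIS values with heart-rate trends without per-device clock drift.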

  8. HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation

    Science.gov (United States)

    Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.

    2006-03-01

    As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, living or deceased. HIPAA requires security services supporting these implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. These controls, proposed in the HIPAA Security Standards, are implemented here as audit trails. Audit trails can be used for surveillance, to detect when interesting events might be happening that warrant further investigation, or forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running in each PACS component computer, and a Monitor Server running in a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. The RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation, and gives preliminary results obtained by this monitoring system on a clinical RIS-integrated PACS.
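
    The two-part agent/server design can be sketched as follows. The event names and record fields below are illustrative stand-ins for DICOM Supplement 95 audit trail messages, not the actual message schema, and the mapping to the three reporting levels is an assumption.

```python
import json
from collections import defaultdict

# Sketch of the agent/server split: agents serialize audit records
# (stand-ins for DICOM Supplement 95 messages) and the Monitor Server
# buckets them into the three reporting levels named in the abstract:
# system resources, PACS/RIS applications, and user/patient data access.

LEVELS = {
    "SystemEvent": "system_resources",
    "ApplicationActivity": "applications",
    "PatientRecordAccess": "data_access",
}

def make_audit_record(event_type, node, detail):
    """Agent side: serialize one audit record (field names illustrative)."""
    return json.dumps({"event": event_type, "node": node, "detail": detail})

def summarize(records):
    """Server side: group received records into the three reporting levels."""
    summary = defaultdict(list)
    for raw in records:
        rec = json.loads(raw)
        summary[LEVELS.get(rec["event"], "other")].append(rec["node"])
    return dict(summary)
```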

  9. Automatic Tracking Evaluation and Development System (ATEDS)

    Data.gov (United States)

    Federal Laboratory Consortium — The heart of the ATEDS network consists of four SGI Octane computers running the IRIX operating system and equipped with V12 hardware graphics to support synthetic...

  10. Improving SAR Automatic Target Recognition Models with Transfer Learning from Simulated Data

    DEFF Research Database (Denmark)

    Malmgren-Hansen, David; Kusk, Anders; Dall, Jørgen

    2017-01-01

    Data-driven classification algorithms have proved to do well for automatic target recognition (ATR) in synthetic aperture radar (SAR) data. Collecting data sets suitable for these algorithms is a challenge in itself, as it is difficult and expensive. Due to the lack of labeled data sets with real SAR images of sufficient size, simulated data play a big role in SAR ATR development, but the transferability of knowledge learned on simulated data to real data remains to be studied further. In this letter, we show the first study of transfer learning between a simulated data set and a set of real SAR images. These results encourage SAR ATR development to continue the improvement of simulated data sets of greater size and complex scenarios in order to build robust algorithms for real-life SAR ATR applications.

  11. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of cell mechanisms using different technologies, in order to explain the relationships among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge for emerging data analysis platforms is the ability to treat these different microarray formats, coupled with clinical data, in a combined way. The resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs for molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) for clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays, which were not supported in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  12. Robust Fallback Scheme for the Danish Automatic Voltage Control System

    DEFF Research Database (Denmark)

    Qin, Nan; Dmitrova, Evgenia; Lund, Torsten

    2015-01-01

    This paper proposes a fallback scheme for the Danish automatic voltage control system. It is activated if the local station loses telecommunication with the control center and/or the local station voltage violates the acceptable operational limits. It cuts in/out switchable and tap...

  13. The automatic liquid nitrogen filling system for GDA detectors

    Indian Academy of Sciences (India)

    An indigenously developed automatic liquid nitrogen (LN2) filling system has been installed in the gamma detector array (GDA) facility at the Nuclear Science Centre. Electro-pneumatic valves are used for filling the liquid nitrogen into the ...

  14. New on the market: Heating system for automatic operation

    Energy Technology Data Exchange (ETDEWEB)

    1981-04-01

    In the report on the 11th Meeting of Experts on Sanitary, Heating and Air-Conditioning Equipment, several furnaces, especially furnaces for solid fuels, are introduced. Automatic charging systems are offered for comfortable burning of coal and coke. But the main concern of all furnace producers is the improvement of efficiency.

  15. Auditory signal design for automatic number plate recognition system

    NARCIS (Netherlands)

    Heydra, C.G.; Jansen, R.J.; Van Egmond, R.

    2014-01-01

    This paper focuses on the design of an auditory signal for the Automatic Number Plate Recognition system of the Dutch national police. The auditory signal is designed to alert police officers to suspicious cars in their proximity, communicating the priority level and location of the suspicious car and

  16. Communication interface of computerized automatic fire alarm system

    International Nuclear Information System (INIS)

    Yu Hongmei; Zhu Liqun; Fang Shaohong; Du Chengbao

    1997-01-01

    The problems of communication between multiple single-chip computers and a microcomputer have been solved in both hardware and software. The automatic fire alarm system is realized by using the serial ports on both the single-chip computers and the microcomputer

  17. Concentrate composition for automatic milking systems - effect on milking frequency

    DEFF Research Database (Denmark)

    Madsen, Jørgen; Weisbjerg, Martin Riis; Hvelplund, Torben

    2010-01-01

    The purpose of this study was to investigate the potential of affecting milking frequency in an Automatic Milking System (AMS) by changing ingredient composition of the concentrate fed in the AMS. In six experiments, six experimental concentrates were tested against a Standard concentrate all sup...

  18. Experience in designing the automatic nuclear power plant control system

    International Nuclear Information System (INIS)

    Sedov, V.K.; Busygin, B.F.; Eliseeva, O.V.; Mikhajlov, V.A.

    1981-01-01

    The integrated automatic control system (ACS) is designed at the Novovoronezh NPP (NVNPP). It comprises automatic technological control of all five power units and the plant as a whole (ACST) and an automatic organizational-economic production control system (ACSP). The NVNPP ACS is designed as a two-level system. Two M-4030 and M-4030-1 computers form the technical base of the upper level, while a set of unit computers (M-60 and M-700 for unit 5; M-60 and SM-2 for units 1-4) forms the lower level. The block diagram of the NVNPP ACS, the flowsheet of the NVNPP ACS technical means and the external communications of the control centre are described. The NVNPP ACS is supposed to be put into operation in stages. It is noted that the design and introduction of the typical NPP ACS at the NVNPP permits maximal future reduction of the period of developing automatic control systems at newly introduced units and NPPs with WWER reactors [ru

  19. Automatic diagnosis and control of distributed solid state lighting systems

    NARCIS (Netherlands)

    Dong, J.; Van Driel, W.; Zhnag, G.

    2011-01-01

    This paper describes a new design concept of automatically diagnosing and compensating LED degradations in distributed solid state lighting (SSL) systems. A failed LED may significantly reduce the overall illumination level, and destroy the uniform illumination distribution achieved by a nominal

  20. Building an Image-Based System to automatically Score psoriasis

    DEFF Research Database (Denmark)

    Gómez, D. Delgado; Carstensen, Jens Michael; Ersbøll, Bjarne Kjær

    2003-01-01

    the images. The system is tested on patients with the dermatological disease psoriasis. Temporal series of images are taken for each patient and the lesions are automatically extracted. Results indicate that the images obtained are a good source for obtaining derived variables to track the lesion....

  1. Automatic frequency control system for driving a linear accelerator

    International Nuclear Information System (INIS)

    Helgesson, A.L.

    1976-01-01

    An automatic frequency control system is described for maintaining the drive frequency applied to a linear accelerator so as to produce maximum particle output from the accelerator. The particle output amplitude is measured, and the frequency of the radio-frequency source powering the linear accelerator is adjusted to maximize the particle output amplitude.

  2. Automatic detection, segmentation and assessment of snoring from ambient acoustic data.

    Science.gov (United States)

    Duckitt, W D; Tuomi, S K; Niesler, T R

    2006-10-01

    Snoring is a prevalent condition with a variety of negative social effects and associated health problems. Treatments, both surgical and therapeutic, have been developed, but the objective non-invasive monitoring of their success remains problematic. We present a method which allows the automatic monitoring of snoring characteristics, such as intensity and frequency, from audio data captured via a freestanding microphone. This represents a simple and portable diagnostic alternative to polysomnography. Our system is based on methods that have proved effective in the field of speech recognition. Hidden Markov models (HMMs) were employed as basic elements with which to model different types of sound by means of spectrally based features. This allows periods of snoring to be identified, while rejecting silence, breathing and other sounds. Training and test data were gathered from six subjects, and annotated appropriately. The system was tested by requiring it to automatically classify snoring sounds in new audio recordings and then comparing the result with manually obtained annotations. We found that our system was able to correctly identify snores with 82-89% accuracy, despite the small size of the training set. We could further demonstrate how this segmentation can be used to measure the snoring intensity, snoring frequency and snoring index. We conclude that a system based on hidden Markov models and spectrally based features is effective in the automatic detection and monitoring of snoring from audio data.
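
    As a rough illustration of the detection idea, the sketch below replaces the paper's hidden Markov models with a much simpler per-frame classifier: each class (snore vs. other) is modelled by a one-dimensional Gaussian over a single spectral feature (here, log frame energy), frames are labelled by likelihood, and snore events are counted from contiguous labelled runs. The feature values are invented.

```python
import math

# Simplified stand-in for the paper's HMM classifier: per-class Gaussians
# over one spectral feature, maximum-likelihood frame labelling, and an
# event count derived from contiguous "snore" runs. Values illustrative.

def fit_gaussian(values):
    """Maximum-likelihood mean/variance of a 1-D feature sample."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values) or 1e-9
    return mean, var

def log_likelihood(x, params):
    mean, var = params
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def label_frames(frames, snore_model, other_model):
    """Label each feature frame by the more likely class."""
    return ["snore" if log_likelihood(f, snore_model) > log_likelihood(f, other_model)
            else "other" for f in frames]

def count_snores(labels):
    """Count contiguous runs of 'snore' frames (one run = one snore event)."""
    return sum(1 for i, lab in enumerate(labels)
               if lab == "snore" and (i == 0 or labels[i - 1] != "snore"))
```

    From such a segmentation, intensity, frequency and a snoring index follow directly from the labelled runs, which is the role the HMM segmentation plays in the paper.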

  3. Applications Of A Low Cost System For Industrial Automatic Inspection

    Science.gov (United States)

    Krey, C.; Ayache, A.; Bruel, A.

    1987-05-01

    In an industrial environment, some repetitive tasks which do not need a high degree of understanding can be solved automatically by vision. Among the systems available on the market, most are rather expensive, with various capabilities. The described system is a modular system built from standard circuit boards. One advantage of this system is that its architecture can be redefined for each application by judiciously assembling the standard modules. The vision system has been used successfully to sort fruit according to colour and diameter; it can sort 8 fruits per second on each sorting line and manage up to 16 lines simultaneously. An application to sheep skin cutting has been implemented too. After chemical and mechanical treatments, the skins present many defects around their contour that must be cut off. A movable camera follows and inspects the contour; the vision system determines where the cutting device must cut the skin. A third application has been implemented concerning the automatic recording and reproduction of logotypes. A moving camera driven by the system picks up the points of the logotype contours. Before reproduction, programs can modify the logotype shape, change the scale, and so on. For every application, the system uses the world's smallest CCD camera, developed in the laboratory. The small dimensions of the vision system and its low cost are major advantages for wide use in industrial automatic inspection.

  4. Developing Automatic Student Motivation Modeling System

    Science.gov (United States)

    Destarianto, P.; Etikasari, B.; Agustianto, K.

    2018-01-01

    Achievement motivation is one of the internal factors encouraging a person to perform their best in achieving their goals. Achievement motivation is important as an incentive to compete, so that the person will always strive to achieve success and avoid failure. Based on this, the system is developed to determine the achievement motivation of students, so that students can reflect on and improve their achievement motivation. Test results of the system using a Naïve Bayes classifier showed an average accuracy of 91.667% in assessing student achievement motivation. From the motivation model generated by the system, the students' achievement motivation level can be known. This motivation class will be used to determine appropriate counselling decisions and, ultimately, is expected to improve student achievement motivation.
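
    A minimal Naïve Bayes classifier of the kind the abstract describes can be sketched as below. The feature names, training data and binary feature values are invented for illustration and are not from the study.

```python
import math
from collections import Counter, defaultdict

# Toy categorical Naive Bayes with Laplace smoothing. The smoothing
# denominator assumes binary-valued features (an assumption of this
# sketch, not a statement about the paper's feature set).

def train_nb(samples, labels):
    """samples: list of feature tuples; returns (class counts, conditional counts)."""
    priors = Counter(labels)
    cond = defaultdict(Counter)            # (feature index, class) -> value counts
    for feats, y in zip(samples, labels):
        for i, v in enumerate(feats):
            cond[(i, y)][v] += 1
    return priors, cond

def predict_nb(feats, priors, cond, alpha=1.0):
    """Return the class maximizing the smoothed log posterior."""
    total = sum(priors.values())
    best, best_score = None, float("-inf")
    for y, n in priors.items():
        score = math.log(n / total)
        for i, v in enumerate(feats):
            score += math.log((cond[(i, y)][v] + alpha) / (n + alpha * 2))
        if score > best_score:
            best, best_score = y, score
    return best
```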

  5. EBT data acquisition and analysis system

    International Nuclear Information System (INIS)

    Burris, R.D.; Greenwood, D.E.; Stanton, J.S.; Geoffroy, K.A.

    1980-10-01

    This document describes the design and implementation of a data acquisition and analysis system for the EBT fusion experiment. The system includes data acquisition on five computers, automatic transmission of that data to a large, central data base, and a powerful data retrieval system. The system is flexible and easy to use, and it provides a fully documented record of the experiments

  6. Evaluation of automatic exposure control systems in computed tomography

    International Nuclear Information System (INIS)

    Reina, Thamiris Rosado

    2014-01-01

    The development of computed tomography (CT) technology has brought wider possibilities to diagnostic medicine. It is a non-invasive method of seeing the human body in detail. As CT applications increase, so does concern about patient dose, because of the higher dose levels imparted compared with other diagnostic imaging modalities. The radiology community (radiologists, medical physicists and manufacturers) is working to find the lowest dose level possible without compromising diagnostic image quality. The greatest and relatively new advance in lowering patient dose is the automatic exposure control (AEC) system in CT. These systems are designed to modulate the dose distribution along the patient scan and between patients, taking into account patient size and irradiated tissue densities. Owing to the CT scanning geometry, AEC-systems are very complex and their functioning is not yet fully understood. This work aims to evaluate the clinical performance of AEC-systems and their susceptibilities, to assist in possible patient dose optimizations. The approach to evaluating the AEC-systems of three of the leading CT manufacturers in Brazil (General Electric, Philips and Toshiba) was the extraction of tube current modulation data from DICOM standard image sequences, measurement and analysis of the image noise of those image sequences, and measurement of the dose distribution along the scan length on the surface and inside of two different phantom configurations. The tube current modulation of each CT scanner, associated with the resulting image quality, provides the performance of the AEC-system. The dose distribution measurements provide the dose profile due to the tube current modulation. Dose measurements with the AEC-system ON and OFF were made to quantify the impact of these systems on patient dose. The results attained give rise to optimizations in AEC-system applications and, in consequence, decrease the patient dose without

  7. An automatic maintenance system for nuclear power plants instrumentation

    OpenAIRE

    Álvarez Torres, María Bárbara; Iborra García, Andrés José; Fernández Andrés, José Carlos

    2000-01-01

    Maintenance and testing of reactor protection systems is an important cause of unplanned reactor trips, because it is commonly carried out in manual mode. The execution of surveillance procedures in this mode entails a great number of manual operations. Automated testing is the answer, because it minimises test times and reduces the risk of human error. GAMA-I is an automatic system for testing the reactor protection instrumentation which is based on VXI instrumentation cards. This system has i...

  8. Building Research Capacity: Results of a Feasibility Study Using a Novel mHealth Epidemiological Data Collection System Within a Gestational Diabetes Population.

    Science.gov (United States)

    McLean, Allen; Osgood, Nathaniel; Newstead-Angel, Jill; Stanley, Kevin; Knowles, Dylan; van der Kamp, William; Qian, Weicheng; Dyck, Roland

    2017-01-01

    Public health researchers have traditionally relied on individual self-reporting when collecting much epidemiological surveillance data. Data acquisition can be costly, difficult to acquire, and the data often notoriously unreliable. An interesting option for the collection of individual health (or indicators of individual health) data is the personal smartphone. Smartphones are ubiquitous, and the required infrastructure is well-developed across Canada, including many remote areas. Researchers and health professionals are asking themselves how they might exploit increasing smartphone uptake for the purposes of data collection, hopefully leading to improved individual and public health. A novel smartphone-based epidemiological data collection and analysis system has been developed by faculty and students from the CEPHIL (Computational Epidemiology and Public Health Informatics) Lab in the Department of Computer Science at the University of Saskatchewan. A pilot feasibility study was then designed to examine possible relationships between smartphone sensor data, surveys and individual clinical data within a population of pregnant women. The study focused on the development of Gestational Diabetes (GDM), a transient condition during pregnancy, but with serious potential post-birth complications for both mother and child. The researchers questioned whether real-time smartphone data could improve the clinical management and outcomes of women at risk for developing GDM, enabling earlier treatment. The initial results from this small study did not show improved prediction of GDM, but did demonstrate that real-time individual health and sensor data may be readily collected and analyzed efficiently while maintaining confidentiality. 
Because the original version of the data collection software could only run on Android phones, study participants were often required to carry two phones; as a result, the study phone was often not carried, and therefore data

  9. An automatic multichannel generalized system for frequency measurement

    Directory of Open Access Journals (Sweden)

    Gomah G.

    2015-01-01

    Continuously monitoring the performance of primary frequency sources by comparing them against transfer standards, and periodically calibrating secondary frequency sources against primary ones, are among the main missions assigned to any time and frequency laboratory, whether a calibration laboratory or a national metrology laboratory. An automatic Generalized System (GS) has been built for monitoring/calibrating any frequency source that has a Relative Frequency Offset (RFO) greater than 1 × 10⁻¹⁴ Hz/Hz. The GS is able to use either of two measurement methods, according to the accuracy of the frequency source being measured. A graphical programming language, LabVIEW, was used to write the software required for both hardware control and data logging, so the software can easily be reconfigured for any upgrade. A flexible hardware arrangement was also used, such that two measurement systems are merged into one; the right hardware setup can then be chosen according to the user's needs. The results obtained by this GS were verified by comparing them to those generated by a commercial turnkey solution.
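
    The quoted sensitivity floor is stated in terms of the relative frequency offset, which is a direct ratio. The sketch below computes it and applies the 1 × 10⁻¹⁴ Hz/Hz floor from the abstract; the frequencies in the test are illustrative.

```python
# Relative frequency offset of a source against its nominal frequency,
# and a check against the coverage floor quoted for the Generalized
# System. Example frequency values are illustrative.

def relative_frequency_offset(f_measured, f_nominal):
    """RFO = (f_measured - f_nominal) / f_nominal, dimensionless (Hz/Hz)."""
    return (f_measured - f_nominal) / f_nominal

def within_system_range(rfo, floor=1e-14):
    """The GS described covers sources with |RFO| above ~1e-14 Hz/Hz."""
    return abs(rfo) > floor
```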

  10. Requirements to a Norwegian National Automatic Gamma Monitoring System

    International Nuclear Information System (INIS)

    Lauritzen, B.; Hedemann Jensen, P.; Nielsen, F.

    2005-04-01

    An assessment of the overall requirements for a Norwegian gamma-monitoring network is undertaken, with special emphasis on the geographical distribution of automatic gamma monitoring stations, the type of detectors in such stations and the sensitivity of the system in terms of ambient dose equivalent rate increments above natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over large distances using historical weather data; the minimum density is estimated from the requirement that a radioactive plume should not pass unnoticed between stations of the monitoring network. The sensitivity of the gamma monitoring system is obtained from the condition that events that may require protective intervention measures should be detected by the system. Action levels for the possible introduction of sheltering and precautionary foodstuff restrictions are derived in terms of ambient dose equivalent rate. For emergency situations where particulates contribute only a small fraction of the total ambient dose equivalent rate from the plume, it is concluded that measurements of dose rate are sufficient to determine the need for sheltering; simple dose rate measurements, however, are inadequate to determine the need for foodstuff restrictions, and spectral measurements are required. (au)

  11. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  12. Automatic outdoor monitoring system for photovoltaic panels.

    Science.gov (United States)

    Stefancich, Marco; Simpson, Lin; Chiesa, Matteo

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.
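
    The abstract does not specify which maximum power point tracking algorithm the instrument uses. As a sketch of the general idea, below is a classic perturb-and-observe tracker run against a purely illustrative quadratic power-voltage curve; both the curve and the step size are assumptions.

```python
# Perturb-and-observe MPPT sketch: step the operating voltage, keep the
# direction while power rises, reverse it when power falls. The panel
# model is a toy curve with its maximum at 17 V, purely for illustration.

def panel_power(v):
    """Toy power-voltage curve with a maximum of 60 W at 17 V."""
    return max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

def perturb_and_observe(v0=10.0, step=0.25, iterations=200):
    """Return (voltage, power) after tracking from the starting voltage v0."""
    v, p = v0, panel_power(v0)
    direction = 1.0
    for _ in range(iterations):
        v_new = v + direction * step
        p_new = panel_power(v_new)
        if p_new < p:              # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v, p
```

    Once converged, the tracker oscillates within one step of the maximum power point, which is the usual trade-off of perturb-and-observe between tracking speed and steady-state ripple.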

  13. Automatic outdoor monitoring system for photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Stefancich, Marco [Consiglio Nazionale delle Ricerche, Istituto dei Materiali per l’Elettronica ed il Magnetismo (CNR-IMEM), Parco Area delle Scienze 37/A, 43124 Parma, Italy; Simpson, Lin [National Renewable Energy Laboratory, 15013 Denver West Parkway, Golden, Colorado 80401, USA; Chiesa, Matteo [Masdar Institute of Science and Technology, P.O. Box 54224, Masdar City, Abu Dhabi, United Arab Emirates

    2016-05-01

    Long-term acquisition of solar panel performance parameters, for panels operated at maximum power point in their real environment, is of critical importance in the photovoltaic research sector. However, few options exist for the characterization of non-standard panels such as concentrated photovoltaic systems, heavily soiled or shaded panels or those operating under non-standard spectral illumination; certainly, it is difficult to find such a measurement system that is flexible and affordable enough to be adopted by the smaller research institutes or universities. We present here an instrument aiming to fill this gap, autonomously tracking and maintaining any solar panel at maximum power point while continuously monitoring its operational parameters and dissipating the produced energy without connection to the power grid. The instrument allows periodic acquisition of current-voltage curves to verify the employed maximum power point tracking approach. At the same time, with hardware schematics and software code being provided, it provides a flexible open development environment for the monitoring of non-standard generators like concentrator photovoltaic systems and to test novel power tracking approaches. The key issues, and the corresponding solutions, encountered in the design are analyzed in detail and the relevant schematics presented.

  14. ATIPS: Automatic Travel Itinerary Planning System for Domestic Areas

    Science.gov (United States)

    2016-01-01

    Leisure travel has become a topic of great interest to Taiwanese residents in recent years. Most residents expect to be able to relax on a vacation during the holidays; however, the complicated procedure of travel itinerary planning is often discouraging and leads them to abandon the idea of traveling. In this paper, we design an automatic travel itinerary planning system for the domestic area (ATIPS) using an algorithm to automatically plan a domestic travel itinerary based on user intentions that allows users to minimize the process of trip planning. Simply by entering the travel time, the departure point, and the destination location, the system can automatically generate a travel itinerary. According to the results of the experiments, 70% of users were satisfied with the result of our system, and 82% of users were satisfied with the automatic user preference learning mechanism of ATIPS. Our algorithm also provides a framework for substituting modules or weights and offers a new method for travel planning. PMID:26839529

  15. New Knowledge Management Systems: The Implications for Data Discovery, Collection Development, and the Changing Role of the Librarian.

    Science.gov (United States)

    Stern, David

    2003-01-01

    Discusses questions to consider as chemistry libraries develop new information storage and retrieval systems. Addresses new integrated tools for data manipulation that will guarantee access to information; differential pricing and package plans and effects on libraries' budgeting; and the changing role of the librarian. (LRW)

  16. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert, in a pipeline fashion, to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management, we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.
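
    The detect-then-integrate pipeline idea can be sketched minimally as follows. The data layout (per-timestep maps of particle ID to energy), the threshold, and the linking rule are assumptions of this sketch, not the paper's actual algorithms.

```python
# Minimal sketch of the pipeline structure: per-timestep feature
# detection (particles above an energy threshold), then temporal
# integration by linking detections that share particle IDs across
# consecutive timesteps. Data layout and threshold are illustrative.

def detect(timestep, threshold):
    """timestep: dict of particle_id -> energy; returns IDs above threshold."""
    return {pid for pid, e in timestep.items() if e >= threshold}

def track(timesteps, threshold, min_overlap=1):
    """Chain per-timestep detections into a history of (IDs, linked) pairs."""
    history, prev = [], set()
    for ts in timesteps:
        cur = detect(ts, threshold)
        linked = len(cur & prev) >= min_overlap   # same bunch as previous step?
        history.append((sorted(cur), linked))
        prev = cur
    return history
```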

  17. Software Sub-system in Loading Automatic Test System for the Measurement of Power Line Filters

    Directory of Open Access Journals (Sweden)

    Yu Bo

    2017-01-01

    Full Text Available A loading automatic test system for the measurement of power line filters is in urgent demand, so the software sub-system of the whole test system was developed. Methods: the test system was structured on a virtual-instrument framework consisting of a lower and an upper computer, and a top-down design approach was adopted for the system and its modules, according to the measurement principle of the test system. Results: the software sub-system, including the human-machine interface, data analysis and processing software, an expert system, communication software, and the control software in the lower computer, was designed and integrated into the entire test system. Conclusion: this sub-system provides a friendly software platform for the whole test system and offers strong functionality, high performance, and low cost. It not only raises the test efficiency of EMI filters but also introduces several innovations.

  18. Evaluation of an automatic uranium titration system

    International Nuclear Information System (INIS)

    Lewis, K.

    1980-01-01

    The titration system utilizes the constant-current coulometric titration of Goldbeck and Lerner. U(VI) is reduced to U(IV) by Fe(II). V(V) is generated to titrate the U(IV), and the titration is followed potentiometrically. The evaluation shows that the recovery of uranium is 100% at the 40-mg level. The accuracy is generally ±0.10% or better. The smallest sample weight at which reliable results were obtained was 40 mg of uranium. Time for one analysis is 15 minutes. Advantages and disadvantages of the automated titrator are listed.

  19. Conceptual design of novel IP-conveyor-belt Weissenberg-mode data-collection system with multi-readers for macromolecular crystallography. A comparison between Galaxy and Super Galaxy.

    Science.gov (United States)

    Sakabe, N; Sakabe, K; Sasaki, K

    2004-01-01

    Galaxy is a Weissenberg-type high-speed, high-resolution and highly accurate fully automatic data-collection system using two cylindrical IP cassettes, each with a radius of 400 mm and a width of 450 mm. It was originally developed for static three-dimensional analysis using X-ray diffraction and was installed on bending-magnet beamline BL6C at the Photon Factory. It was found, however, that Galaxy was also very useful for time-resolved protein crystallography on a time scale of minutes. This has prompted us to design a new IP-conveyor-belt Weissenberg-mode data-collection system called Super Galaxy for time-resolved crystallography with improved time and crystallographic resolution over that achievable with Galaxy. Super Galaxy was designed with a half-cylinder-shaped cassette with a radius of 420 mm and a width of 690 mm. Using 1.0 Å incident X-rays, these dimensions correspond to maximum resolutions of 0.71 Å in the vertical direction and 1.58 Å in the horizontal. Upper and lower screens can be used to set the frame size of the recorded image. This function is useful not only for reducing the frame-exchange time but also for saving disk space on the data server. The use of an IP conveyor belt and many IP readers makes Super Galaxy well suited for time-resolved, monochromatic X-ray crystallography at a very intense third-generation SR beamline. Here, Galaxy and a conceptual design for Super Galaxy are described, and their suitability as data-collection systems for macromolecular time-resolved monochromatic X-ray crystallography is compared.

  20. 'H-Bahn' - Dortmund demonstration system. Automatic vehicle protection system

    Energy Technology Data Exchange (ETDEWEB)

    Rosenkranz

    1984-01-01

    The automatic vehicle protection system of the H-Bahn at the University of Dortmund is responsible for the fail-safe operation of the automatic vehicles. Its functions are the protection of vehicle operation and the protection of passengers boarding and leaving the vehicles. These functions are managed decentrally by two fail-safe controllers. Besides the well-known relay techniques of railway fail-safe systems, electronics based on fail-safe URTL microcontrollers are applied, controlled by software stored in EPROMs. A glass-fibre connection link provides safe data exchange between the two fail-safe controllers. The experts' favourable reports on 'train protection and safety during passenger processing' were completed in March 1984; thus, transportation of passengers could start in April 1984.

  1. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of existing technical control systems for numeric-code automatic blocking, to establish their advantages and disadvantages, to analyze the possibility of using them for diagnosing the status of automatic blocking devices, and to set targets for the development of new diagnostic systems. Methodology. To achieve these objectives, theoretical-analytical methods and the method of functional analysis were used. Findings. The analysis of existing and prospective facilities for remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative, being designed primarily to monitor discrete parameters, which in turn prevents the construction of a decision-support subsystem on top of them. For new technical diagnostic systems it is proposed to apply the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on blocking devices and shorten recovery time after a failure occurs. Originality. The currently existing technical control facilities for automatic blocking cannot provide a full assessment of the state of block-section signalling and interlocking. Criteria for the development of new technical diagnostic systems with increased amounts of diagnostic information and its automatic analysis were proposed. Practical value. The results of the analysis can be used in practice to select technical controls for automatic blocking devices, as well as for the further development of automatic blocking diagnostic systems, allowing a gradual transition from a planned preventive maintenance model to servicing based on the actual state of the monitored devices.

  2. Library Data Collection in Brazil.

    Science.gov (United States)

    Figueiredo, Nice

    1988-01-01

    Reviews the role played by the International Federation of Libraries Association, International Standards Organization, and UNESCO in the establishment of international standards for library statistics. The Brazilian literature on the collection of library statistics is then analyzed to evaluate the extent to which such standards have been…

  3. Longitudinal automatic control system for a light weight aircraft

    Directory of Open Access Journals (Sweden)

    Cristian VIDAN

    2016-12-01

    Full Text Available This paper presents the design of an automatic control system for the longitudinal axis of a lightweight aircraft. To achieve this goal, it is important to start from the mathematical model in the longitudinal plane and then determine the steady-state parameters for a given velocity and altitude. Using MATLAB, the mathematical model in the longitudinal plane was linearized and the system transfer functions were obtained. To design the automatic control, we analyzed the stability of the linearized model for each input. Once the stability problem was solved, we designed the control system architecture in MATLAB-Simulink, taking as the objective for stable flight the continuous adjustment of the pitch angle θ through control of the elevator and of velocity through control of the throttle. Finally, we analyzed the performance of the designed longitudinal control system; the results, presented in graphs, show that the design objectives were fulfilled.
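The pitch-hold loop described in the abstract (continuously adjusting θ through the elevator) can be sketched as a toy discrete simulation. The first-order elevator-to-pitch response, the PI gains and the time step below are invented stand-ins, not the paper's linearized MATLAB model:

```python
# Illustrative sketch: a discrete PI controller driving a crude
# first-order pitch response to elevator deflection. All constants
# (kp, ki, k_elev, dt) are assumed values for the example.

def simulate_pitch_hold(theta_ref, steps=2000, dt=0.01,
                        kp=2.0, ki=0.5, k_elev=1.5):
    theta, integ = 0.0, 0.0
    for _ in range(steps):
        err = theta_ref - theta
        integ += err * dt
        delta_e = kp * err + ki * integ      # elevator command
        theta += k_elev * delta_e * dt       # simplistic pitch dynamics
    return theta

final = simulate_pitch_hold(theta_ref=0.1)   # hold a 0.1 rad pitch angle
print(abs(final - 0.1) < 1e-3)
```

The integral term removes the steady-state error a pure proportional loop would leave; the actual design in the paper works on the full linearized longitudinal transfer functions rather than this one-state toy plant.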

  4. Study on traffic accidents mechanism with automatic recording systems. Part 2. Application of data from ADR and DMR for practical driver education; Jidosha kiroku sochi ni yoru kotsu jiko hassei mechanism no kenkyu. 2. Jiko data kirokukei (ADR) to unko kirokukei (DMR) no untensha kyoiku eno katsuyo

    Energy Technology Data Exchange (ETDEWEB)

    Ueyama, M.; Ogawa, S. [National Research Inst. of Police Science, Tokyo (Japan); Chikasue, H.; Muramatsu, K. [Yazaki Meter Co. Ltd., Tokyo (Japan)

    1997-10-01

    A field trial was carried out using automatic recording systems, an ADR (Accident Data Recorder) and a DMR (Driving Monitoring Recorder), installed on 20 commercial vehicles, in order to assess the implications for driver behavior and accidents. The data suggest that the accident mechanism can be explained in terms of situation-specific factors and the behavior of drivers just before the accident, that is, their attitude to the handling and control of their vehicles. The data might offer new information for practical driver education. 3 refs., 9 figs., 1 tab.

  5. The anemodata 1-IIE. Automatic system for wind data acquisition; El anemodata 1-IIE. Sistema automatico para la adquisicion de datos de viento

    Energy Technology Data Exchange (ETDEWEB)

    Borja, Marco Antonio; Parkman Cuellar, Pablo A. [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)

    1986-12-31

    Wind is an inexhaustible energy source. Studying its behavior in order to develop research projects and apply new technologies aimed at its maximum exploitation is one of the activities carried out at the Instituto de Investigaciones Electricas (IIE). As part of these activities, the Anemodata-1-IIE equipment was designed and built for the acquisition of wind velocity and direction data. The Anemodata-1-IIE is a result of the work that the Departamento de Fuentes no Convencionales (Non-Conventional Energy Sources Department) of the Energy Sources Division carries out on the development of electrical equipment for anemometry.

  6. Construction of an Automatic Drawing System for Power System Diagram by Using GA

    Science.gov (United States)

    Kawahara, Koji; Zoka, Yoshifumi; Sasaki, Hiroshi

    When analyzing a power system by numerical calculation for power flow and transient stability, it is very useful to have a power system diagram that eases the modification and change of data for system analysis. So far, however, making the system diagram has been done manually. Drawing a diagram for a large-scale power system not only takes a lot of time but may also introduce mistakes. This paper proposes a supporting system for drawing the system diagram by applying a method for the automatic placement of nodes using a genetic algorithm. We construct a prototype system capable not only of automatic drawing but also of editing and modifying diagrams. In addition, since the proposed system is intended to utilize computation resources on the Internet, it is expected to run on different operating systems through a graphical user interface (GUI). To realize this, the prototype system is developed in Java and designed with design patterns, which enable recurring design structures and the associated design experience to be recorded, reused, and represented.

  7. Water quality, meteorological, and nutrient data collected by the the National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) from January 1, 1995 to August 1, 2011 (NODC Accession 0052765)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Estuarine Research Reserve System's System-wide Monitoring Program (NERRS SWMP) collected water quality, meteorological, and nutrient data in 26...

  8. The Development of Automatic Sequences for the RF and Cryogenic Systems at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Gurd, Pamela; Casagrande, Fabio; Mccarthy, Michael; Strong, William; Ganni, Venkatarao

    2005-01-01

    Automatic sequences both ease the task of operating a complex machine and ensure procedural consistency. At the Spallation Neutron Source project (SNS), a set of automatic sequences have been developed to perform the start up and shut down of the high power RF systems. Similarly, sequences have been developed to perform backfill, pump down, automatic valve control and energy management in the cryogenic system. The sequences run on Linux soft input-output controllers (IOCs), which are similar to ordinary EPICS (Experimental Physics and Industrial Control System) IOCs in terms of data sharing with other EPICS processes, but which share a Linux processor with other such soft IOCs. Each sequence waits for a command from an operator console and starts the corresponding set of instructions, allowing operators to follow the sequences either from an overview screen or from detail screens. We describe each system and our operational experience with it.

  9. Design of automatic tracking system for electron beam welding

    International Nuclear Information System (INIS)

    He Chengdan; Chinese Academy of Space Technology, Lanzhou; Li Heqi; Li Chunxu; Ying Lei; Luo Yan

    2004-01-01

    The design and experimental process of an automatic tracking system applied to local vacuum electron beam welding are dealt with in this paper. When the annular parts of a precision apparatus are welded, the centre of rotation of the electron gun and the centre of the annular weld are usually not coincident because of machining errors, workpiece setting errors and so on. In the teaching process, a narrow, low-current electron beam scans the weld groove; the amount of secondary electrons reflected from the workpiece differs when the beam crosses the sides and the centre of the groove. This difference indicates the position of the weld, and a computer records the deviation between the electron beam spot and the centre of the weld groove, analyzes the data and stores them. During welding, the computer corrects the position of the electron gun on the basis of the recorded deviation so that the electron beam spot stays centred on the annular weld groove. (authors)
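The seam-location step the abstract describes (the secondary-electron signal changes where the beam crosses the groove) can be sketched in a few lines. The scan positions, signal values and the assumption that the groove centre gives the lowest yield are all invented for illustration:

```python
# Hedged illustration: during a teaching scan the secondary-electron (SE)
# signal dips where the beam crosses the groove, so the groove centre can
# be located and the gun offset corrected. All numbers are invented.

def groove_deviation(positions, signal, gun_axis):
    """Return the correction needed to centre the beam on the groove."""
    centre = positions[signal.index(min(signal))]  # lowest SE yield
    return centre - gun_axis

positions = [-0.4, -0.2, 0.0, 0.2, 0.4]   # mm across the groove
signal = [9.1, 8.8, 7.0, 5.2, 8.9]        # dips near +0.2 mm
print(groove_deviation(positions, signal, gun_axis=0.0))  # -> 0.2
```

A real system would filter the signal and interpolate between samples, but the record-deviation-then-correct loop is the same.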

  10. Chemical Data Reporting - Previously Collected Data

    Science.gov (United States)

    EPA now refers to the Inventory Update Reporting (IUR) rule as the Chemical Data Reporting (CDR) Rule. This change was effective with the publication of the Inventory Update Reporting Modifications; Chemical Data Reporting Final Rule in August 2011.

  11. Robust parameter design for automatically controlled systems and nanostructure synthesis

    Science.gov (United States)

    Dasgupta, Tirthankar

    2007-12-01

    This research focuses on developing comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations.
The second part of the research deals with development of an experimental design methodology, tailor
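The core idea of the control part of this thesis, evaluating candidate parameter settings of a pure-gain process under PI feedback and noise and preferring the setting with the lowest mean squared error, can be caricatured in a toy Monte Carlo loop. The process gain, noise level and candidate gain pairs are all invented and do not come from the dissertation:

```python
import random

# Loose sketch of robust-parameter-design-style evaluation: simulate a
# pure-gain process (y = gain * u + noise) under PI feedback for several
# candidate (kp, ki) settings and pick the one with the lowest MSE.
# All constants are assumed example values.

def mse_for_gains(kp, ki, gain=2.0, target=1.0, steps=200, trials=30, seed=0):
    rng = random.Random(seed)          # fixed seed for reproducibility
    total = 0.0
    for _ in range(trials):
        y, integ, sse = 0.0, 0.0, 0.0
        for _ in range(steps):
            err = target - y
            integ += err
            u = kp * err + ki * integ
            y = gain * u + rng.gauss(0, 0.05)   # pure-gain process + noise
            sse += (target - y) ** 2
        total += sse / steps
    return total / trials

candidates = [(0.1, 0.01), (0.3, 0.05), (0.45, 0.2)]
best = min(candidates, key=lambda g: mse_for_gains(*g))
```

The third candidate is deliberately aggressive enough to destabilize this toy loop, so the selection rejects it; the actual methodology couples this evaluation with designed experiments rather than exhaustive simulation.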

  12. Methods for using groundwater model predictions to guide hydrogeologic data collection, with application to the Death Valley regional groundwater flow system

    Science.gov (United States)

    Tiedeman, C.R.; Hill, M.C.; D'Agnese, F. A.; Faunt, C.C.

    2003-01-01

    Calibrated models of groundwater systems can provide substantial information for guiding data collection. This work considers using such models to guide hydrogeologic data collection for improving model predictions by identifying model parameters that are most important to the predictions. Identification of these important parameters can help guide collection of field data about parameter values and associated flow system features and can lead to improved predictions. Methods for identifying parameters important to predictions include prediction scaled sensitivities (PSS), which account for uncertainty on individual parameters as well as prediction sensitivity to parameters, and a new "value of improved information" (VOII) method presented here, which includes the effects of parameter correlation in addition to individual parameter uncertainty and prediction sensitivity. In this work, the PSS and VOII methods are demonstrated and evaluated using a model of the Death Valley regional groundwater flow system. The predictions of interest are advective transport paths originating at sites of past underground nuclear testing. Results show that for two paths evaluated the most important parameters include a subset of five or six of the 23 defined model parameters. Some of the parameters identified as most important are associated with flow system attributes that do not lie in the immediate vicinity of the paths. Results also indicate that the PSS and VOII methods can identify different important parameters. Because the methods emphasize somewhat different criteria for parameter importance, it is suggested that parameters identified by both methods be carefully considered in subsequent data collection efforts aimed at improving model predictions.
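The prediction scaled sensitivity (PSS) idea above, ranking parameters by how strongly a model prediction responds to each one, can be illustrated generically. One common scaling multiplies the finite-difference derivative by the parameter value so parameters with different units become comparable; the toy "travel time" model and all numbers below are invented and are not the Death Valley model:

```python
# Hedged sketch of prediction scaled sensitivities: scale each
# finite-difference derivative of the prediction by the parameter value,
# so parameters of different units can be ranked by importance.

def scaled_sensitivities(predict, params, rel_step=1e-6):
    base = predict(params)
    out = {}
    for name, value in params.items():
        bumped = dict(params)
        bumped[name] = value * (1 + rel_step)
        deriv = (predict(bumped) - base) / (value * rel_step)
        out[name] = deriv * value        # PSS-style scaling by b_j
    return out

def travel_time(p):
    # Toy advective "prediction": slower for low hydraulic conductivity K.
    return p["porosity"] * 1000.0 / p["K"] + 0.1 * p["recharge"]

pss = scaled_sensitivities(travel_time,
                           {"K": 5.0, "porosity": 0.3, "recharge": 20.0})
ranked = sorted(pss, key=lambda k: abs(pss[k]), reverse=True)
```

Here `K` and `porosity` dominate and `recharge` barely matters, which is the kind of ranking the paper uses to prioritize field data collection; the VOII method additionally accounts for parameter correlations, which this sketch ignores.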

  13. Quality assurance and data collection -- Electronic Data Transfer

    International Nuclear Information System (INIS)

    Tomczak, L.M.; Lohner, W.G.; Ray, E.C.; Salesky, J.A.; Spitz, H.B.

    1993-05-01

    The Radiological Environmental Monitoring (REM) group at the Fernald Environmental Management Project is involved in an Electronic Data Transfer practice that will result in the improved quality assurance of collected data. This practice focuses on electronic data transfer from the recording instrument to reduce the manpower normally required for manual data entry and improve the quality of the data transferred. The application of this practice can enhance any data collection program where instruments with electronic memories and a signal output are utilized. Organizations employing this practice can strengthen the quality and efficiency of their data collection programs. The use of these practices can assist in complying with Quality Assurance requirements under ASME NQA-1, RCRA, CERCLA, and DOE Order activities. Pylon AB-5 instrumentation is typically configured to print data to a tape. The REM group has developed a process to electronically transfer the stored data. The data are sent from the Pylon AB-5 field instrument to a Hewlett-Packard portable handheld computer, model HP95LX. Data are recorded and stored on a 128 K-byte RAM card and later transferred to a PC database as an electronic file for analysis. The advantage of this system is twofold: (1) data entry errors are eliminated and (2) considerable data collection and entry time is saved. Checks can then be conducted on data validity between recorded intervals (due to light leaks, etc.) and for the detection of outliers. This paper discusses the interface and connector components that allow this transfer of data from the Pylon to the PC and the process to perform that activity.

  14. Pregnancy outcomes in women with mechanical prosthetic heart valves: a prospective descriptive population based study using the United Kingdom Obstetric Surveillance System (UKOSS) data collection system.

    Science.gov (United States)

    Vause, S; Clarke, B; Tower, C L; Hay, CRM; Knight, M

    2017-08-01

    To describe the incidence of mechanical prosthetic heart valves (MPHV) in pregnancy in the UK; rates of maternal and fetal complications in this group of women, and whether these vary with the anticoagulation used during pregnancy. Prospective descriptive population-based study. All consultant-led maternity units in the UK. All women with an MPHV who were pregnant between 1 February 2013 and 31 January 2015. Collection and analysis of anonymous data relating to pregnancy management and outcome, using the UKOSS notification and data collection system. Maternal death, serious maternal morbidity, poor fetal outcome. Data were obtained for 58 women giving an estimated incidence of 3.7 (95% CI 2.7-4.7) per 100 000 maternities. There were five maternal deaths (9%); a further 24 (41%) suffered serious maternal morbidity. There was a poor fetal outcome from 26 (47%) pregnancies. Only 16 (28%) women had a good maternal and good fetal outcome. Low-molecular-weight heparin (LMWH) was used throughout pregnancy by 71% of women. Of these, 83% required rapid dose escalation in the first trimester. Monitoring regimens lacked consistency. This study has estimated the incidence of MPHV in pregnant women in the UK. It includes the largest cohort managed with LMWH throughout pregnancy reported to date. It demonstrates a high rate of maternal death, and serious maternal and fetal morbidity. Women with MPHVs, and their clinicians need to appreciate the significant maternal and fetal risks involved in pregnancy. Care should be concentrated in specialist centres. High rates of poor maternal and fetal outcomes in pregnant women with mechanical prosthetic heart valves. © 2016 Royal College of Obstetricians and Gynaecologists.
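The headline figures in this abstract (58 cases, an incidence of 3.7 per 100 000 maternities, 95% CI 2.7 to 4.7) can be checked with back-of-the-envelope arithmetic using a normal approximation to the Poisson count. The denominator (total maternities) is inferred from the quoted rate, not stated in the abstract:

```python
import math

# Sanity check of the reported incidence and confidence interval.
# The number of maternities is back-calculated from the quoted rate.

cases = 58
rate_per_100k = 3.7
maternities = cases / rate_per_100k * 100_000          # ~1.57 million
half_width = 1.96 * math.sqrt(cases) / maternities * 100_000
ci = (rate_per_100k - half_width, rate_per_100k + half_width)
print(round(ci[0], 1), round(ci[1], 1))   # close to the reported 2.7-4.7
```

The approximate interval reproduces the published one to one decimal place, which suggests the authors used a similar normal (or Poisson) approximation.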

  15. Meeting Expanding Needs to Collect Food Intake Specificity: The Nutrition Data System for Research (NDS-R)

    Science.gov (United States)

    VanHeel, Nancy; Pettit, Janet; Rice, Barbara; Smith, Scott M.

    2003-01-01

    Food and nutrient databases are populated with data obtained from a variety of sources including USDA Reference Tables, scientific journals, food manufacturers and foreign food tables. The food and nutrient database maintained by the Nutrition Coordinating Center (NCC) at the University of Minnesota is continually updated with current nutrient data and continues to be expanded with additional nutrient fields to meet diverse research endeavors. Data are strictly evaluated for reliability and relevance before incorporation into the database; however, the values are obtained from various sources and food samples rather than from direct chemical analysis of specific foods. Precise nutrient values for specific foods are essential to the nutrition program at the National Aeronautics and Space Administration (NASA). Specific foods to be included in the menus of astronauts are chemically analyzed at the Johnson Space Center for selected nutrients. A request from NASA for a method to enter the chemically analyzed nutrient values for these space flight food items into the Nutrition Data System for Research (NDS-R) software resulted in modification of the database and interview system for use by NASA, with further modification to extend the method for related uses by more typical research studies.

  16. An automatic beam focusing system for MeV protons

    Science.gov (United States)

    Udalagama, C. N. B.; Bettiol, A. A.; van Kan, J. A.; Teo, E. J.; Breese, M. B. H.; Osipowicz, T.; Watt, F.

    2005-04-01

    An automatic focusing system for MeV protons has been developed. The focusing system utilises rapid real time proton induced secondary electron imaging of a calibration grid coupled with a modified Gaussian fit in order to take into account the enhanced secondary electron signal from the calibration grid edge. The focusing system has been successfully applied to MeV protons focused using a coupled triplet configuration of magnetic quadrupole lenses (Oxford triplet). Automatic beam focusing of a coarse beamspot of approximately (5 × 3.5) micrometres in the X and Y directions to a sub-micrometre beamspot of approximately (0.7 × 0.6) micrometres was achieved at a beam current of about 50 pA.
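The fitting step described here can be illustrated in miniature. The paper fits a modified Gaussian that accounts for edge-enhanced secondary-electron emission; the sketch below fits only a plain Gaussian, by coarse grid search rather than a proper optimizer, to an invented line profile, so it shows the shape of the procedure and nothing more:

```python
import math

# Sketch only: fit centre and width of a Gaussian to an invented
# secondary-electron line profile by brute-force grid search.
# The true profile (centre 0.3, width 0.5) is fabricated for the example.

xs = [i * 0.1 for i in range(-20, 21)]                  # scan positions, um
profile = [math.exp(-((x - 0.3) ** 2) / (2 * 0.5 ** 2)) for x in xs]

def fit_gaussian(xs, ys):
    best = (None, None, float("inf"))
    for c in [i * 0.05 for i in range(-10, 11)]:        # candidate centres
        for w in [0.3 + 0.05 * j for j in range(9)]:    # candidate widths
            sse = sum((y - math.exp(-((x - c) ** 2) / (2 * w * w))) ** 2
                      for x, y in zip(xs, ys))
            if sse < best[2]:
                best = (c, w, sse)
    return best

centre, width, _ = fit_gaussian(xs, profile)
```

In the automatic focusing loop, the fitted width is the figure of merit that the quadrupole currents are adjusted to minimize.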

  17. An automatic beam focusing system for MeV protons

    International Nuclear Information System (INIS)

    Udalagama, C.N.B.; Bettiol, A.A.; Kan, J.A. van; Teo, E.J.; Breese, M.B.H.; Osipowicz, T.; Watt, F.

    2005-01-01

    An automatic focusing system for MeV protons has been developed. The focusing system utilises rapid real time proton induced secondary electron imaging of a calibration grid coupled with a modified Gaussian fit in order to take into account the enhanced secondary electron signal from the calibration grid edge. The focusing system has been successfully applied to MeV protons focused using a coupled triplet configuration of magnetic quadrupole lenses (Oxford triplet). Automatic beam focusing of a coarse beamspot of approximately (5 × 3.5) micrometres in the X and Y directions to a sub-micrometre beamspot of approximately (0.7 × 0.6) micrometres was achieved at a beam current of about 50 pA

  18. Development of an Automatic Dispensing System for Traditional Chinese Herbs

    Directory of Open Access Journals (Sweden)

    Chi-Ying Lin

    2017-01-01

    Full Text Available The gathering of ingredients for decoctions of traditional Chinese herbs still relies on manual dispensation, due to the irregular shape of many items and inconsistencies in weights. In this study, we developed an automatic dispensing system for Chinese herbal decoctions with the aim of reducing manpower costs and the risk of mistakes. We employed machine vision in conjunction with a robot manipulator to facilitate the grasping of ingredients. The name and formulation of the decoction are input via a human-computer interface, and the dispensing of multiple medicine packets is performed automatically. An off-line least-squares curve-fitting method was used to calculate the amount of material grasped by the claws and thereby improve system efficiency as well as the accuracy of individual dosages. Experiments on the dispensing of actual ingredients demonstrate the feasibility of the proposed system.
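The off-line least-squares calibration mentioned above can be sketched as an ordinary linear fit mapping claw-closure setting to grasped weight, then inverted to choose a setting for a target dose. The calibration pairs and the assumption of a linear relation are invented for illustration:

```python
# Hedged sketch of least-squares calibration for the dispensing claws:
# fit weight = slope * setting + intercept from invented calibration
# runs, then invert the line to pick a setting for a target dose.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

closure = [10, 20, 30, 40]          # claw setting (arbitrary units)
weight = [2.1, 3.9, 6.1, 8.0]       # grams grasped in calibration runs

slope, intercept = fit_line(closure, weight)
target_grams = 5.0
setting = (target_grams - intercept) / slope   # setting for a 5 g dose
```

The paper's fit need not be linear; for irregular herbs a higher-order curve over the same least-squares machinery would be the natural extension.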

  19. 3D anthropometric data collection

    NARCIS (Netherlands)

    Daanen, H.A.M.; Nennie, F.A.; Rioux, Marc

    2007-01-01

    The first whole body scanners emerged in 1995. In 1999 a review of whole body scanning techniques and systems was presented (Daanen, H.A.M., Van de Water, G.J. Whole body scanners. Displays 19: 111-120). Now, eight years later, we will present an update of available systems including software and

  20. Truck Roll Stability Data Collection and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, SS

    2001-07-02

    The principal objective of this project was to collect and analyze vehicle and highway data that are relevant to the problem of truck rollover crashes, and in particular to the subset of rollover crashes that are caused by the driver error of entering a curve at a speed too great to allow safe completion of the turn. The data are of two sorts--vehicle dynamic performance data, and highway geometry data as revealed by vehicle behavior in normal driving. Vehicle dynamic performance data are relevant because the roll stability of a tractor trailer depends both on inherent physical characteristics of the vehicle and on the weight and distribution of the particular cargo that is being carried. Highway geometric data are relevant because the set of crashes of primary interest to this study are caused by lateral acceleration demand in a curve that exceeds the instantaneous roll stability of the vehicle. An analysis of data quality requires an evaluation of the equipment used to collect the data because the reliability and accuracy of both the equipment and the data could profoundly affect the safety of the driver and other highway users. Therefore, a concomitant objective was an evaluation of the performance of the set of data-collection equipment on the truck and trailer. The objective concerning evaluation of the equipment was accomplished, but the results were not entirely positive. Significant engineering apparently remains to be done before a reliable system can be fielded. Problems were identified with the trailer to tractor fiber optic connector used for this test. In an over-the-road environment, the communication between the trailer instrumentation and the tractor must be dependable. In addition, the computer in the truck must be able to withstand the rigors of the road. The major objective--data collection and analysis--was also accomplished. Using data collected by instruments on the truck, a "bad-curve" database can be generated. Using
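The physical criterion underlying the "bad-curve" idea, that a rollover is threatened when the lateral acceleration demand v²/r in a curve exceeds the vehicle's roll stability, reduces to a one-line speed limit. The static rollover threshold (SRT) and curve radius below are assumed example values, not project data:

```python
import math

# Illustration of the rollover criterion: a curve is "bad" for a given
# rig when the lateral acceleration demand v^2 / r exceeds the vehicle's
# static rollover threshold (SRT). SRT and radius are invented.

def max_safe_speed(radius_m, srt_g, g=9.81):
    """Speed (m/s) at which v^2 / r equals the rollover threshold."""
    return math.sqrt(srt_g * g * radius_m)

v = max_safe_speed(radius_m=150.0, srt_g=0.35)   # a heavily loaded rig
print(round(v * 3.6, 1), "km/h")
```

Because the SRT varies with cargo weight and height, the same curve can be safe for one load and a rollover risk for another, which is why the project pairs vehicle dynamic data with highway geometry.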

  1. Automatic Modelling of Rubble Mound Breakwaters from LIDAR Data

    Science.gov (United States)

    Bueno, M.; Díaz-Vilariño, L.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P.

    2015-08-01

    Rubble mound breakwaters maintenance is critical to the protection of beaches and ports. LiDAR systems provide accurate point clouds from the emerged part of the structure that can be modelled to make it more useful and easy to handle. This work introduces a methodology for the automatic modelling of breakwaters with armour units of cube shape. The algorithm is divided in three main steps: normal vector computation, plane segmentation, and cube reconstruction. Plane segmentation uses the normal orientation of the points and the edge length of the cube. Cube reconstruction uses the intersection of three perpendicular planes and the edge length. Three point clouds cropped from the main point cloud of the structure are used for the tests. The number of cubes detected is around 56 % for two of the point clouds and 32 % for the third one over the total physical cubes. Accuracy assessment is done by comparison with manually drawn cubes calculating the differences between the vertexes. It ranges between 6.4 cm and 15 cm. Computing time ranges between 578.5 s and 8018.2 s. The computing time increases with the number of cubes and the requirements of collision detection.
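The cube-reconstruction step above recovers each vertex as the intersection of three fitted planes. With each plane written as n · x = d, the vertex is the solution of a 3×3 linear system; a minimal stdlib solver via Cramer's rule is sketched below, with three invented mutually perpendicular planes standing in for fitted cube faces:

```python
# Sketch of vertex recovery: intersect three planes (normal . x = d)
# by solving the 3x3 system with Cramer's rule. The example planes are
# invented axis-aligned stand-ins for fitted cube faces.

def det3(m):
    a, b, c = m
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def plane_intersection(planes):
    """planes: list of (normal, d) with normal . x = d; returns (x, y, z)."""
    A = [n for n, _ in planes]
    d = [dd for _, dd in planes]
    D = det3(A)                      # nonzero for non-parallel planes
    point = []
    for i in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][i] = d[r]           # replace column i with the d vector
        point.append(det3(M) / D)
    return tuple(point)

vertex = plane_intersection([([1, 0, 0], 2.0),
                             ([0, 1, 0], 3.0),
                             ([0, 0, 1], 1.5)])
print(vertex)   # (2.0, 3.0, 1.5)
```

On real LiDAR data the three normals come from the segmentation step and are only approximately perpendicular, so the known edge length is used to validate each reconstructed vertex.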

  2. Assessment of retrofit automatic vent dampers for residential heating systems

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, D.L.; Wilson, R.P. Jr.; Ashley, L.E.; Butterfield, J.F.

    1977-11-01

Automatic vent dampers are devices installed in the exhaust vent of a central heating system which block the flow of warm air up the chimney from the dwelling space and from within the furnace when the heating system is not operating. An investigation of the effect of thermally actuated or electrically actuated dampers on home energy conservation, their cost, and safety is described. Eleven heating system types in 2 geographic regions were used in this study. It was determined that good quality, safe electrically actuated dampers are available in the U.S. and that thermally actuated units will be available soon; an average savings of approximately 8% in home heating cost could be achieved by using automatic dampers with suitable furnace systems in regions with a heating season of more than 4000 degree-days; the cost of the automatic dampers is from $65 to $140 with a payback period of 3 to 4 1/2 y; and, with the average heating system, vent damper retrofit alone is not as attractive an energy conservation option as the combined retrofit of a vent damper, an intermittent ignition device, and a reduced gas orifice. (LCL)

  3. Automatic Indoor Building Reconstruction from Mobile Laser Scanning Data

    Science.gov (United States)

    Xie, L.; Wang, R.

    2017-09-01

Indoor reconstruction from point clouds is a hot topic in photogrammetry, computer vision and computer graphics. Reconstructing indoor scenes from point clouds is challenging due to complex room floorplans and line-of-sight occlusions. Most existing methods deal with stationary terrestrial laser scanning point clouds or RGB-D point clouds. In this paper, we propose an automatic method for reconstructing indoor 3D building models from mobile laser scanning point clouds. The method includes 2D floorplan generation, 3D building modeling, door detection and room segmentation. The main idea behind our approach is to separate the wall structure into two different types, the inner wall and the outer wall, based on the observed point distribution. We then use a graph-cut based optimization method to solve the labeling problem and generate the 2D floorplan from the optimization result. Subsequently, we leverage an α-shape based method to detect the doors on the 2D projected point clouds and use the floorplan to segment the individual rooms. The experiments show that this door detection method achieves a recognition rate of 97% and that the room segmentation method attains correct segmentation results. We also evaluate the reconstruction accuracy on synthetic data, which indicates that the accuracy of our method is comparable to the state-of-the-art.
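The first stage, 2D floorplan generation, starts from a projection of the 3D points onto the ground plane. The sketch below is a minimal illustration of that projection step only (it is not the paper's graph-cut labeling): points are binned into a 2D occupancy grid, and cells hit many times, i.e. vertical structures such as walls, are kept as floorplan candidates. The cell size and hit threshold are assumed parameters.

```python
def occupancy_grid(points, cell=0.1):
    """Project 3D points onto the XY plane and count hits per grid cell."""
    grid = {}
    for x, y, _z in points:
        key = (int(x // cell), int(y // cell))
        grid[key] = grid.get(key, 0) + 1
    return grid

def wall_cells(grid, min_hits=3):
    """Cells hit at least min_hits times are treated as candidate wall cells:
    a vertical wall stacks many points over the same ground-plane cell."""
    return {k for k, v in grid.items() if v >= min_hits}
```

In the full pipeline, these candidate cells would then be labeled as inner or outer wall and regularized by the graph-cut optimization before door detection.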

  4. Automatic detection of interictal spikes using data mining models.

    Science.gov (United States)

    Valenti, Pablo; Cazamajou, Enrique; Scarpettini, Marcelo; Aizemberg, Ariel; Silva, Walter; Kochen, Silvia

    2006-01-15

For a prospective candidate for epilepsy surgery, both ictal and interictal spikes (IS) are studied to determine the localization of the epileptogenic zone. In this work, data mining (DM) classification techniques were used to build an automatic detection model. The selected DM algorithms are Decision Trees (J4.8) and a Statistical Bayesian Classifier (naïve model). The main objective was the detection of IS, isolating them from the EEG's background activity. DM also has an attractive advantage in such applications: recognizing epileptic discharges does not require a clear definition of spike morphology, and previously 'unseen' patterns can be recognized by the DM given proper 'training'. The results obtained showed that the efficacy of the selected DM algorithms is comparable to the current visual analysis used by the experts. Moreover, the DM analysis is faster than visual analysis of the EEG. This tool can therefore assist the experts by facilitating the analysis of a patient's information and reducing the time and effort required in the process.
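A naïve Bayes classifier of the kind mentioned above can be implemented compactly. The sketch below is a generic Gaussian naïve Bayes, not the authors' model, and the feature choice (e.g. amplitude and sharpness of an EEG segment) is a hypothetical example: each class is summarized by per-feature means and variances, and a segment is assigned to the class with the highest log-posterior.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes: per-class feature means/variances
    plus class priors, combined under the independence assumption."""

    def fit(self, X, y):
        by_class = defaultdict(list)
        for row, label in zip(X, y):
            by_class[label].append(row)
        self.stats, n = {}, len(X)
        for label, rows in by_class.items():
            means = [sum(col) / len(rows) for col in zip(*rows)]
            vars_ = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)
                     for col, m in zip(zip(*rows), means)]
            self.stats[label] = (means, vars_, len(rows) / n)
        return self

    def predict(self, row):
        def log_posterior(label):
            means, vars_, prior = self.stats[label]
            ll = math.log(prior)
            for v, m, s2 in zip(row, means, vars_):
                ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
            return ll
        return max(self.stats, key=log_posterior)
```

Trained on labeled spike/background feature vectors, the classifier then scores unseen EEG segments without any explicit spike-morphology rule, which is the advantage the abstract points to.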

  5. Tritium monitor and collection system

    Science.gov (United States)

    Bourne, G.L.; Meikrantz, D.H.; Ely, W.E.; Tuggle, D.G.; Grafwallner, E.G.; Wickham, K.L.; Maltrud, H.R.; Baker, J.D.

    1992-01-14

    This system measures tritium on-line and collects tritium from a flowing inert gas stream. It separates the tritium from other non-hydrogen isotope contaminating gases, whether radioactive or not. The collecting portion of the system is constructed of various zirconium alloys called getters. These alloys adsorb tritium in any of its forms at one temperature and at a higher temperature release it as a gas. The system consists of four on-line getters and heaters, two ion chamber detectors, two collection getters, and two guard getters. When the incoming gas stream is valved through the on-line getters, 99.9% of it is adsorbed and the remainder continues to the guard getter where traces of tritium not collected earlier are adsorbed. The inert gas stream then exits the system to the decay chamber. Once the on-line getter has collected tritium for a predetermined time, it is valved off and the next on-line getter is valved on. Simultaneously, the first getter is heated and a pure helium purge is employed to carry the tritium from the getter. The tritium loaded gas stream is then routed through an ion chamber which measures the tritium activity. The ion chamber effluent passes through a collection getter that readsorbs the tritium and is removable from the system once it is loaded and is then replaced with a clean getter. Prior to removal of the collection getter, the system switches to a parallel collection getter. The effluent from the collection getter passes through a guard getter to remove traces of tritium prior to exiting the system. The tritium loaded collection getter, once removed, is analyzed by liquid scintillation techniques. The entire sequence is under computer control except for the removal and analysis of the collection getter. 7 figs.
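The alternating duty cycle of the on-line getters described above (one collects while the other is heated and purged) is essentially a small state machine. The simulation below is an illustrative sketch only; tick counts and getter names are invented, and the real system also sequences the ion chambers, guard getters and helium purge under computer control.

```python
def getter_schedule(n_ticks, collect_ticks):
    """Simulate the alternating on-line getter duty cycle: two getters,
    'A' and 'B', where one collects tritium while the other is heated
    and purged. Returns (tick, collecting, regenerating) tuples."""
    schedule = []
    collecting, regenerating = 'A', 'B'
    for tick in range(n_ticks):
        if tick > 0 and tick % collect_ticks == 0:
            # Predetermined collection time elapsed: valve over to the
            # other getter and start regenerating the loaded one.
            collecting, regenerating = regenerating, collecting
        schedule.append((tick, collecting, regenerating))
    return schedule
```

The swap at every `collect_ticks` boundary mirrors the moment the loaded getter is valved off, heated, and purged with helium toward the ion chamber.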

  6. Midterm Report on Data Collection

    DEFF Research Database (Denmark)

    Gelsing, Lars; Linde, Lisbeth Tved

    In the MERIPA project this report concerns data availability in order to make future cluster and network analyses in the MERIPA regions. At the same time discussions about methodology are being started....

  7. Science data collection with polarimetric SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Woelders, Kim; Madsen, Søren Nørvang

    1996-01-01

    Discusses examples on the use of polarimetric SAR in a number of Earth science studies. The studies are presently being conducted by the Danish Center for Remote Sensing. A few studies of the European Space Agency's EMAC programme are also discussed. The Earth science objectives are presented......, and the potential of polarimetric SAR is discussed and illustrated with data collected by the Danish airborne EMISAR system during a number of experiments in 1994 and 1995. The presentation will include samples of data acquired for the different studies...

  8. On Learning from Collective Data

    Science.gov (United States)

    2013-12-01

[Figure-caption fragments from the report's list of figures: BPMF on the full Netflix data, where the RMSE of Bayesian methods drops monotonically as the number of samples increases, compared against the Netflix baseline; accuracy increases when more factors are used, with no over-fitting observed, and BPTF with 20 factors achieves performance similar to BPMF with 100 factors; Fig. 2.6, RMSE of PMF, BPMF, and BPTF on a subset of Netflix.]

  9. Research about an automatic timing count system based on LabView

    International Nuclear Information System (INIS)

    Yan Jie; Liu Rong; Jian Li; Lu Xinxin; Zhu Tonghua; Wang Mei; Wen Zhongwei; Lin Jufang; Li Cheng

    2009-01-01

Based on the LabView virtual instrument development platform and the GPIB instrument control and data transmission bus protocol, this paper presents the design and development of a virtual instrument for an automatic timing count system using an ORTEC 974 Counter/Timer. Compared with the real instrument, the virtual instrument system enriches the timing count function and provides remote control of the real instrument. The counts and measured time are recorded automatically during the measurement process for further analysis and processing. (authors)
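The acquisition loop such a system runs can be sketched independently of the hardware. In the fragment below, `MockCounter` is a hypothetical stand-in for the GPIB-connected ORTEC 974 (in practice the instrument would be driven through a VISA-style library from LabView or Python); the loop itself shows the automatic record-per-cycle behavior the abstract describes.

```python
import time

class MockCounter:
    """Hypothetical stand-in for the GPIB-connected counter: each read
    pretends 100 new events have been accumulated."""
    def __init__(self):
        self._counts = 0
    def read_counts(self):
        self._counts += 100
        return self._counts

def timed_acquisition(counter, cycles, dwell_s=0.0):
    """Automatically record (cycle, counts) pairs: wait for the dwell
    time, read the counter, and log the result for later analysis."""
    log = []
    for cycle in range(cycles):
        time.sleep(dwell_s)
        log.append((cycle, counter.read_counts()))
    return log
```

Swapping `MockCounter` for a real instrument driver leaves the logging loop unchanged, which is what makes the remote, unattended operation possible.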

  10. 46 CFR 161.002-8 - Automatic fire detecting systems, general requirements.

    Science.gov (United States)

    2010-10-01

46 CFR Shipping § 161.002-8 Automatic fire detecting systems, general requirements... Systems § 161.002-8 Automatic fire detecting systems, general requirements. (a) General. An automatic fire... combined with other power failure alarm systems when specifically approved. (b) [Reserved] [21 FR 9032, Nov...

  11. Automatic Number Plate Recognition System for IPhone Devices

    Directory of Open Access Journals (Sweden)

    Călin Enăchescu

    2013-06-01

This paper presents a system for automatic number plate recognition, implemented for devices running the iOS operating system. The methods used for number plate recognition are based on existing methods, but optimized for devices with low hardware resources. We divided the task of automatic number plate recognition into the following subtasks: image acquisition, localization of the number plate position in the image, and character detection. The first subtask is performed by the camera of an iPhone, and the second is done using image pre-processing methods and template matching. For the character recognition we use a feed-forward artificial neural network. Each of these methods is presented along with its results.
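The template-matching step used for plate localization can be illustrated in miniature. The sketch below is not the paper's implementation: it uses a sum-of-absolute-differences score (the paper does not specify its matching score) over 2D lists of grayscale values, returning the best-matching top-left position.

```python
def match_template(image, template):
    """Slide template over image (2D lists of grayscale values) and
    return the top-left (row, col) with the smallest sum of absolute
    differences -- the position where the template fits best."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float('inf'), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if sad < best:
                best, best_pos = sad, (r, c)
    return best_pos
```

On a phone, the same search would run over a pre-processed (binarized, downscaled) frame to keep it within the device's limited hardware resources.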

  12. Automatic road traffic safety management system in urban areas

    Directory of Open Access Journals (Sweden)

    Oskarbski Jacek

    2017-01-01

Traffic incidents and accidents reduce the reliability and safety of the transport system. Traffic management and emergency systems on the road, which use automatic detection, video surveillance, communication technologies and institutional solutions, improve the organization of the work of the various departments involved in traffic and safety management. Automation of incident management helps to reduce the time of a rescue operation as well as the time needed to normalize the flow of traffic after the rescue operation is completed, which also reduces the risk of secondary accidents and contributes to reducing their severity. The paper presents the possibility of including city traffic departments in the process of incident management. The results of research on automatic incident detection in cities are also presented.

  13. Design of electric control system for automatic vegetable bundling machine

    Science.gov (United States)

    Bao, Yan

    2017-06-01

This paper presents the design of an electric control system for an automatic vegetable bundling machine that meets the structural requirements of automatic bundling with a simple, low-cost circuit that is easy to extend. The machine uses sensors for detection and control to meet the control requirements; the binding force can be adjusted with buttons, and the strapping speed is set with keys. The sensors connect directly to the machine wiring for convenient operation, and the system plugs into a standard 220 V supply. During operation the sensors transmit signals to a microcontroller (MCU), which runs the control program and drives a small motor. The working principles of the LED control circuit and the temperature control circuit are also described.

  14. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed the sample exchange robot PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) of the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are being developed as part of the TPRP.

  15. Development of an automatic characterisation system for silicon detectors

    Science.gov (United States)

    Hacker, J.; Bergauer, T.; Krammer, M.; Wedenig, R.

    2002-06-01

The CMS experiment will be equipped with the largest silicon tracker in the world. The tracker will consist of about 25,000 silicon sensors which will cover an area of more than 200 m². Four quality test centres will carry out various checks on a representative sample of sensors to assure a homogeneous quality throughout the 2 1/2 years of production. One of these centres is based in Vienna. To cope with the large number of sensors a fast and fully automatic characterisation system has been realised. We developed the software in LabView and built a cost-efficient probe station in house by assembling individual components and commercial instruments. Both the global properties of a sensor and the characteristic quantities of the individual strips can be measured. The measured data are immediately analysed and sent to a central database. The mechanical and electrical set-up will be explained and results from CMS prototype sensors are presented.
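Immediate analysis of per-strip measurements typically means flagging outlier strips automatically. The fragment below is an illustrative sketch, not the Vienna centre's actual analysis: it flags strips whose leakage current deviates from the median by more than a multiple of the median absolute deviation, a robust criterion; the threshold `k` is an assumed parameter.

```python
import statistics

def flag_bad_strips(leakage_currents, k=5.0):
    """Flag strip indices whose leakage current deviates from the
    median by more than k times the median absolute deviation (MAD),
    a robust outlier test that a single bad strip cannot skew."""
    med = statistics.median(leakage_currents)
    mad = statistics.median(abs(v - med) for v in leakage_currents) or 1e-12
    return [i for i, v in enumerate(leakage_currents)
            if abs(v - med) / mad > k]
```

Flagged strip indices would then accompany the sensor's global properties into the central database for quality tracking across the production period.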

  16. Development of an automatic characterisation system for silicon detectors

    CERN Document Server

    Hacker, J; Krammer, M; Wedenig, R

    2002-01-01

The CMS experiment will be equipped with the largest silicon tracker in the world. The tracker will consist of about 25,000 silicon sensors which will cover an area of more than 200 m². Four quality test centres will carry out various checks on a representative sample of sensors to assure a homogeneous quality throughout the 2 1/2 years of production. One of these centres is based in Vienna. To cope with the large number of sensors a fast and fully automatic characterisation system has been realised. We developed the software in LabView and built a cost-efficient probe station in house by assembling individual components and commercial instruments. Both the global properties of a sensor and the characteristic quantities of the individual strips can be measured. The measured data are immediately analysed and sent to a central database. The mechanical and electrical set-up will be explained and results from CMS prototype sensors are presented.

  17. An Automatic Car Counting System Using OverFeat Framework

    OpenAIRE

    Biswas, Debojit; Su, Hongbo; Wang, Chengyi; Blankenship, Jason; Stevanovic, Aleksandar

    2017-01-01

    Automatic car counting is an important component in the automated traffic system. Car counting is very important to understand the traffic load and optimize the traffic signals. In this paper, we implemented the Gaussian Background Subtraction Method and OverFeat Framework to count cars. OverFeat Framework is a combination of Convolution Neural Network (CNN) and one machine learning classifier (like Support Vector Machines (SVM) or Logistic Regression). With this study, we showed another poss...
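Before the CNN stage, background subtraction isolates moving vehicles from the static scene. The sketch below is a simplified stand-in for the Gaussian background subtraction named above: a running-average background model over grayscale frames (represented here as flat lists of equal length, an assumption for brevity), with per-pixel foreground masks. The learning rate and threshold are assumed parameters.

```python
def background_subtract(frames, alpha=0.5, thresh=10):
    """Running-average background model: a pixel is foreground (1) when
    it differs from the current background estimate by more than thresh;
    the background is then blended toward the new frame."""
    bg = [float(v) for v in frames[0]]
    masks = []
    for frame in frames:
        mask = [1 if abs(v - b) > thresh else 0 for v, b in zip(frame, bg)]
        bg = [(1 - alpha) * b + alpha * v for v, b in zip(frame, bg)]
        masks.append(mask)
    return masks
```

In the full pipeline, connected foreground regions would be cropped and passed to the OverFeat-style CNN plus classifier for counting.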

  18. Automatic assessment of functional health decline in older adults based on smart home data.

    Science.gov (United States)

    Aramendi, Ane Alberdi; Weakley, Alyssa; Goenaga, Asier Aztiria; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2018-03-15

In the context of an aging population, tools to help the elderly live independently must be developed. The goal of this paper is to evaluate the possibility of using unobtrusively collected, activity-aware smart home behavioral data to automatically detect one of the most common consequences of aging: functional health decline. After gathering the longitudinal smart home data of 29 older adults for an average of more than 2 years, we automatically labeled the data with corresponding activity classes and extracted time-series statistics containing 10 behavioral features. Using this data, we created regression models to predict absolute and standardized functional health scores, as well as classification models to detect reliable absolute change and positive and negative fluctuations in everyday functioning. Functional health was assessed every six months by means of the Instrumental Activities of Daily Living-Compensation (IADL-C) scale. Results show that the total IADL-C score and subscores can be predicted by means of activity-aware smart home data, as can a reliable change in these scores. Positive and negative fluctuations in everyday functioning are harder to detect using in-home behavioral data, yet changes in social skills have been shown to be predictable. Future work must focus on improving the sensitivity of the presented models and performing an in-depth feature selection to improve overall accuracy. Copyright © 2018. Published by Elsevier Inc.
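The regression models mentioned above can be illustrated with a toy ordinary-least-squares fit of a single behavioral feature against a health score. This is a sketch only: the paper's models use 10 features, and the feature and score values below are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)
    minimizing squared error between slope*x + intercept and y."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict(model, x):
    """Predict a score from a feature value with a fitted (slope, intercept)."""
    slope, intercept = model
    return slope * x + intercept
```

With multiple behavioral features the same idea extends to multivariate regression, and thresholding the predicted score change gives the reliable-change classifier described in the abstract.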

  19. Automatic optical inspection system design for golf ball

    Science.gov (United States)

    Wu, Hsien-Huang; Su, Jyun-Wei; Chen, Chih-Lin

    2016-09-01

With the growing popularity of golf all over the world, the quantities of relevant products are increasing year by year. To create innovation and improve quality while reducing production cost, automation of manufacturing becomes a necessary and important issue. This paper reflects this trend toward production automation. It uses AOI (Automated Optical Inspection) technology to develop a system which can automatically detect defects on a golf ball. The current manual quality inspection is not only error-prone but also very manpower-demanding. Taking into consideration the competition in this industry in the near future, the development of related AOI equipment must be conducted as soon as possible. Due to the strongly reflective property of the ball surface, as well as its surface dimples and subtle flaws, it is very difficult to take images of sufficient quality for automatic inspection. Based on the surface properties and shape of the ball, the lighting of the image-taking environment and structure has been properly designed. Area-scan cameras have been used to acquire images with good contrast between defects and background to assure achievement of the goal of automatic defect detection on the golf ball. The result obtained is that more than 97% of the NG balls are detected, and the system maintains a false alarm rate of less than 10%. The balls which are determined by the system to be NG are inspected by the human eye again. Therefore, the manpower spent on inspection has been reduced by 90%.

  20. Modeling and Prototyping of Automatic Clutch System for Light Vehicles

    Science.gov (United States)

    Murali, S.; Jothi Prakash, V. M.; Vishal, S.

    2017-03-01

Nowadays, recycling or regenerating waste into something useful is appreciated all around the globe, as it reduces the greenhouse gas emissions that contribute to global climate change. This study deals with providing an automatic clutch mechanism in vehicles to facilitate the smooth changing of gears. It proposes using the exhaust gases which are normally expelled as waste from the turbocharger to actuate the clutch mechanism. At present, clutches in four-wheelers are operated automatically by using an air compressor. In this study, a conceptual design is proposed in which the clutch is operated by the exhaust gas from the turbocharger, removing the air compressor from the existing system. With this system, the air compressor is eliminated and riders need not operate the clutch manually. This work involved the development, analysis, and validation of the conceptual design through simulation software. The developed conceptual design of an automatic pneumatic clutch system was then tested with a prototype.