WorldWideScience

Sample records for ground-based automated separation

  1. Function Allocation between Automation and Human Pilot for Airborne Separation Assurance

    Science.gov (United States)

    Idris, Husni; Enea, Gabriele; Lewis, Timothy A.

    2016-01-01

    Maintaining safe separation between aircraft is a key determinant of the airspace capacity to handle air transportation. With the advent of satellite-based surveillance, aircraft equipped with the needed technologies are now capable of maintaining awareness of their location in the airspace and sharing it with their surrounding traffic. As a result, concepts and cockpit automation are emerging to enable delegating the responsibility of maintaining safe separation from traffic to the pilot, thus increasing the airspace capacity by alleviating the limitation of the current non-scalable centralized ground-based system. In this paper, an analysis of allocating separation assurance functions to the human pilot and cockpit automation is presented to support the design of these concepts and technologies. A task analysis was conducted with the help of Petri nets to identify the main separation assurance functions and their interactions. Each function was characterized by three behavior levels that may be needed to perform the task: skill-based, rule-based, and knowledge-based. Recommendations are then made for allocating each function to an automation scale, based on its behavior-level characterization and with the help of subject matter experts.
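    The abstract mentions a Petri-net task analysis but gives no model details. As a rough illustration of the formalism only, here is a minimal net with the standard firing rule; the place and transition names are invented for illustration and are not taken from the paper's task analysis.

```python
class PetriNet:
    """Minimal Petri net: places hold tokens; a transition is enabled when
    every input place holds at least one token, and firing it consumes one
    token from each input place and adds one to each output place."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical fragment of a separation-assurance task flow.
net = PetriNet({"conflict_detected": 1})
net.add_transition("plan_resolution", ["conflict_detected"], ["resolution_planned"])
net.add_transition("execute_resolution", ["resolution_planned"], ["conflict_cleared"])
```

    Tracing which transitions are enabled in a net like this is one way such an analysis exposes the interactions and handoffs between separation assurance functions.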

  2. Transitioning Resolution Responsibility between the Controller and Automation Team in Simulated NextGen Separation Assurance

    Science.gov (United States)

    Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Mercer, J.; Prevot, T.

    2013-01-01

    As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. Measurements were recorded that firstly verified the continued operational validity of this iteration of the ground-based functional allocation automation concept in forecast traffic densities up to 2x that of current day high altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles between the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust/use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.

  3. Pilot and Controller Evaluations of Separation Function Allocation in Air Traffic Management

    Science.gov (United States)

    Wing, David; Prevot, Thomas; Morey, Susan; Lewis, Timothy; Martin, Lynne; Johnson, Sally; Cabrall, Christopher; Como, Sean; Homola, Jeffrey; Sheth-Chandra, Manasi

    2013-01-01

    Two human-in-the-loop simulation experiments were conducted in coordinated fashion to investigate the allocation of separation assurance functions between ground and air and between humans and automation. The experiments modeled a mixed-operations concept in which aircraft receiving ground-based separation services shared the airspace with aircraft providing their own separation service (i.e., self-separation). Ground-based separation was provided by air traffic controllers without automation tools, with tools, or by ground-based automation with controllers in a managing role. Airborne self-separation was provided by airline pilots using self-separation automation enabled by airborne surveillance technology. The two experiments, one pilot-focused and the other controller-focused, addressed selected key issues of mixed operations, assuming the starting point of current-day operations and modeling an emergence of NextGen technologies and procedures. In the controller-focused experiment, the impact of mixed operations on controller performance was assessed at four stages of NextGen implementation. In the pilot-focused experiment, the limits to which pilots with automation tools could take full responsibility for separation from ground-controlled aircraft were tested. Results indicate that the presence of self-separating aircraft had little impact on the controllers' ability to provide separation services for ground-controlled aircraft. Overall performance was best in the most automated environment in which all aircraft were data communications equipped, ground-based separation was highly automated, and self-separating aircraft had access to trajectory intent information for all aircraft. In this environment, safe, efficient, and highly acceptable operations could be achieved for twice today's peak airspace throughput. In less automated environments, reduced trajectory intent exchange and manual air traffic control limited the safely achievable airspace throughput and …

  4. Automation of column-based radiochemical separations. A comparison of fluidic, robotic, and hybrid architectures

    Energy Technology Data Exchange (ETDEWEB)

    Grate, J.W.; O'Hara, M.J.; Farawila, A.F.; Ozanich, R.M.; Owsley, S.L. [Pacific Northwest National Laboratory, Richland, WA (United States)]

    2011-07-01

    Two automated systems have been developed to perform column-based radiochemical separation procedures. These new systems are compared with past fluidic column separation architectures, with emphasis on using disposable components so that no sample contacts any surface that any other sample has contacted, and setting up samples and columns in parallel for subsequent automated processing. In the first new approach, a general purpose liquid handling robot has been modified and programmed to perform anion exchange separations using 2 mL bed columns in 6 mL plastic disposable column bodies. In the second new approach, a fluidic system has been developed to deliver clean reagents through disposable manual valves to six disposable columns, with a mechanized fraction collector that positions one of four rows of six vials below the columns. The samples are delivered to each column via a manual 3-port disposable valve from disposable syringes. This second approach, a hybrid of fluidic and mechanized components, is a simpler, more efficient approach for performing anion exchange procedures for the recovery and purification of plutonium from samples. The automation architectures described can also be adapted to column-based extraction chromatography separations. (orig.)

  5. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    Science.gov (United States)

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction: A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to): monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early-warning system monitoring ground-water quality near public water-supply wells.

  6. Development of a relatively cheap and simple automated separation system for a routine separation procedure based on extraction chromatography

    International Nuclear Information System (INIS)

    Petro Zoriy; Reinhold Flucht; Mechthild Burow; Peter Ostapczuk; Reinhard Lennartz; Myroslav Zoriy

    2010-01-01

    A robust analytical method has been developed in our laboratory for the separation of radionuclides by means of extraction chromatography using an automated separation system. The proposed method is both cheap and simple and provides rapid, accurate separation of the element of interest. The automated separation system enables a shorter separation time by maintaining a constant flow rate of solution and by avoiding clogging or bubbling in the chromatographic column. The present separation method was tested with two types of samples (water and urine) using UTEVA-, TRU- and Sr-specific resins for the separation of U, Th, Am, Pu and Sr. The total separation time for one radionuclide ranged from 60 to 100 min, with the separation yield ranging from 68 to 98% depending on the elements separated. We used ICP-QMS, a multi-low-level counter and alpha spectroscopy to measure the corresponding elements. (author)

  7. Novel automated blood separations validate whole cell biomarkers.

    Directory of Open Access Journals (Sweden)

    Douglas E Burger

    Full Text Available Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials.

  8. Novel automated blood separations validate whole cell biomarkers.

    Science.gov (United States)

    Burger, Douglas E; Wang, Limei; Ban, Liqin; Okubo, Yoshiaki; Kühtreiber, Willem M; Leichliter, Ashley K; Faustman, Denise L

    2011-01-01

    Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials.

  9. Evaluation of High Density Air Traffic Operations with Automation for Separation Assurance, Weather Avoidance and Schedule Conformance

    Science.gov (United States)

    Prevot, Thomas; Mercer, Joey S.; Martin, Lynne Hazel; Homola, Jeffrey R.; Cabrall, Christopher D.; Brasil, Connie L.

    2011-01-01

    In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 have shown very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvement: short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows, and the combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings, technologies and procedures have been improved, and the operations are being re-evaluated with the same scenarios. In this paper we will first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation will be reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements will be discussed that were made to address identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in the en route airspace.

  10. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    Science.gov (United States)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm-shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors, and during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time-frames ranged from near current-day operations to nearly fully-automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper will describe in detail the simulation conducted, as well as discuss important results and their implications.

  11. Entropy-based automated classification of independent components separated from fMCG

    International Nuclear Information System (INIS)

    Comani, S; Srinivasan, V; Alleva, G; Romani, G L

    2007-01-01

    Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of the fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with the gestational age ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p < 0.01. The system performances were compared with those of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall IC detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system. (note)
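    For readers unfamiliar with the entropy estimators named above, the following is a minimal, brute-force sketch of sample entropy (SampEn); the defaults m = 2 and r = 0.2 × standard deviation are common conventions, not necessarily the parameters used in the paper, and the O(n²) loop is meant only to show the definition.

```python
import math

def sample_entropy(series, m=2, r=None):
    """SampEn = -ln(A/B), where B counts template pairs of length m within
    tolerance r (Chebyshev distance) and A counts pairs of length m + 1.
    Self-matches are excluded by pairing only i < j."""
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def matches(length):
        templates = [series[i:i + length] for i in range(n - m)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

    A perfectly periodic series yields SampEn near zero (highly predictable), while irregular signals score higher, which is what makes the measure useful for telling stable cardiac components from noise.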

  12. Artificial intelligence costs, benefits, risks for selected spacecraft ground system automation scenarios

    Science.gov (United States)

    Truszkowski, Walter F.; Silverman, Barry G.; Kahn, Martha; Hexmoor, Henry

    1988-01-01

    In response to a number of high-level strategy studies in the early 1980s, expert systems and artificial intelligence (AI/ES) efforts for spacecraft ground systems have proliferated in the past several years primarily as individual small to medium scale applications. It is useful to stop and assess the impact of this technology in view of lessons learned to date, and hopefully, to determine if the overall strategies of some of the earlier studies both are being followed and still seem relevant. To achieve that end four idealized ground system automation scenarios and their attendant AI architecture are postulated and benefits, risks, and lessons learned are examined and compared. These architectures encompass: (1) no AI (baseline), (2) standalone expert systems, (3) standardized, reusable knowledge base management systems (KBMS), and (4) a futuristic unattended automation scenario. The resulting artificial intelligence lessons learned, benefits, and risks for spacecraft ground system automation scenarios are described.

  13. The automated ground network system

    Science.gov (United States)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  14. Achieving Lights-Out Operation of SMAP Using Ground Data System Automation

    Science.gov (United States)

    Sanders, Antonio

    2013-01-01

    The approach used in the SMAP ground data system to provide reliable, automated capabilities to conduct unattended operations has been presented. The impacts of automation on the ground data system architecture were discussed, including the three major automation patterns identified for SMAP and how these patterns address the operations use cases. The architecture and approaches used by SMAP will set the baseline for future JPL Earth Science missions.

  15. Artificial intelligence costs, benefits, and risks for selected spacecraft ground system automation scenarios

    Science.gov (United States)

    Truszkowski, Walter F.; Silverman, Barry G.; Kahn, Martha; Hexmoor, Henry

    1988-01-01

    In response to a number of high-level strategy studies in the early 1980s, expert systems and artificial intelligence (AI/ES) efforts for spacecraft ground systems have proliferated in the past several years primarily as individual small to medium scale applications. It is useful to stop and assess the impact of this technology in view of lessons learned to date, and hopefully, to determine if the overall strategies of some of the earlier studies both are being followed and still seem relevant. To achieve that end four idealized ground system automation scenarios and their attendant AI architecture are postulated and benefits, risks, and lessons learned are examined and compared. These architectures encompass: (1) no AI (baseline); (2) standalone expert systems; (3) standardized, reusable knowledge base management systems (KBMS); and (4) a futuristic unattended automation scenario. The resulting artificial intelligence lessons learned, benefits, and risks for spacecraft ground system automation scenarios are described.

  16. A Near-Term Concept for Trajectory Based Operations with Air/Ground Data Link Communication

    Science.gov (United States)

    McNally, David; Mueller, Eric; Thipphavong, David; Paielli, Russell; Cheng, Jinn-Hwei; Lee, Chuhan; Sahlman, Scott; Walton, Joe

    2010-01-01

    An operating concept and required system components for trajectory-based operations with air/ground data link for today's en route and transition airspace is proposed. Controllers are fully responsible for separation as they are today, and no new aircraft equipage is required. Trajectory automation computes integrated solutions to problems like metering, weather avoidance, traffic conflicts and the desire to find and fly more time/fuel-efficient flight trajectories. A common ground-based system supports all levels of aircraft equipage and performance, including those equipped and not equipped for data link. User interface functions for the radar controller's display make trajectory-based clearance advisories easy to visualize, modify if necessary, and implement. Laboratory simulations (without human operators) were conducted to test integrated operation of selected system components with uncertainty modeling. Results are based on 102 hours of Fort Worth Center traffic recordings involving over 37,000 individual flights. The presence of uncertainty had a marginal effect (5%) on minimum-delay conflict resolution performance, and wind-favorable routes had no effect on detection and resolution metrics. Flight plan amendments and clearances were substantially reduced compared to today's operations. Top-of-descent prediction errors are the largest cause of failure, indicating that better descent predictions are needed to reliably achieve fuel-efficient descent profiles in medium to heavy traffic. Improved conflict detection for climbing flights could enable substantially more continuous climbs to cruise altitude. Unlike today's Conflict Alert, tactical automation must alert when an altitude amendment is entered, but before the aircraft starts the maneuver. In every other failure case tactical automation prevented losses of separation. A real-time prototype trajectory-automation system is running now and could be made ready for operational testing at an en route …
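    Several of the records above hinge on automated conflict detection between predicted trajectories. As a hypothetical, greatly simplified sketch of the core geometric check (horizontal-only, on time-aligned trajectory points; not any of these systems' actual algorithms), where the 5 NM default reflects the common en-route horizontal separation standard:

```python
import math

def min_separation(traj_a, traj_b):
    """Minimum horizontal distance between two time-aligned predicted
    trajectories, each a list of (x, y) positions at common time steps."""
    return min(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in zip(traj_a, traj_b))

def detect_conflict(traj_a, traj_b, sep_min=5.0):
    """Flag a predicted loss of separation if the trajectories ever come
    closer than sep_min (here in the same distance units as the points)."""
    return min_separation(traj_a, traj_b) < sep_min
```

    Real trajectory automation also predicts along-track timing uncertainty, vertical profiles, and intent changes, which is why top-of-descent prediction error dominates the failure cases reported above.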

  17. Multi-column step-gradient chromatography system for automated ion exchange separations

    International Nuclear Information System (INIS)

    Rucker, T.L.

    1985-01-01

    A multi-column step-gradient chromatography system has been designed to perform automated sequential separations of radionuclides by ion exchange chromatography. The system consists of a digital programmer with automatic stream selection valve, two peristaltic pumps, ten columns, and a fraction collector. The automation allows complicated separations of radionuclides to be made with minimal analyst attention and allows for increased productivity and reduced cost of analyses. Results are reported for test separations on mixtures of radionuclides by the system.

  18. Automated ion-exchange system for the radiochemical separation of the noble metals

    International Nuclear Information System (INIS)

    Parry, S.J.

    1980-01-01

    Ion-exchange separation is particularly suitable for mechanisation and automated ion exchange has been applied to the activation analysis of biological and environmental samples. In this work a system has been designed for experimental studies, which can be adapted for different modes of operation. The equipment is based on a large-volume sampler for the automatic presentation of 500 ml of liquid to a sampling probe. The sample is delivered to the ion-exchange column by means of a peristaltic pump. The purpose of this work was to automate a procedure for separating the noble metals from irradiated geological samples, for neutron-activation analysis. The process of digesting the rock sample is carried out manually in 30 min and is not suited to unattended operation. The volume of the resulting liquid sample may be 100 ml and so the manual separation step may take as long as 1.25 h per sample. The reason for automating this part of the procedure is to reduce the separation time for a group of five samples and consequently to improve the sensitivity of the analysis for radionuclides with short half-lives. This paper describes the automatic ion-exchange system and the ways in which it can be used. The mode of operation for the separation of the noble metals is given in detail. The reproducibility of the system has been assessed by repeated measurements on a standard reference matte. (author)

  19. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
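    The POD and POFA figures quoted above derive from a 2 × 2 contingency table of classifier decisions against observer ground truth. A small sketch, assuming POFA is the false-positive rate FP/(FP + TN); note that some verification literature instead reports the false-alarm ratio FP/(FP + TP), and the counts in the usage note are invented to match the quoted percentages:

```python
def detection_metrics(hits, misses, false_alarms, correct_rejections):
    """Probability of Detection and Probability of False Alarm from a 2x2
    contingency table of automated decisions versus observer ground truth."""
    pod = hits / (hits + misses)                               # true-positive rate
    pofa = false_alarms / (false_alarms + correct_rejections)  # false-positive rate
    return pod, pofa
```

    For example, 90 hits with 10 misses and 16 false alarms with 84 correct rejections would give POD = 0.90 and POFA = 0.16, in line with the best figures reported for the KNN classifier.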

  20. HE 1113-0641: THE SMALLEST-SEPARATION QUADRUPLE LENS IDENTIFIED BY A GROUND-BASED OPTICAL TELESCOPE

    International Nuclear Information System (INIS)

    Blackburne, Jeffrey A.; Schechter, Paul L.; Wisotzki, Lutz

    2008-01-01

    The Hamburg/ESO quasar HE 1113-0641 is found to be a quadruple gravitational lens, based on observations with the twin 6.5 m Magellan telescopes at the Las Campanas Observatory, and subsequently with the Hubble Space Telescope. The z_S = 1.235 quasar appears in a cross configuration, with i' band magnitudes ranging from 18.0 to 18.8. With a maximum image separation of 0.67 arcseconds, this is the smallest-separation quadruple ever identified using a ground-based optical telescope. Point-spread function (PSF) subtraction reveals a faint lensing galaxy. A simple lens model succeeds in predicting the observed positions of the components, but fails to match their observed flux ratios by up to a magnitude. We estimate the redshift of the lensing galaxy to be z_L ∼ 0.7. Time delay estimates are on the order of a day, suggesting that the flux ratio anomalies are not due to variability of the quasar, but may result from substructure or microlensing in the lens galaxy.

  1. Conflict Resolution Automation and Pilot Situation Awareness

    Science.gov (United States)

    Dao, Arik-Quang V.; Brandt, Summer L.; Bacon, Paige; Kraut, Josh; Nguyen, Jimmy; Minakata, Katsumi; Raza, Hamzah; Rozovski, David; Johnson, Walter W.

    2010-01-01

    This study compared pilot situation awareness across three traffic management concepts. The concepts varied in terms of the allocation of traffic avoidance responsibility between the pilot on the flight deck, the air traffic controllers, and a conflict resolution automation system. In Concept 1, the flight deck was equipped with conflict resolution tools that enabled pilots to fully handle responsibility for weather avoidance and for maintaining separation between ownship and surrounding traffic. In Concept 2, pilots were not responsible for traffic separation, but were provided tools for weather and traffic avoidance. In Concept 3, flight deck tools allowed pilots to deviate for weather, but conflict detection tools were disabled; in this concept pilots were dependent on ground-based automation for conflict detection and resolution. Situation awareness of the pilots was measured using online probes. Results showed that individual situation awareness was highest in Concept 1, where the pilots were most engaged, and lowest in Concept 3, where automation was heavily used. These findings suggest that for conflict resolution tasks, situation awareness is improved when pilots remain in the decision-making loop.

  2. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    Science.gov (United States)

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  3. Automation of radiochemical analysis by flow injection techniques. Am-Pu separation using TRU-resin™ sorbent extraction column

    International Nuclear Information System (INIS)

    Egorov, O.; Washington Univ., Seattle, WA; Grate, J.W.; Ruzicka, J.

    1998-01-01

    A rapid automated flow injection analysis (FIA) procedure was developed for efficient separation of Am and Pu from each other and from interfering matrix and radionuclide components using a TRU-resin™ column. Selective Pu elution is enabled via on-column reduction. The separation was developed using on-line radioactivity detection; after the separation had been developed, fraction collection was used to obtain the separated fractions. In this manner, an FIA instrument functions as an automated separation workstation capable of unattended operation. (author)

  4. A Chip-Capillary Hybrid Device for Automated Transfer of Sample Pre-Separated by Capillary Isoelectric Focusing to Parallel Capillary Gel Electrophoresis for Two-Dimensional Protein Separation

    Science.gov (United States)

    Lu, Joann J.; Wang, Shili; Li, Guanbin; Wang, Wei; Pu, Qiaosheng; Liu, Shaorong

    2012-01-01

    In this report, we introduce a chip-capillary hybrid device to integrate capillary isoelectric focusing (CIEF) with parallel capillary sodium dodecyl sulfate – polyacrylamide gel electrophoresis (SDS-PAGE) or capillary gel electrophoresis (CGE) toward automating two-dimensional (2D) protein separations. The hybrid device consists of three chips that are butted together. The middle chip can be moved between two positions to re-route the fluidic paths, which enables the performance of CIEF and injection of proteins partially resolved by CIEF to CGE capillaries for parallel CGE separations in a continuous and automated fashion. Capillaries are attached to the other two chips to facilitate CIEF and CGE separations and to extend the effective lengths of CGE columns. Specifically, we illustrate the working principle of the hybrid device, develop protocols for producing and preparing the hybrid device, and demonstrate the feasibility of using this hybrid device for automated injection of CIEF-separated sample to parallel CGE for 2D protein separations. Potentials and problems associated with the hybrid device are also discussed. PMID:22830584

  5. Automated drumlin shape and volume estimation using high resolution LiDAR imagery (Curvature Based Relief Separation): A test from the Wadena Drumlin Field, Minnesota

    Science.gov (United States)

    Yu, Peter; Eyles, Nick; Sookhan, Shane

    2015-10-01

    Resolving the origin(s) of drumlins and related megaridges in areas of megascale glacial lineations (MSGL) left by paleo-ice sheets is critical to understanding how ancient ice sheets interacted with their sediment beds. MSGL is now linked with fast-flowing ice streams, but there is a broad range of erosional and depositional models. Further progress relies on constraining fluxes of subglacial sediment at the ice sheet base, which in turn depends on morphological data such as landform shape and elongation and, most importantly, landform volume. Past practice in determining shape has employed a broad range of geomorphological methods, from strictly visualisation techniques to more complex semi-automated and automated drumlin extraction methods. This paper reviews and builds on currently available visualisation, semi-automated and automated extraction methods and presents a new Curvature Based Relief Separation (CBRS) technique for drumlin mapping. This uses curvature analysis to generate a base level from which topography can be normalized and drumlin volume can be derived. This methodology is tested using a high resolution (3 m) LiDAR elevation dataset from the Wadena Drumlin Field, Minnesota, USA, which was constructed by the Wadena Lobe of the Laurentide Ice Sheet ca. 20,000 years ago and which as a whole contains 2000 drumlins across an area of 7500 km². This analysis demonstrates that CBRS provides an objective and robust procedure for automated drumlin extraction. There is strong agreement with manually selected landforms, but the method is also capable of resolving features that were not detectable manually, thereby considerably expanding the known population of streamlined landforms. CBRS provides an effective automated method for visualisation of large areas of the streamlined beds of former ice sheets and for modelling sediment fluxes below ice sheets.
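
    The core CBRS idea (curvature flags positive-relief cells, a base level is taken from the surrounding terrain, and volume is integrated above it) can be sketched on a synthetic DEM. This is a minimal illustration of the concept, not the paper's implementation; the base-level rule, threshold, and function name are placeholders.

```python
import numpy as np

def drumlin_volume(dem, cell_size, curv_thresh=0.0):
    """Illustrative curvature-based relief separation (CBRS-style sketch).

    Cells with convex-up curvature are flagged as landform, a base level
    is estimated from the remaining terrain, and the volume of relief
    above that base level is integrated over the landform cells.
    """
    # Second derivatives approximate curvature (negative Laplacian so that
    # convex-up landforms give positive values).
    gy, gx = np.gradient(dem, cell_size)
    gyy, _ = np.gradient(gy, cell_size)
    _, gxx = np.gradient(gx, cell_size)
    curvature = -(gxx + gyy)

    landform = curvature > curv_thresh
    # Base level: here simply the mean elevation of non-landform cells,
    # standing in for the normalized surface derived in the paper.
    base = dem[~landform].mean()
    relief = np.clip(dem - base, 0.0, None)
    return np.sum(relief[landform]) * cell_size ** 2

# Synthetic "drumlin": a Gaussian mound (10 m high, 15 m sigma) on a plain.
x, y = np.meshgrid(np.linspace(-50, 50, 101), np.linspace(-50, 50, 101))
dem = 10.0 * np.exp(-(x**2 + y**2) / (2 * 15.0**2))
vol = drumlin_volume(dem, cell_size=1.0)
print(f"{vol:.0f} cubic metres")
```

    On real LiDAR tiles the base level would come from interpolating the normalized surface rather than a simple mean, but the volume integration step is the same.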

  6. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
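
    TAO itself is not reproduced here, but the grammar-based test generation it builds on is easy to illustrate. The toy grammar, generator, and eval-based oracle below are our own sketch, not TAO's API.

```python
import random

# A toy context-free grammar: nonterminals map to lists of alternative
# right-hand sides; symbols not present as keys are terminals.
GRAMMAR = {
    "expr": [["term", "+", "expr"], ["term"]],
    "term": [["num"], ["(", "expr", ")"]],
    "num":  [["1"], ["2"], ["3"]],
}

def generate(symbol, rng, depth=0, max_depth=6):
    """Randomly expand a nonterminal into a test input string."""
    if symbol not in GRAMMAR:
        return symbol
    alts = GRAMMAR[symbol]
    # Past max_depth, always take the shortest alternative to terminate.
    alt = rng.choice(alts) if depth < max_depth else min(alts, key=len)
    return "".join(generate(s, rng, depth + 1, max_depth) for s in alt)

rng = random.Random(7)
tests = [generate("expr", rng) for _ in range(5)]
# Semantic oracle check: every generated sentence must evaluate as a
# valid arithmetic expression (a stand-in for semantics evaluation).
results = [eval(t) for t in tests]
print(tests, results)
```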

  7. Design and analysis on sorting blade for automated size-based sorting device

    Science.gov (United States)

    Razali, Zol Bahri; Kader, Mohamed Mydin M. Abdul; Samsudin, Yasser Suhaimi; Daud, Mohd Hisam

    2017-09-01

    Rubbish separation and recycling is a national problem: people dump their rubbish into dumpsites without considering its value if it could be recycled and reused. The authors therefore proposed an automated segregating device, intended to encourage people to separate their rubbish and to recognize the value of rubbish that can be reused. The automated size-based mechanical segregating device provides significant improvements in efficiency and consistency in the segregating process. The device is designed to make recycling easier and more user friendly, in the hope that more people will take responsibility if it demands less time and effort. This paper discusses the redesign of a blade for the sorting device, with the aim of developing an efficient automated mechanical sorting device for similar materials of different sizes. The machine identifies the size of the waste and relies on a coil inside the container to separate it out. The detailed design and methodology are described in this paper.

  8. Ground and space-based separate PSF photometry of Pluto and Charon from New Horizons and Magellan

    Science.gov (United States)

    Zangari, Amanda M.; Stern, S. A.; Young, L. A.; Weaver, H. A.; Olkin, C.; Buratti, B. J.; Spencer, J.; Ennico, K.

    2013-10-01

    While Pluto and Charon are easily resolvable in some space-based telescopes, ground-based imaging of Pluto and Charon can yield separate PSF photometry in excellent seeing. We present B and Sloan g', r', i', and z' separate photometry of Pluto and Charon taken at the Magellan Clay telescope using LDSS-3. In 2011, observations were made on 7, 8, 9, 19, and 20 March, at 9:00 UT, covering sub-Earth longitudes 130°, 74°, 17°, 175° and 118°. The solar phase angle ranged from 1.66-1.68° to 1.76-1.77°. In 2012, observations were made on February 28, 29 and March 1 at 9:00 UT, covering longitudes 342°, 110° and 53°, and on May 30 and 31 at 9:30 UT and 7:00 UT, covering longitudes 358° and 272°. Solar phase angles were 1.53-1.56° and 0.89-0.90°. All longitudes use the convention of zero at the sub-Charon longitude, decreasing in time. Seeing ranged from 0.46 to 1.26 arcseconds. We find that the mean rotationally-averaged Charon-to-Pluto light ratio is 0.142±0.003 for Sloan r', i' and z'. Charon is relatively brighter in B and g', with light ratios of 0.182±0.003 and 0.178±0.002 respectively. Additionally, we present separate PSF photometry of Pluto and Charon from New Horizons images taken by the LORRI instrument on 1 and 3 July 2013 at 17:00 UT and 23:00 UT, sub-Earth longitudes 251° and 125°. We find that the rotation-dependent variations in the light ratio are consistent with earlier estimates such as those from Buie et al. 2010, AJ 139, 1117-1127. However, at a solar phase angle of 10.9°, Charon appears 0.25 magnitudes fainter relative to Pluto at the same rotational phase than in measurements from the ground at the largest possible solar phase angle. Thus we provide the first estimate of a Pluto phase curve beyond 2°. These results represent some of the first Pluto science from New Horizons. This work has been funded in part by NASA Planetary Astronomy Grant NNX10AB27G and NSF Award 0707609 to MIT and by NASA's New Horizons mission to Pluto.
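
    The light ratios quoted above convert directly to magnitude differences via the Pogson relation; a quick check using the ratios from the abstract:

```python
import math

def ratio_to_dmag(flux_ratio):
    """Pogson relation: magnitude difference for a given flux ratio."""
    return -2.5 * math.log10(flux_ratio)

# Rotationally averaged Charon-to-Pluto light ratios from the abstract.
dmag_r = ratio_to_dmag(0.142)   # Sloan r', i', z'
dmag_B = ratio_to_dmag(0.182)   # B band
print(f"Charon fainter than Pluto by {dmag_r:.2f} mag (r'i'z') "
      f"and {dmag_B:.2f} mag (B)")
```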

  9. Automating the SMAP Ground Data System to Support Lights-Out Operations

    Science.gov (United States)

    Sanders, Antonio

    2014-01-01

    The Soil Moisture Active Passive (SMAP) Mission is a first tier mission in NASA's Earth Science Decadal Survey. SMAP will provide a global mapping of soil moisture and its freeze/thaw states. This mapping will be used to enhance the understanding of processes that link the terrestrial water, energy, and carbon cycles, and to enhance weather and forecast capabilities. NASA's Jet Propulsion Laboratory has been selected as the lead center for the development and operation of SMAP. The Jet Propulsion Laboratory (JPL) has an extensive history of successful deep space exploration. JPL missions have typically been large scale Class A missions with significant budget and staffing. SMAP represents a new area of JPL focus towards low cost Earth science missions. Success in this new area requires changes to the way that JPL has traditionally provided the Mission Operations System (MOS)/Ground Data System (GDS) functions. The operation of SMAP requires more routine operations activities and support for higher data rates and data volumes than have been achieved in the past. These activities must be addressed by a reduced operations team and support staff. To meet this challenge, the SMAP ground data system provides automation that will perform unattended operations, including automated commanding of the SMAP spacecraft.

  10. A reactor/separator device for use in automated solid phase immunoassay

    International Nuclear Information System (INIS)

    Farina, P.R.; Ordonez, K.P.; Siewers, I.J.

    1979-01-01

    A reactor/separator device is described for use in automated solid phase immunoassay, including radioimmunoassays. The device is a column fitted at the bottom portion with a water impermeable disc which can hold, for example, immunoabsorbents, immobilized antisera or ion exchange resins. When the contents of the column supported by the disc are brought into contact with an aqueous phase containing reagents or reactants, a chemical reaction is initiated. After the reaction, centrifugally applied pressure forces the aqueous phase through the filter disc making it water permeable and separating a desired component for subsequent analysis. The reactor/separator device of the present invention permits kinetic solid phase assays (non-equilibrium conditions) to be carried out which would be difficult to perform by other conventional methods. (author)

  11. Airborne Separation Assurance and Traffic Management: Research of Concepts and Technology

    Science.gov (United States)

    Ballin, Mark G.; Wing, David J.; Hughes, Monica F.; Conway, Sheila R.

    1999-01-01

    To support the need for increased flexibility and capacity in the future National Airspace System, NASA is pursuing an approach that distributes air traffic separation and management tasks to both airborne and ground-based systems. Details of the distributed operations and the benefits and technical challenges of such a system are discussed. Technology requirements and research issues are outlined, and NASA's approach for establishing concept feasibility, which includes development of the airborne automation necessary to support the concept, is described.

  12. Some problems concerning the use of automated radiochemical separation systems in destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Nagy, L.G.; Toeroek, G.

    1977-01-01

    The present state of a long-term program is reviewed. It was started to elaborate a remote-controlled, automated radiochemical processing system for the neutron activation analysis of biological materials. The system is based on wet ashing of the sample followed by reactive desorption of some volatile components. The distillation residue is passed through a series of columns filled with selective ion screening materials to remove the matrix activity. The solution is thus ''stripped'' of the interfering radioions, and it is processed to single elements through group separations using ion-exchange chromatographic techniques. Some special problems concerning this system are treated: (a) general aspects of the construction of a (semi)automated radiochemical processing system are discussed; (b) comparison is made between various technical realizations of the same basic concept; (c) some problems concerning the ''reconstruction'' of an already published processing system are outlined. (T.G.)

  13. An automated system for selective fission product separations; decays of 113-115Pd

    International Nuclear Information System (INIS)

    Meikrantz, D.H.; Gehrke, R.J.; McIsaac, L.D.; Baker, J.D.; Greenwood, R.C.

    1981-01-01

    A microcomputer-controlled radiochemical separation system has been developed for the isolation and study of fission products with half-lives of approximately 10 s or longer. The system is based upon solvent extraction with three centrifugal contactors coupled in series, which provides both rapid and highly efficient separations with large decontamination factors. This automated system was utilized to study the radioactive decays of 113-115Pd via solvent extraction of the Pd-dimethylglyoxime complex from 252Cf fission products. As a result of this effort, γ-rays associated with the decay of the approximately 90-s 113,113mPd, 149-s 114Pd and 47-s 115Pd have been identified. The isotopic assignments to each of these Pd radioactivities have been confirmed from observation of the growth and decay curves of their respective Ag daughters. In addition, previously unreported Ag γ-rays have been assigned; one to the decay of 69-s 113Ag, and two to the decay of 19-s 115Ag. (orig.)
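
    The growth-and-decay confirmation described above follows the two-member Bateman equations. The sketch below uses the 149-s 114Pd half-life from the abstract; the 114Ag daughter half-life is set to 4.6 s here purely for illustration.

```python
import math

def daughter_activity(t, lam_p, lam_d, n_p0):
    """Bateman solution: daughter activity from a pure parent at t = 0."""
    return lam_d * lam_p * n_p0 / (lam_d - lam_p) * (
        math.exp(-lam_p * t) - math.exp(-lam_d * t))

lam_p = math.log(2) / 149.0   # 114Pd, T1/2 = 149 s (from the abstract)
lam_d = math.log(2) / 4.6     # 114Ag daughter, illustrative T1/2
n_p0 = 1.0e6                  # arbitrary initial number of parent atoms

# Daughter activity grows in, peaks, then follows the parent's decay,
# which is the signature used to confirm the isotopic assignments.
acts = [daughter_activity(t, lam_p, lam_d, n_p0) for t in (1, 10, 60, 300)]
print(acts)
```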

  14. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  15. Automated Conflict Resolution, Arrival Management and Weather Avoidance for ATM

    Science.gov (United States)

    Erzberger, H.; Lauderdale, Todd A.; Chu, Yung-Cheng

    2010-01-01

    The paper describes a unified solution to three types of separation assurance problems that occur in en-route airspace: separation conflicts, arrival sequencing, and weather-cell avoidance. Algorithms for solving these problems play a key role in the design of future air traffic management systems such as NextGen. Because these problems can arise simultaneously in any combination, it is necessary to develop integrated algorithms for solving them. A unified and comprehensive solution to these problems provides the foundation for a future air traffic management system that requires a high level of automation in separation assurance. The paper describes the three algorithms developed for solving each problem and then shows how they are used sequentially to solve any combination of these problems. The first algorithm resolves loss-of-separation conflicts and is an evolution of an algorithm described in an earlier paper. The new version generates multiple resolutions for each conflict and then selects the one giving the least delay. Two new algorithms, one for sequencing and merging of arrival traffic, referred to as the Arrival Manager, and the other for weather-cell avoidance are the major focus of the paper. Because these three problems constitute a substantial fraction of the workload of en-route controllers, integrated algorithms to solve them is a basic requirement for automated separation assurance. The paper also reviews the Advanced Airspace Concept, a proposed design for a ground-based system that postulates redundant systems for separation assurance in order to achieve both high levels of safety and airspace capacity. It is proposed that automated separation assurance be introduced operationally in several steps, each step reducing controller workload further while increasing airspace capacity. A fast time simulation was used to determine performance statistics of the algorithm at up to 3 times current traffic levels.
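
    The resolve-then-select step described above (generate multiple resolutions per conflict, keep the one with the least delay) can be sketched generically; the candidate structure below is illustrative, not the algorithm's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Resolution:
    maneuver: str        # e.g. heading, altitude, or speed change
    delay_s: float       # added flight time versus the original trajectory
    conflict_free: bool  # does the resulting trajectory clear all conflicts?

def select_resolution(candidates):
    """Keep only conflict-free resolutions, then choose the least-delay one."""
    feasible = [r for r in candidates if r.conflict_free]
    if not feasible:
        return None  # fall back, e.g. escalate to a human controller
    return min(feasible, key=lambda r: r.delay_s)

best = select_resolution([
    Resolution("turn right 20 deg", 45.0, True),
    Resolution("climb 1000 ft", 20.0, True),
    Resolution("slow 30 kt", 10.0, False),   # fails the conflict check
])
print(best.maneuver)  # -> climb 1000 ft
```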

  16. SHARP - Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  17. SHARP: Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  18. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
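
    The purge-until-stable logic behind such sampling protocols can be sketched as a stabilization check on successive field-parameter readings; the window and tolerance below are illustrative, not the USGS protocol's actual values.

```python
def is_stable(readings, window=3, tolerance=0.1):
    """True when the last `window` readings vary by at most `tolerance`.

    Stand-in for the protocol rule that field parameters (pH, specific
    conductance, etc.) must stabilize during purging before sampling.
    """
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return max(recent) - min(recent) <= tolerance

# Simulated pH readings logged once per purge volume.
ph_log = [7.8, 7.5, 7.31, 7.25, 7.22, 7.21]
stable_after = next(
    (i + 1 for i in range(len(ph_log)) if is_stable(ph_log[: i + 1])),
    None)
print(stable_after)  # -> 5 (stable once the last three readings agree)
```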

  19. Removal of micropollutants with coarse-ground activated carbon for enhanced separation with hydrocyclone classifiers.

    Science.gov (United States)

    Otto, N; Platz, S; Fink, T; Wutscherk, M; Menzel, U

    2016-01-01

    One key technology to eliminate organic micropollutants (OMP) from wastewater effluent is adsorption using powdered activated carbon (PAC). To avoid a discharge of highly loaded PAC particles into natural water bodies a separation stage has to be implemented. Commonly, large settling tanks and flocculation filters with the application of coagulants and flocculation aids are used. In this study, a multi-hydrocyclone classifier with a downstream cloth filter was investigated on a pilot plant as a space-saving alternative with no need for dosing of chemical additives. To improve the separation, a coarser ground PAC type was compared to a standard PAC type with regard to elimination results of OMP as well as separation performance. With a PAC dosing rate of 20 mg/l, an average of 64.7 wt% of the standard PAC and 79.5 wt% of the coarse-ground PAC could be separated in the hydrocyclone classifier. A total average separation efficiency of 93-97 wt% could be reached with the combination of hydrocyclone classifier and cloth filter. Nonetheless, the OMP elimination of the coarse-ground PAC was not sufficient to compete with the standard PAC. Further research and development is necessary to find applicable coarse-grained PAC types with adequate OMP elimination capabilities.
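
    The combined hydrocyclone-plus-cloth-filter figures are consistent with two capture stages operating in series. The check below back-calculates the cloth-filter efficiency implied by the abstract's numbers; the series model is our simplification, and the filter-stage values are inferred, not reported.

```python
def series_efficiency(e1, e2):
    """Overall capture fraction for two separation stages in series."""
    return 1 - (1 - e1) * (1 - e2)

def required_second_stage(e1, overall):
    """Second-stage efficiency needed to reach a given overall capture."""
    return 1 - (1 - overall) / (1 - e1)

e_cyclone = 0.795                      # coarse PAC, hydrocyclone stage
e_filter_lo = required_second_stage(e_cyclone, 0.93)
e_filter_hi = required_second_stage(e_cyclone, 0.97)
print(f"implied cloth-filter capture: {e_filter_lo:.0%} to {e_filter_hi:.0%}")
```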

  20. Automated Orthorectification of VHR Satellite Images by SIFT-Based RPC Refinement

    Directory of Open Access Journals (Sweden)

    Hakan Kartal

    2018-06-01

    Raw remotely sensed images contain geometric distortions and cannot be used directly for map-based applications, accurate locational information extraction or geospatial data integration. A geometric correction process must be conducted to minimize the errors related to distortions and achieve the desired location accuracy before further analysis. A considerable number of images might be needed when working over large areas or in temporal domains, in which case manual geometric correction requires more labor and time. To overcome these problems, new algorithms have been developed to make the geometric correction process autonomous. The Scale Invariant Feature Transform (SIFT) algorithm is an image matching algorithm used in remote sensing applications that has received attention in recent years. In this study, the effects of the incidence angle, surface topography and land cover (LC) characteristics on SIFT-based automated orthorectification were investigated at three different study sites with different topographic conditions and LC characteristics, using Pleiades very high resolution (VHR) images acquired at different incidence angles. The results showed that the location accuracy of the orthorectified images increased with lower incidence angle images. More importantly, the topographic characteristics had no observable impact on the location accuracy of SIFT-based automated orthorectification, and the results showed that Ground Control Points (GCPs) are mainly concentrated in the "Forest" and "Semi Natural Area" LC classes. A multi-thread code was designed to reduce the automated processing time, and the results showed that the process performed 7 to 16 times faster using the automated approach. Analyses performed on various spectral modes of multispectral data showed that arithmetic data derived from pan-sharpened multispectral images can be used in automated SIFT-based RPC orthorectification.
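
    One common way SIFT matches feed RPC refinement is to estimate a systematic image-space bias from the matched points with robust statistics. The median-offset sketch below on synthetic matches is illustrative and not the paper's actual refinement procedure.

```python
import numpy as np

def estimate_bias(ref_pts, img_pts):
    """Robust row/column bias between matched reference and image points.

    The median offset stands in for the bias-compensation term commonly
    applied to Rational Polynomial Coefficient (RPC) models.
    """
    offsets = np.asarray(img_pts) - np.asarray(ref_pts)
    return np.median(offsets, axis=0)

rng = np.random.default_rng(0)
ref = rng.uniform(0, 1000, size=(50, 2))     # GCP-like reference positions
true_bias = np.array([3.2, -1.7])            # systematic RPC error, pixels
img = ref + true_bias + rng.normal(0, 0.3, size=ref.shape)
img[:5] += rng.uniform(20, 40, size=(5, 2))  # a few gross mismatches (outliers)

bias = estimate_bias(ref, img)
corrected = img - bias                        # bias-compensated coordinates
print(bias)
```

    The median keeps the estimate close to the true bias despite the outliers, which is why robust statistics (or RANSAC) are favored over a plain mean in automated matching pipelines.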

  1. Summary of astronaut inputs on automation and robotics for Space Station Freedom

    Science.gov (United States)

    Weeks, David J.

    1990-01-01

    Astronauts and payload specialists present specific recommendations in the form of an overview that relate to the use of automation and robotics on the Space Station Freedom. The inputs are based on on-orbit operations experience, time requirements for crews, and similar crew-specific knowledge that address the impacts of automation and robotics on productivity. Interview techniques and specific questionnaire results are listed, and the majority of the responses indicate that incorporating automation and robotics to some extent and with human backup can improve productivity. Specific support is found for the use of advanced automation and EVA robotics on the Space Station Freedom and for the use of advanced automation on ground-based stations. Ground-based control of in-flight robotics is required, and Space Station activities and crew tasks should be analyzed to assess the systems engineering approach for incorporating automation and robotics.

  2. 3-D vision and figure-ground separation by visual cortex.

    Science.gov (United States)

    Grossberg, S

    1994-01-01

    A neural network theory of three-dimensional (3-D) vision, called FACADE theory, is described. The theory proposes a solution of the classical figure-ground problem for biological vision. It does so by suggesting how boundary representations and surface representations are formed within a boundary contour system (BCS) and a feature contour system (FCS). The BCS and FCS interact reciprocally to form 3-D boundary and surface representations that are mutually consistent. Their interactions generate 3-D percepts wherein occluding and occluded object parts are separated, completed, and grouped. The theory clarifies how preattentive processes of 3-D perception and figure-ground separation interact reciprocally with attentive processes of spatial localization, object recognition, and visual search. A new theory of stereopsis is proposed that predicts how cells sensitive to multiple spatial frequencies, disparities, and orientations are combined by context-sensitive filtering, competition, and cooperation to form coherent BCS boundary segmentations. Several factors contribute to figure-ground pop-out, including: boundary contrast between spatially contiguous boundaries, whether due to scenic differences in luminance, color, spatial frequency, or disparity; partially ordered interactions from larger spatial scales and disparities to smaller scales and disparities; and surface filling-in restricted to regions surrounded by a connected boundary. Phenomena such as 3-D pop-out from a 2-D picture, Da Vinci stereopsis, 3-D neon color spreading, completion of partially occluded objects, and figure-ground reversals are analyzed. The BCS and FCS subsystems model aspects of how the two parvocellular cortical processing streams that join the lateral geniculate nucleus to prestriate cortical area V4 interact to generate a multiplexed representation of Form-And-Color-And-DEpth, or FACADE, within area V4. Area V4 is suggested to support figure-ground separation and to interact with

  3. Automation and robotics considerations for a lunar base

    Science.gov (United States)

    Sliwa, Nancy E.; Harrison, F. Wallace, Jr.; Soloway, Donald I.; Mckinney, William S., Jr.; Cornils, Karin; Doggett, William R.; Cooper, Eric G.; Alberts, Thomas E.

    1992-01-01

    An envisioned lunar outpost shares with other NASA missions many of the same criteria that have prompted the development of intelligent automation techniques with NASA. Because of increased radiation hazards, crew surface activities will probably be even more restricted than current extravehicular activity in low Earth orbit. Crew availability for routine and repetitive tasks will be at least as limited as that envisioned for the space station, particularly in the early phases of lunar development. Certain tasks are better suited to the untiring watchfulness of computers, such as the monitoring and diagnosis of multiple complex systems, and the perception and analysis of slowly developing faults in such systems. In addition, mounting costs and constrained budgets require that human resource requirements for ground control be minimized. This paper provides a glimpse of certain lunar base tasks as seen through the lens of automation and robotic (A&R) considerations. This can allow a more efficient focusing of research and development not only in A&R, but also in those technologies that will depend on A&R in the lunar environment.

  4. Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.

    Science.gov (United States)

    Peters, Carl N; Evans, Iain E J

    2016-12-01

    Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared to usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and reduces the likelihood of error. Overall, this device creates a simple and effective solution to formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.

  5. Part-task training in the context of automation: current and future directions.

    Science.gov (United States)

    Gutzwiller, Robert S; Clegg, Benjamin A; Blitch, John G

    2013-01-01

    Automation often elicits a divide-and-conquer outlook. By definition, automation assumes control over part or all of a task that was previously performed by a human (Parasuraman & Riley, 1997). When such notions of automation are taken as grounds for training, they readily invoke a part-task training (PTT) approach. This article outlines broad functions of automation as a source of PTT and reviews the PTT literature, focusing on the potential benefits and costs of using automation as a mechanism for PTT. The article reviews some past work in this area and suggests a path to move beyond the type of work captured by the "automation as PTT" framework. An illustrative experiment shows how automation in training and PTT are actually separable issues. PTT with automation has some utility but ultimately remains an unsatisfactory framework for the future broad potential of automation during training, and we suggest that a new conceptualization is needed.

  6. A Review of Function Allocation and En Route Separation Assurance

    Science.gov (United States)

    Lewis, Timothy A.; Aweiss, Arwa S.; Guerreiro, Nelson M.; Daiker, Ronald J.

    2016-01-01

    Today's air traffic control system has reached a limit to the number of aircraft that can be safely managed at the same time. This air traffic capacity bottleneck is a critical problem along the path to modernization for air transportation. The design of the next separation assurance system to address this problem is a cornerstone of air traffic management research today. This report reviews recent work by NASA and others in the areas of function allocation and en route separation assurance. This includes: separation assurance algorithms and technology prototypes; concepts of operations and designs for advanced separation assurance systems; and specific investigations into air-ground and human-automation function allocation.

  7. Piloted simulation of an air-ground profile negotiation process in a time-based Air Traffic Control environment

    Science.gov (United States)

    Williams, David H.; Green, Steven M.

    1993-01-01

    Historically, development of airborne flight management systems (FMS) and ground-based air traffic control (ATC) systems has tended to focus on different objectives with little consideration for operational integration. A joint program, between NASA's Ames Research Center (Ames) and Langley Research Center (Langley), is underway to investigate the issues of, and develop systems for, the integration of ATC and airborne automation systems. A simulation study was conducted to evaluate a profile negotiation process (PNP) between the Center/TRACON Automation System (CTAS) and an aircraft equipped with a four-dimensional flight management system (4D FMS). Prototype procedures were developed to support the functional implementation of this process. The PNP was designed to provide an arrival trajectory solution which satisfies the separation requirements of ATC while remaining as close as possible to the aircraft's preferred trajectory. Results from the experiment indicate the potential for successful incorporation of aircraft-preferred arrival trajectories in the CTAS automation environment. Fuel savings on the order of 2 percent to 8 percent, compared to fuel required for the baseline CTAS arrival speed strategy, were achieved in the test scenarios. The data link procedures and clearances developed for this experiment, while providing the necessary functionality, were found to be operationally unacceptable to the pilots. In particular, additional pilot control and understanding of the proposed aircraft-preferred trajectory, and a simplified clearance procedure were cited as necessary for operational implementation of the concept.

  8. Drop Size Distribution - Based Separation of Stratiform and Convective Rain

    Science.gov (United States)

    Thurai, Merhala; Gatlin, Patrick; Williams, Christopher

    2014-01-01

    For applications in hydrology and meteorology, it is often desirable to separate regions of stratiform and convective rain in meteorological radar observations, both from ground-based polarimetric radars and from space-based dual-frequency radars. In a previous study by Bringi et al. (2009), dual-frequency profiler and dual-polarization radar (C-POL) observations in Darwin, Australia, showed that stratiform and convective rain could be separated in the log10(Nw) versus Do domain, where Do is the mean volume diameter and Nw is the scaling parameter, which is proportional to the ratio of water content to the mass-weighted mean diameter. Note that Nw and Do are two of the main drop size distribution (DSD) parameters. In a later study, Thurai et al. (2010) confirmed that the dual-frequency profiler based and the C-POL radar based stratiform-convective rain separations were consistent with each other. In this paper, we test this separation method using DSD measurements from a ground-based 2D video disdrometer (2DVD), along with simultaneous observations from a collocated, vertically-pointing, X-band profiling radar (XPR). The measurements were made in Huntsville, Alabama. One-minute DSDs from the 2DVD are used as input to an appropriate gamma fitting procedure to determine Nw and Do. The fitted parameters, after averaging over 3 minutes, are plotted against each other and compared with a predefined separation line. An index is used to determine how far the points lie from the separation line (as described in Thurai et al. 2010). Negative index values indicate stratiform rain, positive values indicate convective rain, and points which lie close to the separation line are considered 'mixed' or 'transition' type precipitation. The XPR observations are used to evaluate the 2DVD-based classification: a 'bright-band' detection algorithm was used to classify each vertical reflectivity profile as either stratiform or convective.

  9. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report of ''The Ground Based Plan'' of the United Kingdom Science and Engineering Research Council. The Ground Based Plan is a plan for research in astronomy and planetary science by ground-based techniques. The report describes the scientific objectives and technical requirements (the basis for the Plan), the present organisation and funding of the ground-based programme, the Plan itself, its main scientific features, and its further objectives. (U.K.)

  10. Semi-automated technique for the separation and determination of barium and strontium in surface waters by ion exchange chromatography and atomic emission spectrometry

    International Nuclear Information System (INIS)

    Pierce, F.D.; Brown, H.R.

    1977-01-01

    A semi-automated method for the separation and analysis of barium and strontium in surface waters by atomic emission spectrometry is described. The method employs a semi-automated separation technique using ion exchange and an automated aspiration-analysis procedure. Forty specimens can be prepared in approximately 90 min and analyzed for barium and strontium content in 20 min. The detection limits and sensitivities provided by the described technique are 0.003 mg/l and 0.01 mg/l, respectively, for barium, and 0.00045 mg/l and 0.003 mg/l, respectively, for strontium.

  11. System Performance of an Integrated Airborne Spacing Algorithm with Ground Automation

    Science.gov (United States)

    Swieringa, Kurt A.; Wilson, Sara R.; Baxley, Brian T.

    2016-01-01

    The National Aeronautics and Space Administration's (NASA's) first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools to enable precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise spacing behind another aircraft. Recent simulations and IM algorithm development at NASA have focused on trajectory-based IM operations where aircraft equipped with IM avionics are expected to achieve a spacing goal, assigned by air traffic controllers, at the final approach fix. The recently published IM Minimum Operational Performance Standards describe five types of IM operations. This paper discusses the results and conclusions of a human-in-the-loop simulation that investigated three of those IM operations. The results presented in this paper focus on system performance and integration metrics. Overall, the IM operations conducted in this simulation integrated well with ground-based decision support tools, and certain types of IM operations were able to provide improved spacing precision at the final approach fix; however, some issues were identified that should be addressed prior to implementing IM procedures in real-world operations.

  12. Testing Automation of Context-Oriented Programs Using Separation Logic

    Directory of Open Access Journals (Sweden)

    Mohamed A. El-Zawawy

    2014-01-01

    Context-oriented programming (COP) is a new programming approach that enables switching among contexts of commands during program execution. This technique is more structured and modular than object-oriented and aspect-oriented programming and hence more flexible. For context-oriented programming, as implemented in COP languages such as ContextJ* and ContextL, this paper introduces an accurate operational semantics. The language model of this paper uses Java concepts and is equipped with layer techniques for activation/deactivation of layer contexts. This paper also presents a logical system for COP programs. This logic is necessary for automating the testing, development, and validation of partial correctness specifications for COP programs and is an extension of separation logic. A mathematical soundness proof for the logical system against the proposed operational semantics is presented in the paper.

  13. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix so that radiometric detection can provide an accurate measurement of (99)Tc. We developed a preprogrammed spike-addition procedure to automatically perform matrix-matched calibration. The overall measurement efficiency determined at the same time provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency throughout the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
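    The spike-addition calibration above can be summarized with a simple count-rate model: the difference between the spiked and unspiked count rates gives the overall measurement efficiency, which then converts a sample's net count rate into an activity concentration. A hedged Python sketch follows; the function names and this simple model are assumptions, not the instrument's actual algorithm:

```python
# Illustrative model of matrix-matched calibration by spike addition.

def measurement_efficiency(sample_cps, spiked_cps, spike_bq):
    """Overall efficiency (counts/s per Bq) inferred from a paired
    measurement of the sample with and without a known 99Tc spike."""
    return (spiked_cps - sample_cps) / spike_bq

def tc99_concentration(sample_cps, efficiency, volume_ml, blank_cps=0.0):
    """99Tc activity concentration in Bq/mL for a given sample volume."""
    return (sample_cps - blank_cps) / (efficiency * volume_ml)
```

For example, a 100 Bq spike that raises the count rate from 10 to 60 counts/s implies an efficiency of 0.5 counts/s per Bq; 10 counts/s from a 0.495 mL sample then corresponds to about 40.4 Bq/mL.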

  14. Superconducting magnetic separation of ground steel slag powder for recovery of resources

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, H. W.; Kim, J. J.; Kim, Young Hun [Andong National University, Andong (Korea, Republic of); Ha, D. W. [Korea Electrotechnology Research Institute, Changwon (Korea, Republic of); Choi, J. H. [Dept. of Environmental Engineering, Catholic University of Pusan, Pusan (Korea, Republic of)

    2017-03-15

    Steel slag has long been treated as an industrial waste. A huge amount of slag is produced as a byproduct, and it has usually been dumped in landfill sites. However, steel slag contains valuable resources such as iron, copper, manganese, and magnesium. Superconducting magnetic separation was applied to recover these valuable resources from the steel slag, a process also intended to reduce the volume of waste to be dumped. A cryo-cooled Nb-Ti superconducting magnet with a 100 mm bore and 600 mm height was used as the magnetic separator. The separation efficiency was evaluated as a function of magnetic field. A steel slag was ground and analyzed for composition. Iron-containing minerals were successfully concentrated away from the iron-poor fraction. The separation efficiency was highly dependent on particle size, with finer particles giving higher separation efficiency. The magnetic field also affects the separation ratio. The current study showed that appropriate grinding of the slag followed by magnetic separation allows the recovery of metal resources from steel slag waste rather than dumping the entire volume.

  15. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor's thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to help investigate and create a test automation solution for the stability testing of the LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. This work included creating the automation for the test cases and putting them into daily test automation jobs. The key factor...

  16. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  17. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. The book consists of two parts: I, Agent-Based Complex Automated Negotiations, and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  18. Automated Breast Ultrasound for Ductal Pattern Reconstruction: Ground Truth File Generation and CADe Evaluation

    Science.gov (United States)

    Manousaki, D.; Panagiotopoulou, A.; Bizimi, V.; Haynes, M. S.; Love, S.; Kallergi, M.

    2017-11-01

    The purpose of this study was the generation of ground truth files (GTFs) of the breast ducts from 3D images of the Invenia™ Automated Breast Ultrasound System (ABUS; GE Healthcare, Little Chalfont, UK) and the application of these GTFs to the optimization of the imaging protocol and the evaluation of a computer-aided detection (CADe) algorithm developed for automated duct detection. Six lactating, nursing volunteers were scanned with the ABUS before and right after breastfeeding their infants. An expert in breast ultrasound generated rough outlines of the milk-filled ducts in the transaxial slices of all image volumes, and the final GTFs were created by using thresholding and smoothing tools in ImageJ. In addition, a CADe algorithm automatically segmented duct-like areas, and its results were compared to the expert's GTFs by estimating the true positive fraction (TPF), or % overlap. The CADe output differed significantly from the expert's, but both detected a smaller than expected volume of the ducts due to insufficient contrast (ducts were only partially filled with milk), discontinuities, and artifacts. GTFs were used to modify the imaging protocol and improve the CADe method. In conclusion, electronic GTFs provide a valuable tool in the optimization of a tomographic imaging system, the imaging protocol, and CADe algorithms. Their generation, however, is an extremely time-consuming, strenuous process, particularly for multi-slice examinations, and alternatives based on phantoms or simulations are highly desirable.
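    The TPF (% overlap) comparison used above amounts to a set intersection between the expert and algorithm segmentations. A minimal sketch, assuming masks represented as sets of voxel coordinates (this representation and the function name are assumptions, not the study's implementation):

```python
# Hypothetical TPF computation between a ground truth file (GTF) and
# CADe output, each given as a collection of segmented voxel coordinates.

def true_positive_fraction(gtf_voxels, cade_voxels):
    """Fraction of expert-annotated duct voxels recovered by the CADe."""
    gtf = set(gtf_voxels)
    if not gtf:
        return 0.0  # no annotated voxels: nothing to recover
    return len(gtf & set(cade_voxels)) / len(gtf)
```

A TPF of 1.0 would mean the CADe covers every expert-annotated voxel, regardless of any extra (false positive) voxels it also marks.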

  19. Automation of experiments at Dubna Gas-Filled Recoil Separator

    Science.gov (United States)

    Tsyganov, Yu. S.

    2016-01-01

    Approaches to solving the problems of automating the basic processes of long-term experiments in heavy ion beams at the Dubna Gas-Filled Recoil Separator (DGFRS) facility are considered. Approaches to spectrometry, both of rare α decays of superheavy nuclei and for constructing monitoring systems that provide accident-free experiment running with highly radioactive targets and record the basic parameters of the experiment, are described. The specific features of Double-Sided Silicon Strip Detectors (DSSSDs) are considered, with special attention paid to the role of boundary effects of neighboring p-n junctions in the "active correlations" method. An example of an off-beam experiment attempting to observe the Zeno effect is briefly considered. Basic examples are given for nuclear reactions of complete fusion with 48Ca ion beams of the U-400 cyclotron (LNR, JINR). A scenario for developing the "active correlations" method for the case of very high intensity heavy-ion beams at promising LNR, JINR accelerators is presented.

  20. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  1. Development of a portable, automated system for the separation of radionuclides by column chromatography

    International Nuclear Information System (INIS)

    Schumacher, C.; Burow, M.; Flucht, R.; Hill, P.; Zoriy, M.V.

    2012-01-01

    The determination of the chemical recovery is one of the most important challenges in radiochemical analysis. Small changes in pH value and temperature lead to poorly controlled process conditions. To improve the reproducibility of the chemical separation, an automated separation column system (TSM) was developed at the analytical laboratory at Jülich. Using the TSM it is possible to separate nuclides by applying variable eluents and ion exchangers, even in samples with a very high salt content such as urine. To date, methods have been developed for the elements U, Am and Pu. The automation allows a larger number of samples to be analysed per working day and provides a remarkable saving of time. By increasing the sample volume it is possible to improve the detection limit of the overall analytical procedure (at the cost of a longer separation time). Experimental parameters such as flow rate and chemical recovery were tested. A further goal was to develop a compact, portable system that is easy to use. The new TSM allows various separation processes to be carried out with simple control via laptop. (orig.)

  2. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    Directory of Open Access Journals (Sweden)

    Hwisoo Eom

    2015-06-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.

  3. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    Science.gov (United States)

    Eom, Hwisoo; Lee, Sang Hun

    2015-06-12

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.

  4. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports the human operators, but they cannot express the change in the operators' workload, i.e., whether it is increased or decreased. Before considering automation rates, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports the human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.
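    One minimal reading of the index's stated boundary behaviour (index = 1 when there is no automation) is a ratio of the operators' task load with the automation in place to the fully manual baseline, so that values below 1 indicate a reduced load. This is a hedged sketch only; the paper's actual formulation may differ:

```python
# Hypothetical automation-aided task load index as a load ratio.
# With no automation, load_with_automation == load_manual, so index == 1.

def task_load_index(load_with_automation, load_manual):
    """Ratio of operator task load with automation to the manual baseline."""
    if load_manual <= 0:
        raise ValueError("manual baseline load must be positive")
    return load_with_automation / load_manual
```

Under this reading, an index of 0.5 would mean the automation halves the operators' task load, while an index above 1 would flag automation that adds load.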

  5. Ground effects on the stability of separated flow around an airfoil at low Reynolds numbers

    Science.gov (United States)

    He, Wei; Yu, Peng; Li, Larry K. B.

    2017-11-01

    We perform a BiGlobal stability analysis of the separated flow around a NACA 4415 airfoil at low Reynolds numbers (Re = 300 - 1000) and a high angle of attack α = 20°, with a focus on the effect of the airfoil's proximity to a moving ground. The results show that the most dominant perturbation is the Kelvin-Helmholtz mode and that this traveling mode becomes less unstable as the airfoil approaches the ground, although this stabilizing effect diminishes with increasing Reynolds number. By performing a Floquet analysis, we find that this ground effect can also stabilize secondary instabilities. This numerical-theoretical study shows that the ground can have a significant influence on the stability of separated flow around an airfoil at low Reynolds numbers, which could have implications for the design of micro aerial vehicles and for the understanding of natural flyers such as insects and birds. This work was supported by the Research Grants Council of Hong Kong (Project Nos. 16235716 and 26202815) and the Special Program for Applied Research on Super Computation of the NSFC-Guangdong Joint Fund (the second phase) under Grant No. U1501501.

  6. Direct current insulator based dielectrophoresis (DC-iDEP) microfluidic chip for blood plasma separation

    OpenAIRE

    Mohammadi, Mahdi

    2015-01-01

    Lab-on-a-Chip (LOC) integrated microfluidics has been a powerful tool for new developments in analytical chemistry. These microfluidic systems enable the miniaturization, integration and automation of complex biochemical assays by reducing reagent use and enabling portability. Cell and particle separation in microfluidic systems has recently gained significant attention in many sample preparations for clinical procedures. Direct-current insulator-based dielectrophoresis (DC-iDEP) ...

  7. Evaluating remotely sensed plant count accuracy with differing unmanned aircraft system altitudes, physical canopy separations, and ground covers

    Science.gov (United States)

    Leiva, Josue Nahun; Robbins, James; Saraswat, Dharmendra; She, Ying; Ehsani, Reza

    2017-07-01

    This study evaluated the effect of flight altitude and canopy separation of container-grown Fire Chief™ arborvitae (Thuja occidentalis L.) on counting accuracy. Images were taken at 6, 12, and 22 m above the ground using unmanned aircraft systems. Plants were spaced to achieve three canopy separation treatments: 5 cm between canopy edges, canopy edges touching, and 5 cm of canopy edge overlap. Plants were placed on two different ground covers: black fabric and gravel. A counting algorithm was trained using Feature Analyst®. Total counting error, false positives, and unidentified plants were reported for the images analyzed. In general, total counting error was smaller when plants were fully separated. The effect of ground cover on counting accuracy varied with the counting algorithm. Total counting error for plants placed on gravel (-8) was larger than for those on black fabric (-2); however, false positive counts were similar for black fabric (6) and gravel (6). Nevertheless, output images of plants placed on gravel did not show a negative effect of the ground cover but were impacted by differences in image spatial resolution.
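    The three reported metrics follow directly from counts of actual plants, algorithm detections, and detections matched to real plants. A small sketch of that bookkeeping (the variable names are assumptions, not the study's notation):

```python
# Hypothetical counting-accuracy metrics: "matched" means a detection
# paired with a real plant.

def counting_metrics(n_actual, n_detected, n_matched):
    false_positives = n_detected - n_matched   # detections with no plant
    unidentified = n_actual - n_matched        # plants the algorithm missed
    total_error = n_detected - n_actual        # net over-/under-count
    return total_error, false_positives, unidentified
```

For example, 100 plants, 98 detections, and 94 matches give a total counting error of -2 with 4 false positives and 6 unidentified plants; a negative total error can thus hide offsetting false positives and misses.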

  8. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports the human operators, but they cannot express the change in the operators' workload, i.e., whether it is increased or decreased. Before considering automation rates, whether the adopted automation is beneficial or detrimental should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports the human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.

  9. Role of automation in the ACRV operations

    Science.gov (United States)

    Sepahban, S. F.

    1992-01-01

The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with contingency means of return to Earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of Space Shuttle flights. A wide range of vehicle configurations and system approaches is currently under study. The program requirements focus on minimizing life-cycle costs by ensuring simple operations, built-in reliability, and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources, and processes, while minimizing the interfaces with and impacts on the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy, covering the ground, flight, mission support, and landing and recovery operations, has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation to the extent possible. Candidate functions and processes which may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and Orbiter remote manipulator systems for ACRV berthing; automated passive monitoring and performance trend analysis; and periodic active checkouts during dormant periods. The major ACRV operations concept issues, as they relate to the use of automation, are discussed.

  10. Fully automated dissolution and separation methods for inductively coupled plasma atomic emission spectrometry rock analysis. Application to the determination of rare earth elements

    International Nuclear Information System (INIS)

    Govindaraju, K.; Mevelle, G.

    1987-01-01

In rock analysis laboratories, sample preparation is a serious problem, or even an enormous bottleneck. Because this laboratory is production-oriented, this problem was attacked by progressively automating different steps in rock analysis for major, minor, and trace elements. This effort has been considerably eased by the fact that all sample preparation schemes in this laboratory for the past three decades have been based on an initial lithium borate fusion of rock samples, and all analytical methods on multi-element atomic emission spectrometry, with a switch-over from solid analysis by arc/spark excitation to solution analysis by plasma excitation in 1974. The sample preparation steps which have been automated are: weighing of samples and fluxes, lithium borate fusion, dissolution and dilution of fusion products, and ion-exchange separation of difficult trace elements such as the rare earth elements (REE). During 1985 and 1986, these different unit operations were assembled as peripheral units into a workstation, called LabRobStation. A travelling robot is the master of LabRobStation, with all peripheral units within its reach in a 10 m^2 workspace. As an example of real application, the automated determination of REE, based on more than 8000 samples analysed between 1982 and 1986, is presented. (author)

  11. A Comparison Between Orion Automated and Space Shuttle Rendezvous Techniques

    Science.gov (United States)

Ruiz, Jose O.; Hart, Jeremy

    2010-01-01

The Orion spacecraft will replace the space shuttle and will be the first human spacecraft since the Apollo program to leave low Earth orbit. This vehicle will serve as the cornerstone of a complete space transportation system with a myriad of mission requirements necessitating rendezvous to multiple vehicles in Earth orbit, around the moon, and eventually beyond. These goals will require a complex and robust vehicle that is significantly different from both the space shuttle and the command module of the Apollo program. Historically, orbit operations have been accomplished with heavy reliance on ground support and on manual crew reconfiguration and monitoring. One major difference with Orion is that automation will be incorporated as a key element of the man-vehicle system. The automated system will consist of software devoted to transitioning between events based on a master timeline. This effectively adds a layer of high-level sequencing that moves control of the vehicle from one phase to the next. This type of automated control is not entirely new to spacecraft, since the shuttle uses a version of it during ascent and entry operations. During shuttle orbit operations, however, many of the software modes and hardware switches must be manually configured through the use of printed procedures and instructions voiced from the ground. The goal of the automation scheme on Orion is to extend high-level automation to all flight phases. The move towards automation represents a large shift from current space shuttle operations, and so these new systems will be adopted gradually via various safeguards. These include features such as authority-to-proceed, manual down modes, and functional inhibits. This paper describes the contrast between the manual, ground-supported approach of the space shuttle and the proposed automation of the Orion vehicle. I will introduce typical orbit operations that are common to all rendezvous missions and go on to describe the current Orion automation

  12. A Method of Separation Assurance for Instrument Flight Procedures at Non-Radar Airports

    Science.gov (United States)

    Conway, Sheila R.; Consiglio, Maria

    2002-01-01

    A method to provide automated air traffic separation assurance services during approach to or departure from a non-radar, non-towered airport environment is described. The method is constrained by provision of these services without radical changes or ambitious investments in current ground-based technologies. The proposed procedures are designed to grant access to a large number of airfields that currently have no or very limited access under Instrument Flight Rules (IFR), thus increasing mobility with minimal infrastructure investment. This paper primarily addresses a low-cost option for airport and instrument approach infrastructure, but is designed to be an architecture from which a more efficient, albeit more complex, system may be developed. A functional description of the capabilities in the current NAS infrastructure is provided. Automated terminal operations and procedures are introduced. Rules of engagement and the operations are defined. Results of preliminary simulation testing are presented. Finally, application of the method to more terminal-like operations, and major research areas, including necessary piloted studies, are discussed.

  13. Automated audiometry using apple iOS-based application technology.

    Science.gov (United States)

    Foulad, Allen; Bui, Peggy; Djalilian, Hamid

    2013-11-01

The aim of this study is to determine the feasibility of an Apple iOS-based automated hearing testing application and to compare its accuracy with conventional audiometry. Study design: prospective diagnostic study. Setting: academic medical center. An iOS-based software application was developed to perform automated pure-tone hearing testing on the iPhone, iPod touch, and iPad. To assess device variations and compatibility, preliminary work was performed to compare the standardized sound output (dB) of various Apple device and headset combinations. Forty-two subjects underwent automated iOS-based hearing testing in a sound booth, automated iOS-based hearing testing in a quiet room, and conventional manual audiometry. The maximum difference in sound intensity between the various Apple device and headset combinations was 4 dB. On average, 96% (95% confidence interval [CI], 91%-100%) of the threshold values obtained using the automated test in a sound booth were within 10 dB of the corresponding threshold values obtained using conventional audiometry. When the automated test was performed in a quiet room, 94% (95% CI, 87%-100%) of the threshold values were within 10 dB of those obtained using conventional audiometry. Under standardized testing conditions, 90% of the subjects preferred iOS-based audiometry over conventional audiometry. Apple iOS-based devices provide a platform for automated air-conduction audiometry without requiring extra equipment and yield hearing test results that approach those of conventional audiometry.
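The conventional pure-tone testing being automated here typically follows a staircase rule such as the modified Hughson-Westlake procedure (descend 10 dB after a response, ascend 5 dB after a miss, stop when a level is confirmed twice). Whether the iOS application implements exactly this rule is an assumption; the sketch below only illustrates the general procedure against a simulated listener.

```python
# Sketch of a 10-down/5-up pure-tone staircase (modified Hughson-Westlake
# style), the kind of rule conventional audiometry automates. Whether the
# iOS app uses exactly this rule is an assumption for illustration.

def staircase_threshold(hears, start=40, floor=-10, ceiling=100):
    """hears(level_dB) -> bool. Returns an estimated threshold in dB HL."""
    level = start
    responses = {}                      # responses counted per level
    for _ in range(50):                 # safety cap on trial count
        if hears(level):
            responses[level] = responses.get(level, 0) + 1
            if responses[level] >= 2:   # level confirmed twice: done
                return level
            level = max(floor, level - 10)   # heard: descend 10 dB
        else:
            level = min(ceiling, level + 5)  # missed: ascend 5 dB
    return level

# simulated listener with a true threshold of 25 dB HL
est = staircase_threshold(lambda db: db >= 25)
assert abs(est - 25) <= 5
```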

  14. Knowledge-based automated radiopharmaceutical manufacturing for Positron Emission Tomography

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1991-01-01

This article describes the application of basic knowledge engineering principles to the design of automated synthesis equipment for radiopharmaceuticals used in Positron Emission Tomography (PET). Before discussing knowledge programming, an overview of the development of automated radiopharmaceutical synthesis systems for PET is presented. Since knowledge systems rely on information obtained from machine transducers, a discussion of the uses of sensory feedback in today's automated systems follows. Next, the operation of these automated systems is contrasted with radiotracer production carried out by chemists, and the rationale for and basic concepts of knowledge-based programming are explained. Finally, a prototype knowledge-based system supporting automated radiopharmaceutical manufacturing of 18FDG at Brookhaven National Laboratory (BNL) is described, using 1stClass, a commercially available PC-based expert system shell.

  15. Network-based automation for SMEs

    DEFF Research Database (Denmark)

    Parizi, Mohammad Shahabeddini; Radziwon, Agnieszka

    2017-01-01

The implementation of appropriate automation concepts which increase productivity in Small and Medium Sized Enterprises (SMEs) requires a lot of effort, due to their limited resources. Therefore, it is strongly recommended for small firms to open up to external sources of knowledge, which...... could be obtained through network interaction. Based on two extreme cases of SMEs representing low-tech industry, and an in-depth analysis of their manufacturing facilities, this paper presents how collaboration between firms embedded in a regional ecosystem could result in implementation of new...... with other members of the same regional ecosystem. The findings highlight two main automation-related areas where manufacturing SMEs could leverage external sources of knowledge: assistance in defining the automation problem, as well as in appropriate solution and provider selection. Consequently...

  16. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Xiong, D

    2001-01-01

Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  17. Space debris removal using a high-power ground-based laser

    Energy Technology Data Exchange (ETDEWEB)

    Monroe, D.K.

    1993-12-31

The feasibility and practicality of using a ground-based laser (GBL) to remove artificial space debris are examined. Physical constraints indicate that a reactor-pumped laser (RPL) may be best suited for this mission because of its capabilities for multimegawatt output, long run times, and near-diffraction-limited initial beams. Simulations of a laser-powered debris removal system indicate that a 5-MW RPL with a 10-meter-diameter beam director and adaptive optics capabilities can deorbit 1-kg debris from space station altitudes. Larger debris can be deorbited or transferred to safer orbits after multiple laser engagements. A ground-based laser system may be the only realistic way to access and remove some 10,000 separate objects that have velocities in the neighborhood of 7 km/sec and are spatially distributed over some 10^10 km^3 of space.
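A back-of-the-envelope check of the deorbit claim can be made with a momentum-coupling model, in which each joule of delivered laser energy imparts an impulse Cm via surface ablation. The coupling coefficient and the required delta-v below are assumed round numbers for illustration, not figures from the simulations in the abstract.

```python
# Toy momentum-coupling estimate for laser deorbit: an ablation jet imparts
# an impulse of Cm newton-seconds per joule of delivered laser energy.
# Cm and the required delta-v are assumed values, not from the paper.

def energy_for_deorbit_mj(mass_kg, dv_m_s, cm_ns_per_j):
    impulse = mass_kg * dv_m_s          # total impulse needed, N*s
    return impulse / cm_ns_per_j / 1e6  # delivered energy, MJ

# 1-kg debris, ~150 m/s retrograde delta-v to lower perigee from ~400 km,
# assumed coupling Cm = 5e-5 N*s/J (typical order for laser ablation)
e_mj = energy_for_deorbit_mj(1.0, 150.0, 5e-5)
assert 1 <= e_mj <= 10   # a few MJ: seconds of 5-MW output, before losses
```

Atmospheric transmission, spot size, and pulse-train efficiency would raise the real energy requirement, which is why adaptive optics and a large beam director matter.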

  18. Recent advances in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Fujita, Katsuhide; Robu, Valentin

    2016-01-01

    This book covers recent advances in Complex Automated Negotiations as a widely studied emerging area in the field of Autonomous Agents and Multi-Agent Systems. The book includes selected revised and extended papers from the 7th International Workshop on Agent-Based Complex Automated Negotiation (ACAN2014), which was held in Paris, France, in May 2014. The book also includes brief introductions about Agent-based Complex Automated Negotiation which are based on tutorials provided in the workshop, and brief summaries and descriptions about the ANAC'14 (Automated Negotiating Agents Competition) competition, where authors of selected finalist agents explain the strategies and the ideas used by them. The book is targeted to academic and industrial researchers in various communities of autonomous agents and multi-agent systems, such as agreement technology, mechanism design, electronic commerce, related areas, as well as graduate, undergraduate, and PhD students working in those areas or having interest in them.

  19. Ground-based photo monitoring

    Science.gov (United States)

    Frederick C. Hall

    2000-01-01

    Ground-based photo monitoring is repeat photography using ground-based cameras to document change in vegetation or soil. Assume those installing the photo location will not be the ones re-photographing it. This requires a protocol that includes: (1) a map to locate the monitoring area, (2) another map diagramming the photographic layout, (3) type and make of film such...

  20. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

Major machines used at the working face include the shearer and the self-advancing frame. The shearer has been changed from a radio-controlled model to a microcomputer-operated machine, with various functions automated. In addition, a system for comprehensively examining operating conditions and natural conditions at the working face is being developed for further automation. The self-advancing frame has been modified from a sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units, and valves must be made smaller and more reliable. In the future the system will be controlled from above ground, with the machines at the working face remote-controlled at the gate and the relevant data transmitted above ground; an automated working face will thus be realized. (2 figs, 1 photo)

  1. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

An environmental investigation of ground water conditions has been undertaken at Wright-Patterson Air Force Base (WPAFB), Ohio to obtain data to assist in the evaluation of a potential removal action to prevent, to the extent practicable, migration of the contaminated ground water across Base boundaries. Field investigations were limited to the central section of the southwestern boundary of Area C and the Springfield Pike boundary of Area B. Further, the study was limited to a maximum depth of 150 feet below grade. The three primary activities of the field investigation were: (1) installation of 22 monitoring wells, (2) collection and analysis of ground water from 71 locations, and (3) measurement of ground water elevations at 69 locations. Volatile organic compounds, including trichloroethylene, perchloroethylene, and/or vinyl chloride, were detected in concentrations exceeding Maximum Contaminant Levels (MCLs) at three locations within the Area C investigation area. Ground water at the Springfield Pike boundary of Area B occurs in two primary units, separated by a thicker-than-expected clay layer. One well within Area B was determined to exceed the MCL for trichloroethylene.

  2. Acoustofluidic bacteria separation

    International Nuclear Information System (INIS)

    Li, Sixing; Huang, Tony Jun; Ma, Fen; Zeng, Xiangqun; Bachman, Hunter; Cameron, Craig E

    2017-01-01

    Bacterial separation from human blood samples can help with the identification of pathogenic bacteria for sepsis diagnosis. In this work, we report an acoustofluidic device for label-free bacterial separation from human blood samples. In particular, we exploit the acoustic radiation force generated from a tilted-angle standing surface acoustic wave (taSSAW) field to separate Escherichia coli from human blood cells based on their size difference. Flow cytometry analysis of the E. coli separated from red blood cells shows a purity of more than 96%. Moreover, the label-free electrochemical detection of the separated E. coli displays reduced non-specific signals due to the removal of blood cells. Our acoustofluidic bacterial separation platform has advantages such as label-free separation, high biocompatibility, flexibility, low cost, miniaturization, automation, and ease of in-line integration. The platform can be incorporated with an on-chip sensor to realize a point-of-care sepsis diagnostic device. (paper)
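The size dependence the device exploits can be illustrated with the standard expression for the primary acoustic radiation force on a small sphere in a standing wave, which scales with particle volume. The material and field values below are rough literature-style numbers chosen only to illustrate the scaling, not parameters from the paper.

```python
import math

# Primary acoustic radiation force on a small sphere in a standing wave
# (Gor'kov / Yosioka-Kawasima form). Material and field values are rough
# illustrative numbers, used to show the r^3 size scaling that lets the
# taSSAW field deflect blood cells far more strongly than bacteria.

def radiation_force(r, rho_p, beta_p, rho_f=998, beta_f=4.5e-10,
                    p0=0.3e6, wavelength=1.5e-3, x=0.1e-3):
    V = (4 / 3) * math.pi * r**3                      # particle volume
    phi = (5*rho_p - 2*rho_f) / (2*rho_p + rho_f) - beta_p / beta_f
    k = 2 * math.pi / wavelength
    return (math.pi * p0**2 * V * beta_f / (2 * wavelength)) * phi * math.sin(2*k*x)

f_rbc = radiation_force(r=3.5e-6, rho_p=1100, beta_p=3.4e-10)    # red blood cell
f_ecoli = radiation_force(r=0.5e-6, rho_p=1160, beta_p=3.3e-10)  # E. coli
# volume (r^3) scaling dominates: cells are deflected far more than bacteria
assert f_rbc / f_ecoli > 100
```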

  4. Polyvinyl-alcohol-based magnetic beads for rapid and efficient separation of specific or unspecific nucleic acid sequences

    International Nuclear Information System (INIS)

    Oster, J.; Parker, Jeffrey; Brassard, Lothar

    2001-01-01

The versatile application of polyvinyl-alcohol-based magnetic M-PVA beads is demonstrated in the separation of genomic DNA, sequence-specific nucleic acid purification, and the binding of bacteria for subsequent DNA extraction and detection. It is shown that nucleic acids can be obtained in high yield and purity using M-PVA beads, making sample preparation efficient, fast, and highly adaptable to automated processes.

  5. A μp based automation system for Raman and Rayleigh spectrometers

    International Nuclear Information System (INIS)

    Kesavamoorthy, R.; Arora, A.K.; Vasumathi, D.

    1988-01-01

A μp based data acquisition cum automation system for Raman and Rayleigh spectrometers is described. The experiments require simultaneous acquisition of different digital data in two separate counters, their storage, and rotation of the grating through a stepper motor in a repetitive cycle. Various modes of operation are selected through a function keyboard. The current status of the experiment is also displayed on a 7-segment, 12-element display unit. The input parameters are fed through a hexadecimal keyboard before the start of the experiment. The stored data can be sent to a printer/terminal or to a PC through a serial port after the completion of the experiment. (author)
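The repetitive cycle described above (read both counters, store the data, step the grating) can be sketched as a simple acquisition loop. The callback names and the simulated hardware are stand-ins, not the actual instrument interface.

```python
# Sketch of the repetitive acquisition cycle: at each grating position two
# counters are read simultaneously, the readings are stored, and the grating
# steps on via the stepper motor. Hardware callbacks here are stand-ins.

def scan(read_counters, step_grating, n_points):
    """read_counters() -> (c1, c2); step_grating() advances one step."""
    spectrum = []
    for _ in range(n_points):
        spectrum.append(read_counters())   # simultaneous dual-channel read
        step_grating()                     # rotate grating one stepper step
    return spectrum

# simulated hardware: counts vary with grating position
pos = {"i": 0}
def read():
    return (10 + pos["i"], 20 + pos["i"])
def step():
    pos["i"] += 1

data = scan(read, step, 3)
assert data == [(10, 20), (11, 21), (12, 22)]
```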

  6. FPGA Accelerator for Wavelet-Based Automated Global Image Registration

    Directory of Open Access Journals (Sweden)

    Baofeng Li

    2009-01-01

Wavelet-based automated global image registration (WAGIR) is fundamental for most remote sensing image processing algorithms and is extremely computation-intensive. With more and more algorithms migrating from ground computing to onboard computing, an efficient dedicated architecture for WAGIR is desired. In this paper, a BWAGIR architecture is proposed based on a block resampling scheme. BWAGIR achieves significant performance by pipelining computational logic, parallelizing the resampling process and the calculation of the correlation coefficient, and parallelizing memory access. A proof-of-concept implementation with 1 BWAGIR processing unit performs at least 7.4X faster than the CL cluster system with 1 node, and at least 3.4X faster than the MPM massively parallel machine with 1 node. Further speedup can be achieved by parallelizing multiple BWAGIR units: the architecture with 5 units achieves a speedup of about 3X against the CL with 16 nodes and a comparable speed to the MPM with 30 nodes. More importantly, the BWAGIR architecture can be deployed onboard economically.
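The correlation-coefficient calculation that BWAGIR pipelines and parallelizes is, at its core, a normalized cross-correlation between a reference block and a candidate block. A plain-Python sketch of that kernel (the FPGA architecture itself is not reproduced here):

```python
# Normalized cross-correlation of two equal-size blocks: the matching
# kernel at the heart of wavelet-based global image registration, shown
# here as plain Python rather than pipelined hardware logic.

def ncc(a, b):
    """Correlation coefficient in [-1, 1] of two equal-length blocks."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma)**2 for x in a) * sum((y - mb)**2 for y in b)) ** 0.5
    return num / den if den else 0.0

ref = [1, 2, 3, 4, 5]
assert abs(ncc(ref, [2, 4, 6, 8, 10]) - 1.0) < 1e-12   # perfectly correlated
assert ncc(ref, [5, 4, 3, 2, 1]) == -1.0               # anti-correlated
```

Registration evaluates this coefficient over many candidate shifts and keeps the maximum, which is why the correlation step dominates the computation and is worth accelerating.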

  8. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    Science.gov (United States)

    2017-05-15

AFRL-SA-WP-SR-2017-0012, reporting period June 2015 - May 2017. Operational Based Vision Assessment Automated Vision Test Collection User Guide. Elizabeth Shoda, Alex... This guide covers the operational based vision assessment automated vision tests, or AVT. Development of the AVT was required to support the threshold-level vision testing capability needed to investigate the

  9. Extended automated separation techniques in destructive neutron activation analysis; application to various biological materials, including human tissues and blood

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Houtman, J.P.W.

    1976-09-01

    Neutron activation analysis may be performed as a multi-element and low-level technique for many important trace elements in biological materials, provided that post-irradiation chemical separations are applied. This paper describes a chemical separation consisting of automated procedures for destruction, distillation, and anion-chromatography. The system developed enables the determination of 14 trace elements in biological materials, viz. antimony, arsenic, bromine, cadmium, chromium, cobalt, copper, gold, iron, mercury, molybdenum, nickel, selenium, and zinc. The aspects of sample preparation, neutron irradiation, gamma-spectrum evaluation, and blank-value contribution are also discussed

  10. Operator-based metric for nuclear operations automation assessment

    Energy Technology Data Exchange (ETDEWEB)

Zacharias, G.L.; Miao, A.X.; Kalkan, A. [Charles River Analytics Inc., Cambridge, MA (United States)] [and others]

    1995-04-01

Continuing advances in real-time computational capabilities will support enhanced levels of smart automation and AI-based decision-aiding systems in the nuclear power plant (NPP) control room of the future. To support development of these aids, we describe in this paper a research tool, and more specifically, a quantitative metric, to assess the impact of proposed automation/aiding concepts in a manner that can account for a number of interlinked factors in the control room environment. In particular, we describe a cognitive operator/plant model that serves as a framework for integrating the operator's information-processing capabilities with his procedural knowledge, to provide insight as to how situations are assessed by the operator, decisions made, procedures executed, and communications conducted. Our focus is on the situation assessment (SA) behavior of the operator, the development of a quantitative metric reflecting overall operator awareness, and the use of this metric in evaluating automation/aiding options. We describe the results of a model-based simulation of a selected emergency scenario, and metric-based evaluation of a range of contemplated NPP control room automation/aiding options. The results demonstrate the feasibility of model-based analysis of contemplated control room enhancements, and highlight the need for empirical validation.

  11. Automated Ground Penetrating Radar hyperbola detection in complex environment

    Science.gov (United States)

    Mertens, Laurence; Lambot, Sébastien

    2015-04-01

Ground Penetrating Radar (GPR) systems are commonly used in many applications to detect, among others, buried targets (various types of pipes, landmines, tree roots...), which, in a cross-section, theoretically present a particular hyperbolic-shaped signature resulting from the antenna radiation pattern. Considering the large quantity of information acquired during a field campaign, manual detection of these hyperbolas is barely possible, so a quick and automated detection is needed. However, this task can prove laborious on real field data because the hyperbolas are often ill-shaped due to the heterogeneity of the medium and to instrumentation clutter. We propose a new detection algorithm for well- and ill-shaped GPR reflection hyperbolas, developed especially for complex field data. The algorithm emulates human pattern-recognition expertise to identify hyperbola apexes. The main principle relies on fitting the GPR image edge dots, detected with a Canny filter, to analytical hyperbolas, considering the object as a punctual disturbance with a physical constraint on the parameters. A long phase of observation of a large number of ill-shaped hyperbolas in various complex media led to the definition of smart criteria characterizing the hyperbolic shape and to the choice of value ranges acceptable for an edge dot to correspond to the apex of a specific hyperbola. These values were defined to fit the ambiguity zone for the human brain and have the particularity of being functional in most heterogeneous media. Furthermore, irregularity is taken into account by defining a buffer zone around the theoretical hyperbola within which edge dots must fall to belong to that specific hyperbola.
First, the method was tested in laboratory conditions over tree roots and over PVC pipes with both time- and frequency-domain radars
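The fitting step described above can be sketched under the standard assumption that a point target at lateral position x0 and two-way travel time t0 produces edge dots along t(x) = sqrt(t0^2 + ((x - x0)/v)^2). The grid-search fit, the buffer tolerance, and all parameter values below are illustrative stand-ins, not the paper's smart criteria.

```python
import math

# Sketch of the core fitting idea: edge dots near a buried point target lie
# on t(x) = sqrt(t0^2 + ((x - x0)/v)^2). A coarse grid search fits the apex
# (x0, t0), and dots within a tolerance band around the fitted curve are
# attributed to that hyperbola. All parameters are illustrative.

def fit_hyperbola(dots, v=0.1, x0s=range(0, 101), t0s=range(1, 51)):
    def err(x0, t0):
        return sum((t - math.hypot(t0, (x - x0) / v))**2 for x, t in dots)
    return min(((x0, t0) for x0 in x0s for t0 in t0s), key=lambda p: err(*p))

def in_buffer(dot, x0, t0, v=0.1, tol=1.0):
    """True if the edge dot falls inside the buffer zone of the hyperbola."""
    x, t = dot
    return abs(t - math.hypot(t0, (x - x0) / v)) <= tol

# synthetic edge dots from an apex at x0=50, t0=20 (arbitrary units)
dots = [(x, math.hypot(20, (x - 50) / 0.1)) for x in range(40, 61, 2)]
x0, t0 = fit_hyperbola(dots)
assert (x0, t0) == (50, 20)
assert all(in_buffer(d, x0, t0) for d in dots)
```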

  12. Texture segregation, surface representation and figure-ground separation.

    Science.gov (United States)

    Grossberg, S; Pessoa, L

    1998-09-01

A widespread view is that most texture segregation can be accounted for by differences in the spatial frequency content of texture regions. Evidence from both psychophysical and physiological studies indicates, however, that beyond these early filtering stages there are stages of 3-D boundary segmentation and surface representation that are used to segregate textures. Chromatic segregation of element-arrangement patterns, as studied by Beck and colleagues, cannot be completely explained by the filtering mechanisms previously employed to account for achromatic segregation. An element-arrangement pattern is composed of two types of elements that are arranged differently in different image regions (e.g. vertically on top and diagonally on the bottom). FACADE theory mechanisms that have previously been used to explain data about 3-D vision and figure-ground separation are here used to simulate chromatic texture segregation data, including data with equiluminant elements on dark or light homogeneous backgrounds, or backgrounds composed of vertical and horizontal dark or light stripes, or horizontal notched stripes. These data include the fact that segregation of patterns composed of red and blue squares decreases with increasing luminance of the interspaces. Asymmetric segregation properties under 3-D viewing conditions with the equiluminant elements close or far are also simulated. Two key model properties are a spatial impenetrability property that inhibits boundary grouping across regions with non-collinear texture elements, and a boundary-surface consistency property that uses feedback between boundary and surface representations to eliminate spurious boundary groupings and separate figures from their backgrounds.

  13. Next frontier in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Robu, Valentin

    2015-01-01

    This book focuses on automated negotiations based on multi-agent systems. It is intended for researchers and students in various fields involving autonomous agents and multi-agent systems, such as e-commerce tools, decision-making and negotiation support systems, and collaboration tools. The contents will help them to understand the concept of automated negotiations, negotiation protocols, negotiating agents’ strategies, and the applications of those strategies. In this book, some negotiation protocols focusing on the multiple interdependent issues in negotiations are presented, making it possible to find high-quality solutions for the complex agents’ utility functions. This book is a compilation of the extended versions of the very best papers selected from the many that were presented at the International Workshop on Agent-Based Complex Automated Negotiations.

  14. Automated solid-phase subcloning based on beads brought into proximity by magnetic force.

    Science.gov (United States)

    Hudson, Elton P; Nikoshkov, Andrej; Uhlen, Mathias; Rockberg, Johan

    2012-01-01

In the fields of proteomics, metabolic engineering, and synthetic biology there is a need for high-throughput and reliable cloning methods to facilitate construction of expression vectors and genetic pathways. Here, we describe a new approach for solid-phase cloning in which both the vector and the gene are immobilized on separate paramagnetic beads and brought into proximity by magnetic force. Ligation events were directly evaluated using fluorescence-based microscopy and flow cytometry. The highest ligation efficiencies were obtained when gene- and vector-coated beads were brought into close contact by application of a magnet during the ligation step. An automated procedure was developed using a laboratory workstation to transfer genes into various expression vectors, and more than 95% correct clones were obtained in a number of applications. The method presented here is suitable for efficient subcloning in an automated manner to rapidly generate a large number of gene constructs in various vectors intended for high-throughput applications.

  15. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters; the extension promises to reduce the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent on to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  16. Partitioning the LIS/OTD Lightning Climatological Dataset into Separate Ground and Cloud Flash Distributions

    Science.gov (United States)

    Koshak, W. J.; Solarkiewicz, R. J.

    2009-01-01

    Presently, it is not well understood how to best model nitrogen oxides (NOx) emissions from lightning because lightning is highly variable. Peak current, channel length, channel altitude, stroke multiplicity, and the number of flashes that occur in a particular region (i.e., flash density) all influence the amount of lightning NOx produced. Moreover, these 5 variables are not the same for ground and cloud flashes; e.g., cloud flashes normally have lower peak currents, higher altitudes, and higher flash densities than ground flashes [see (Koshak, 2009) for additional details]. Because the existing satellite observations of lightning (Fig. 1) from the Lightning Imaging Sensor/Optical Transient Detector (LIS/OTD) do not distinguish between ground and cloud flashes, which produce different amounts of NOx, it is very difficult to accurately account for the regional/global production of lightning NOx. Hence, the ability to partition the LIS/OTD lightning climatology into separate ground and cloud flash distributions would substantially benefit the atmospheric chemistry modeling community. NOx indirectly influences climate because it controls the concentration of ozone and hydroxyl radicals in the atmosphere. The importance of lightning-produced NOx is emphasized throughout the scientific literature (see for example, Huntrieser et al. 1998). In fact, lightning is the most important NOx source in the upper troposphere, with a global production rate estimated to vary between 2 and 20 Tg(N) yr^-1 (Lee et al., 1997), with more recent estimates of about 6 Tg(N) yr^-1 (Martin et al., 2007). In order to make accurate predictions, global chemistry/climate models (as well as regional air quality models) must more accurately account for the effects of lightning NOx.
In particular, the NASA Goddard Institute for Space Studies (GISS) Model E (Schmidt et al., 2005) and the GEOS-CHEM global chemical transport model (Bey et al., 2001) would each benefit from a partitioning of the

  17. Automated Segmentation of High-Resolution Photospheric Images of Active Regions

    Science.gov (United States)

    Yang, Meng; Tian, Yu; Rao, Changhui

    2018-02-01

    As ground-based, large-aperture solar telescopes with adaptive optics (AO) deliver ever-increasing resolving ability, more accurate sunspot identification and characterization are required. In this article, we develop a set of automated segmentation methods for high-resolution solar photospheric images. Firstly, a local-intensity-clustering level-set method is applied to roughly separate solar granulation and sunspots. Then reinitialization-free level-set evolution is adopted to adjust the boundaries of the photospheric patch; an adaptive intensity threshold is used to discriminate between umbra and penumbra; and light bridges are selected according to their regional properties from candidates produced by morphological operations. The proposed method is applied to high-resolution solar TiO 705.7-nm images taken by the 151-element AO system and the Ground-Layer Adaptive Optics prototype system at the 1-m New Vacuum Solar Telescope of the Yunnan Observatory. Experimental results show that the method achieves satisfactory robustness and efficiency with low computational cost on high-resolution images. The method can also be applied to full-disk images, and the calculated sunspot areas correlate well with data given by the National Oceanic and Atmospheric Administration (NOAA).
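    The adaptive intensity threshold used to separate umbra from penumbra can be illustrated with an Otsu-style between-class-variance maximization (a generic sketch, not the authors' code; the synthetic intensity values below are invented for illustration):

```python
def otsu_threshold(pixels, bins=64):
    """Adaptive intensity threshold chosen by maximizing between-class variance."""
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for p in pixels:
        hist[min(int((p - lo) / width), bins - 1)] += 1
    total = len(pixels)
    sum_all = sum((lo + (i + 0.5) * width) * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = lo, -1.0, 0.0, 0.0
    for i in range(bins - 1):
        centre = lo + (i + 0.5) * width
        w0 += hist[i]
        sum0 += centre * hist[i]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, lo + (i + 1) * width
    return best_t

# synthetic photospheric patch: dark umbra (~0.2) and brighter penumbra (~0.6)
umbra = [0.18, 0.20, 0.22, 0.19, 0.21]
penumbra = [0.58, 0.60, 0.62, 0.59, 0.61]
t = otsu_threshold(umbra + penumbra)  # falls between the two intensity clusters
```

In practice such a threshold would be computed inside the photospheric patch already delimited by the level-set step, so granulation does not bias the histogram.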

  18. The effects of automated artifact removal algorithms on electroencephalography-based Alzheimer’s disease diagnosis

    Directory of Open Access Journals (Sweden)

    Raymundo eCassani

    2014-03-01

    Over the last decade, electroencephalography (EEG) has emerged as a reliable tool for the diagnosis of cortical disorders such as Alzheimer's disease (AD). EEG signals, however, are susceptible to several artifacts, such as ocular, muscular, movement, and environmental. To overcome this limitation, existing diagnostic systems commonly depend on experienced clinicians to manually select artifact-free epochs from the collected multi-channel EEG data. Manual selection, however, is a tedious and time-consuming process, rendering the diagnostic system only "semi-automated." Notwithstanding, a number of EEG artifact removal algorithms have been proposed in the literature. The (dis)advantages of using such algorithms in automated AD diagnostic systems, however, have not been documented; this paper aims to fill this gap. Here, we investigate the effects of three state-of-the-art automated artifact removal (AAR) algorithms (both alone and in combination with each other) on AD diagnostic systems based on four different classes of EEG features, namely spectral, amplitude modulation rate of change, coherence, and phase. The three AAR algorithms tested are statistical artifact rejection (SAR), blind source separation based on second-order blind identification and canonical correlation analysis (BSS-SOBI-CCA), and wavelet-enhanced independent component analysis (wICA). Experimental results based on 20-channel resting-awake EEG data collected from 59 participants (20 patients with mild AD, 15 with moderate-to-severe AD, and 24 age-matched healthy controls) showed the wICA algorithm alone outperforming other enhancement algorithm combinations across three tasks: diagnosis (control vs. mild vs. moderate), early detection (control vs. mild), and disease progression (mild vs. moderate), thus opening the doors for fully-automated systems that can assist clinicians with early detection of AD, as well as disease severity progression assessment.
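    The idea behind statistical artifact rejection can be illustrated with a minimal amplitude-based criterion: flag epochs whose peak amplitude is an outlier relative to the rest of the recording. This is a generic sketch in the spirit of SAR, not the specific algorithm evaluated in the paper; the threshold multiplier and synthetic data are assumptions:

```python
import statistics

def reject_artifact_epochs(epochs, k=3.0):
    """Flag epochs whose peak absolute amplitude exceeds mean + k*SD of
    per-epoch peaks (a simple statistical-rejection criterion)."""
    peaks = [max(abs(s) for s in ep) for ep in epochs]
    mu = statistics.mean(peaks)
    sd = statistics.stdev(peaks)
    return [i for i, p in enumerate(peaks) if p > mu + k * sd]

# synthetic single-channel epochs (microvolts); epoch 3 contains an ocular artifact
epochs = [[5, -4, 6, -5]] * 3 + [[5, -4, 150, -5]] + [[6, -5, 4, -6]] * 4
bad = reject_artifact_epochs(epochs, k=2.0)  # → [3]
```

Full AAR pipelines such as BSS-SOBI-CCA and wICA go further by decomposing the multi-channel signal and removing artifact components rather than discarding whole epochs.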

  19. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  20. Investigating the Complexity of Transitioning Separation Assurance Tools into NextGen Air Traffic Control

    Science.gov (United States)

    Gomez, Ashley Nicole; Martin, Lynne Hazel; Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Mercer, Joey; Prevot, Thomas

    2013-01-01

    In a study, that introduced ground-based separation assurance automation through a series of envisioned transitional phases of concept maturity, it was found that subjective responses to scales of workload, situation awareness, and acceptability in a post run questionnaire revealed as-predicted results for three of the four study conditions but not for the third, Moderate condition. The trend continued for losses of separation (LOS) where the number of LOS events were far greater than expected in the Moderate condition. To offer an account of why the Moderate condition was perceived to be more difficult to manage than predicted, researchers examined the increase in amount and complexity of traffic, increase in communication load, and increased complexities as a result of the simulation's mix of aircraft equipage. Further analysis compared the tools presented through the phases, finding that controllers took advantage of the informational properties of the tools presented but shied away from using their decision support capabilities. Taking into account similar findings from other studies, it is suggested that the Moderate condition represented the first step into a "shared control" environment, which requires the controller to use the automation as a decision making partner rather than just a provider of information. Viewed in this light, the combination of tools offered in the Moderate condition was reviewed and some tradeoffs that may offset the identified complexities were suggested.

  1. Vision-based guidance for an automated roving vehicle

    Science.gov (United States)

    Griffin, M. D.; Cunningham, R. T.; Eskenazi, R.

    1978-01-01

    A controller designed to guide an automated vehicle to a specified target without external intervention is described. The intended application is to the requirements of planetary exploration, where substantial autonomy is required because of the prohibitive time lags associated with closed-loop ground control. The guidance algorithm consists of a set of piecewise-linear control laws for velocity and steering commands, and is executable in real time with fixed-point arithmetic. The use of a previously-reported object tracking algorithm for the vision system to provide position feedback data is described. Test results of the control system on a breadboard rover at the Jet Propulsion Laboratory are included.
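    The piecewise-linear control laws mentioned above can be sketched as a proportional response inside a linear band with saturation outside it. This is an illustrative form only; the band and saturation limit are hypothetical parameters, not the breadboard rover's actual gains:

```python
def piecewise_linear(x, band, limit):
    """Proportional response within +/-band, saturating at +/-limit outside it.
    The gain is limit/band, so the law is continuous at the band edges."""
    if x > band:
        return limit
    if x < -band:
        return -limit
    return (limit / band) * x

# hypothetical steering law: turn-rate command proportional to heading error,
# saturating at 10 deg/s once the error exceeds a 10-degree linear band
cmd = piecewise_linear(25.0, band=10.0, limit=10.0)  # → 10.0 (saturated)
```

Such laws are attractive for fixed-point arithmetic because they need only comparisons and one multiply per control cycle.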

  2. An automation of physics research on base of open standards

    International Nuclear Information System (INIS)

    Smirnov, V.A.

    1997-01-01

    A wide range of problems is considered concerning the automation of experimental set-ups at the Laboratory of High Energies, JINR, oriented toward research in high-energy and relativistic nuclear physics. The electronics of the automation systems discussed is implemented in open standards. The main peculiarities of the process of creating automation tools for experimental set-ups, stands, and accelerators are shown. Possibilities for building accelerator control subsystems on the basis of industrial automation methods and techniques are also discussed.

  3. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
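    The quoted AUC score measures how well the ranker orders injected planets above false positives, and the metric can be computed from classifier scores alone. A self-contained sketch of the rank-based AUC (the scores below are made up; the random-forest/SOM ranker itself is omitted):

```python
def auc_score(pos_scores, neg_scores):
    """Rank-based AUC: the probability that a true candidate outranks a
    false positive, with ties counted as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# hypothetical classifier scores: injected transits should rank above false positives
injected = [0.9, 0.8, 0.75, 0.6]
false_pos = [0.7, 0.4, 0.3, 0.2]
auc = auc_score(injected, false_pos)  # → 0.9375
```

An AUC of 0.5 corresponds to random ranking, so 97.6% indicates that nearly every injected transit is scored above nearly every false positive.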

  4. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers, as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. The paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  5. An ontology for automated scenario-based training

    NARCIS (Netherlands)

    Peeters, M.M.M.; Bosch, K. van den; Neerincx, M.A.; Meyer, J.J.Ch.

    2014-01-01

    An intelligent system for automated scenario-based training (SBT) needs knowledge about the training domain, events taking place in the simulated environment, the behaviour of the participating characters, and teaching strategies for effective learning. This knowledge base should be theoretically

  6. Automated synthesis of photovoltaic-quality colloidal quantum dots using separate nucleation and growth stages

    KAUST Repository

    Pan, Jun; El-Ballouli, AlA'A O.; Rollny, Lisa R.; Voznyy, Oleksandr; Burlakov, Victor M.; Goriely, Alain; Sargent, E. H.; Bakr, Osman

    2013-01-01

    As colloidal quantum dot (CQD) optoelectronic devices continue to improve, interest grows in the scaled-up and automated synthesis of high-quality materials. Unfortunately, all reports of record-performance CQD photovoltaics have been based on small

  7. MetaSensing's FastGBSAR: ground based radar for deformation monitoring

    Science.gov (United States)

    Rödelsperger, Sabine; Meta, Adriano

    2014-10-01

    The continuous monitoring of ground deformation and structural movement has become an important task in engineering. MetaSensing introduces a novel sensor system, the Fast Ground Based Synthetic Aperture Radar (FastGBSAR), based on innovative technologies that have already been successfully applied to airborne SAR applications. The FastGBSAR allows the remote sensing of deformations of a slope or infrastructure from distances of up to 4 km. The FastGBSAR can be set up in two different configurations: in Real Aperture Radar (RAR) mode it is capable of accurately measuring displacements along a linear range profile, ideal for monitoring vibrations of structures like bridges and towers (displacement accuracy up to 0.01 mm); modal parameters can be determined within half an hour. Alternatively, in Synthetic Aperture Radar (SAR) configuration it produces two-dimensional displacement images with an acquisition time of less than 5 seconds, ideal for monitoring areal structures like dams, landslides and open pit mines (displacement accuracy up to 0.1 mm). The MetaSensing FastGBSAR is the first ground-based SAR instrument on the market able to produce two-dimensional deformation maps at this high acquisition rate. As a result, deformation time series with high temporal and spatial resolution can be generated, giving detailed information useful for determining the deformation mechanisms involved and, eventually, for predicting an impending failure. The system is fully portable and can be quickly installed on bedrock or a basement. Data acquisition and processing can be fully automated, leading to low effort in instrument operation and maintenance. Due to the short acquisition time of the FastGBSAR, the coherence between two acquisitions is very high and phase unwrapping is simplified enormously. This yields a high density of resolution cells with good quality and high reliability of the acquired deformations. The deformation maps can directly be used as input into an Early
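    The displacement measurement rests on standard radar interferometry: a line-of-sight displacement appears as a phase change between two acquisitions, scaled by the wavelength over the two-way path. A minimal sketch (the Ku-band wavelength used here is an assumed illustrative value, not a FastGBSAR specification):

```python
import math

def los_displacement_mm(delta_phase_rad, wavelength_mm):
    """Line-of-sight displacement from an interferometric phase change.
    The factor 4*pi accounts for the two-way radar path (out and back)."""
    return wavelength_mm * delta_phase_rad / (4.0 * math.pi)

# hypothetical Ku-band wavelength of ~17.4 mm; a phase change of pi/2 rad
d = los_displacement_mm(math.pi / 2, 17.4)  # → 2.175 mm
```

The sub-millimetre accuracies quoted above follow from this relation: resolving a small fraction of a radian of phase resolves a small fraction of the wavelength in displacement, provided the phase can be unwrapped reliably between acquisitions.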

  8. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  9. 6C polarization analysis - seismic direction finding in coherent noise, automated event identification, and wavefield separation

    Science.gov (United States)

    Schmelzbach, C.; Sollberger, D.; Greenhalgh, S.; Van Renterghem, C.; Robertsson, J. O. A.

    2017-12-01

    velocities of multiple, interfering arrivals in one time window. We demonstrate how this property can be exploited to separate the wavefield into its elastic wave-modes and to isolate or suppress waves arriving from specific directions (directional filtering), both in a fully automated fashion.

  10. Mechanisms of time-based figure-ground segregation.

    Science.gov (United States)

    Kandil, Farid I; Fahle, Manfred

    2003-11-01

    Figure-ground segregation can rely on purely temporal information, that is, on short temporal delays between positional changes of elements in figure and ground (Kandil, F.I. & Fahle, M. (2001) Eur. J. Neurosci., 13, 2004-2008). Here, we investigate the underlying mechanisms by measuring temporal segregation thresholds for various kinds of motion cues. Segregation can rely on monocular first-order motion (based on luminance modulation) and second-order motion cues (contrast modulation) with a high temporal resolution of approximately 20 ms. The mechanism can also use isoluminant motion with a reduced temporal resolution of 60 ms. Figure-ground segregation can be achieved even at presentation frequencies too high for human subjects to inspect successive frames individually. In contrast, when stimuli are presented dichoptically, i.e. separately to both eyes, subjects are unable to perceive any segregation, irrespective of temporal frequency. We propose that segregation in these displays is detected by a mechanism consisting of at least two stages. On the first level, standard motion or flicker detectors signal local positional changes (flips). On the second level, a segregation mechanism combines the local activities of the low-level detectors with high temporal precision. Our findings suggest that the segregation mechanism can rely on monocular detectors but not on binocular mechanisms. Moreover, the results oppose the idea that segregation in these displays is achieved by motion detectors of a higher order (motion-from-motion), but favour mechanisms sensitive to short temporal delays even without activation of higher-order motion detectors.

  11. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results for ground-based near-infrared secondary eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, are presented which have been obtained with ground-based telescopes as part of the GROUSE project.

  12. Automated Detection of Small Bodies by Space Based Observation

    Science.gov (United States)

    Bidstrup, P. R.; Grillmayer, G.; Andersen, A. C.; Haack, H.; Jorgensen, J. L.

    The number of known comets and asteroids is increasing every year. To date, approximately 250,000 of the largest minor planets, as they are usually called, have been catalogued. These discoveries are due to Earth-based observation, which has intensified over the previous decades; additionally, larger telescopes and arrays of telescopes are being used for exploring our Solar System. It is believed that all near-Earth and Main-Belt asteroids with diameters above 10 to 30 km have been discovered, leaving these groups of objects observationally complete. However, the cataloguing of smaller bodies is incomplete, as only a very small fraction of the expected number has been discovered. It is estimated that approximately 10^10 main-belt asteroids in the size range 1 m to 1 km are too faint to be observed using Earth-based telescopes. In order to observe these small bodies, a space-based search must be initiated to remove atmospheric disturbances and to minimize the distance to the asteroids, thereby minimising the required camera integration times. A new method of space-based detection of moving non-stellar objects is currently being developed utilising the Advanced Stellar Compass (ASC) built for spacecraft attitude determination by Ørsted, Danish Technical University. The ASC serves as a backbone technology in the project, as it is capable of fully automated distinction between known and unknown celestial objects. By only processing objects of particular interest, i.e. moving objects, it will be possible to discover small bodies with a minimum of ground control, with the ultimate ambition of a fully automated space search probe. Currently, the ASC is being mounted on the Flying Laptop satellite of the Institute of Space Systems, Universität Stuttgart. It will, after launch into a low Earth polar orbit in 2008, test the detection method with ASC equipment that already has significant in-flight experience. A future use of the ASC based automated

  13. Genetic algorithm based separation cascade optimization

    International Nuclear Information System (INIS)

    Mahendra, A.K.; Sanyal, A.; Gouthaman, G.; Bera, T.K.

    2008-01-01

    The conventional separation cascade design procedure does not give an optimum design because of squaring-off and the variation of flow rates and the separation factor of the element with respect to stage location. Multi-component isotope separation further complicates the design procedure. Cascade design can be stated as a constrained multi-objective optimization: the cascade's expectations from the separating element are multi-objective, i.e. overall separation factor, cut, optimum feed, and separative power. The decision maker may aspire to more comprehensive multi-objective goals, where optimization of the cascade is coupled with exploration of the separating element's optimization vector space. In real life there are many issues which make it important to understand the decision maker's perception of the cost-quality-speed trade-off and the consistency of preferences. The genetic algorithm (GA) is one evolutionary technique that can be used for cascade design optimization. This paper addresses the various issues involved in GA-based multi-objective optimization of the separation cascade. A reference-point-based optimization methodology combining the GA with the Pareto optimality concept was found pragmatic and promising for separation cascades. This method should be explored, tested, examined and further developed for binary as well as multi-component separations. (author)
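    The GA machinery itself can be sketched in a few lines, here reduced to a toy single-stage objective rather than a cascade model (the fitness function, parameter ranges, and GA settings below are purely illustrative assumptions):

```python
import random

random.seed(1)

def fitness(x):
    # toy objective: trade-off between cut (x[0]) and a separation-factor
    # proxy (x[1]); by construction it peaks at x = (0.5, 1.0)
    cut, alpha = x
    return -((cut - 0.5) ** 2 + (alpha - 1.0) ** 2)

def evolve(pop_size=30, generations=60):
    pop = [[random.uniform(0, 1), random.uniform(0.5, 1.5)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]     # crossover
            child = [c + random.gauss(0, 0.02) for c in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()  # converges near the toy optimum (0.5, 1.0)
```

A multi-objective version of the kind the paper describes would replace the scalar fitness with Pareto ranking against a reference point, but the select-crossover-mutate loop is the same.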

  14. Reviewing Automated Sensor-Based Visitor Tracking Studies

    DEFF Research Database (Denmark)

    Mygind, Lærke; Bentsen, Peter

    2017-01-01

    The method of timing and tracking has a long history within visitor studies and exhibition evaluation. With an increase in indoor tracking research, sensor-based positioning tool usage in museums has grown, as have expectations regarding the efficacy of technological sensing systems. This literature review compares methods in terms of obtained level of detail, accuracy, level of obtrusiveness, automation of data entry, ability to time concurrent behaviors, and amount of observer training needed. Although individual sensor-based and traditional observational methods had both strengths and weaknesses, all sensor-based timing and tracking methods provided automated data entry and the opportunity to track a number of visitors simultaneously regardless of the available personnel.

  15. Comparing and combining terrestrial laser scanning with ground-and UAV-based imaging for national-level assessment of soil erosion

    Science.gov (United States)

    McShane, Gareth; James, Mike R.; Quinton, John; Anderson, Karen; DeBell, Leon; Evans, Martin; Farrow, Luke; Glendell, Miriam; Jones, Lee; Kirkham, Matthew; Lark, Murray; Rawlins, Barry; Rickson, Jane; Quine, Tim; Wetherelt, Andy; Brazier, Richard

    2014-05-01

    3D topographic or surface models are increasingly being utilised for a wide range of applications and are established tools in geomorphological research. In this pilot study, 'a cost effective framework for monitoring soil erosion in England and Wales', funded by the UK Department for Environment, Food and Rural Affairs (Defra), we compare methods of collecting topographic measurements via remote sensing for detailed studies of dynamic processes such as erosion and mass movement. The techniques assessed are terrestrial laser scanning (TLS), unmanned aerial vehicle (UAV) photography, and ground-based photography, the latter two processed using structure-from-motion (SfM) 3D reconstruction software. The methods will be applied in regions of different land use, including arable and horticultural, upland and semi-natural habitats, and grassland, to quantify visible erosion pathways at the site scale. Volumetric estimates of soil loss will be quantified using the digital surface models (DSMs) provided by each technique and a modelled pre-erosion surface. Visible erosion and severity will be independently established through each technique, with their results compared and their combined effectiveness assessed. A fixed delta-wing UAV (QuestUAV, http://www.questuav.com/) captures photos from a range of altitudes and angles over the study area, with automated SfM-based processing enabling rapid orthophoto production to support ground-based data acquisition. At sites with erosion features of suitable scale, UAV data will also provide a DSM for volume loss measurement. Terrestrial laser scanning will provide detailed, accurate, high-density measurements of the ground surface over long (100s m) distances. Ground-based photography is anticipated to be most useful for characterising small and difficult-to-view features.
By using a consumer-grade digital camera and an SfM-based approach (using Agisoft Photoscan version 1.0.0, http://www.agisoft.ru/products/photoscan/), less expertise and fewer control

  16. The Harvard Automated Phone Task: new performance-based activities of daily living tests for early Alzheimer's disease.

    Science.gov (United States)

    Marshall, Gad A; Dekhtyar, Maria; Bruno, Jonathan M; Jethwani, Kamal; Amariglio, Rebecca E; Johnson, Keith A; Sperling, Reisa A; Rentz, Dorene M

    2015-12-01

    Impairment in activities of daily living is a major burden for Alzheimer's disease dementia patients and caregivers. Multiple subjective scales and a few performance-based instruments have been validated and proven to be reliable in measuring instrumental activities of daily living in Alzheimer's disease dementia, but less so in amnestic mild cognitive impairment and preclinical Alzheimer's disease. Our aim was to validate the Harvard Automated Phone Task, a new performance-based activities of daily living test for early Alzheimer's disease, which assesses high-level tasks that challenge seniors in daily life. In a cross-sectional study, the Harvard Automated Phone Task was associated with demographics and cognitive measures through univariate and multivariate analyses; ability to discriminate across diagnostic groups was assessed; test-retest reliability with the same and alternate versions was assessed in a subset of participants; and the relationship with regional cortical thickness was assessed in a subset of participants. The setting was an academic clinical research center. One hundred and eighty-two participants were recruited from the community (127 clinically normal elderly and 45 young normal participants) and from memory disorders clinics at Brigham and Women's Hospital and Massachusetts General Hospital (10 participants with mild cognitive impairment). As part of the Harvard Automated Phone Task, participants navigated an interactive voice response system to refill a prescription (APT-Script), select a new primary care physician (APT-PCP), and make a bank account transfer and payment (APT-Bank). The three tasks were scored based on time, errors, and repetitions, from which composite z-scores were derived, as well as a separate report of correct completion of the task. We found that the Harvard Automated Phone Task discriminated well between diagnostic groups (APT-Script: p = 0.002, with APT-PCP also reaching significance) and that performance was significantly associated with executive function (APT-PCP).
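    A composite z-score derived from time, errors, and repetitions can be sketched as follows. The normative means and standard deviations are invented for illustration, and the equal weighting and sign convention (lower raw scores are better, so z-scores are flipped to make higher = better) are assumptions, not the paper's exact scoring rule:

```python
def composite_z(raw, norms):
    """Average of per-measure z-scores (time, errors, repetitions).
    Lower raw values are better, so each z is negated: higher composite = better."""
    zs = []
    for key, value in raw.items():
        mean, sd = norms[key]
        zs.append(-(value - mean) / sd)
    return sum(zs) / len(zs)

# hypothetical normative means/SDs for one APT task
norms = {"time_s": (120.0, 30.0), "errors": (2.0, 1.0), "repetitions": (1.0, 0.5)}
participant = {"time_s": 90.0, "errors": 1.0, "repetitions": 1.0}
z = composite_z(participant, norms)  # faster and fewer errors → positive composite
```

Standardizing each measure before averaging keeps the seconds-scaled timing from dominating the count-scaled error and repetition measures.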

  17. A sensor-based automation system for handling nuclear materials

    International Nuclear Information System (INIS)

    Drotning, W.; Kimberly, H.; Wapman, W.; Darras, D.

    1997-01-01

    An automated system is being developed for handling large payloads of radioactive nuclear materials in an analytical laboratory. The automation system performs unpacking and repacking of payloads from shipping and storage containers, and delivery of the payloads to the stations in the laboratory. The system uses machine vision and force/torque sensing to provide sensor-based control of the automation system in order to enhance system safety, flexibility, and robustness, and to achieve easy remote operation. The automation system also controls the operation of the laboratory measurement systems and their coordination with the robotic system. Particular attention has been given to system design features and analytical methods that provide an enhanced level of operational safety. Independent mechanical gripper interlock and tool release mechanisms were designed to prevent payload mishandling. An extensive Failure Modes and Effects Analysis of the automation system was developed as a safety design analysis tool

  18. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    Science.gov (United States)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; for example, diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. Analyzing such vessel-type-specific morphologic modifications requires the classification of a vessel network into arteries and veins. We previously described a method for the identification and separation of retinal vessel trees, i.e., structural mapping. Here we propose artery-venous classification based on structural mapping and on the identification of color properties characteristic of each vessel type. The mean and standard deviation of the green-channel and hue-channel intensities are computed in a region of interest around each centerline pixel of a vessel. Using this vector of color properties, each centerline pixel is assigned to one of two clusters (artery and vein) obtained by fuzzy C-means clustering. Based on the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is labeled as an artery or a vein. The classification results are compared with a manually annotated ground truth (gold standard). Applied to a dataset of 15 retinal color fundus images, the proposed method achieved an accuracy of 88.28% correctly classified vessel pixels. The automated classification results match the gold standard well, suggesting the method's potential for artery-venous classification and the respective morphology analysis.
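
The record names fuzzy C-means clustering of per-centerline-pixel color features but gives no implementation details. Below is a minimal self-contained sketch of that clustering step, assuming hypothetical 4-D feature values (mean/std of green and hue intensity); it is not the authors' code.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # squared distance of every sample to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
        U = 1.0 / d2 ** (1.0 / (m - 1))        # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Hypothetical 4-D color feature per centerline pixel:
# (mean green, std green, mean hue, std hue); values are illustrative only.
rng = np.random.default_rng(1)
artery_like = rng.normal([0.8, 0.10, 0.20, 0.05], 0.02, size=(50, 4))
vein_like   = rng.normal([0.4, 0.10, 0.60, 0.05], 0.02, size=(50, 4))
X = np.vstack([artery_like, vein_like])

centers, U = fuzzy_c_means(X)
labels = U.argmax(axis=1)                      # hard artery/vein label per pixel
```

A vessel-level label would then follow, as the abstract says, from the majority cluster among its centerline pixels.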

  19. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations of solar-type stars has revealed a need for automated analysis tools. The reason is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open...... are calculated in a consistent way. Here we present a set of automated asteroseismic analysis tools. The main engine of this set of tools is an algorithm for modelling the autocovariance spectra of the stellar acoustic spectra, allowing us to measure not only the frequency of maximum power and the large......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations, which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity...

  20. The evolution of automation and robotics in manned spaceflight

    Science.gov (United States)

    Moser, T. L.; Erickson, J. D.

    1986-01-01

    The evolution of automation on all manned spacecraft including the Space Shuttle is reviewed, and a concept for increasing automation and robotics from the current Shuttle Remote Manipulator System (RMS) to an autonomous system is presented. The requirements for robotic elements are identified for various functions on the Space Station, including extravehicular functions and functions within laboratory and habitation modules which expand man's capacity in space and allow selected teleoperation from the ground. The initial Space Station will employ a telerobot and the necessary knowledge-based systems as an advisory to the crew on monitoring, fault diagnosis, and short-term planning and scheduling.

  1. Earthquake response analysis of embedded reactor building considering soil-structure separation and nonlinearity of soil

    International Nuclear Information System (INIS)

    Ichikawa, T.; Hayashi, Y.; Nakai, S.

    1987-01-01

    In earthquake response analysis for a rigid and massive structure such as a nuclear reactor building, it is important to estimate the effect of soil-structure interaction (SSI) appropriately. In the case of strong earthquakes, nonlinearities such as wall-ground separation and base mat uplift or sliding make the behavior of the soil-structure system complex. If the nuclear reactor building is embedded in relatively soft ground with a surface layer, however, wall-ground separation plays the most important role in the response of the soil-structure system, because base uplift and sliding are expected to be less significant due to the effect of the embedment, and wall-ground friction is usually neglected in design. Nevertheless, the nonlinearity of the ground may affect the wall-ground separation and the response of the structure. These problems have been studied using FEM: some authors used joint elements between the ground and the structure that do not resist tensile force, and others studied the effect of wall-ground separation with non-tension springs. However, the relationship between the ground condition and the effect of the separation has not yet been clarified. To clarify this effect, analyses with an FE model and a lumped-mass model (sway-rocking model) are performed and compared. The key parameter is the ground profile, namely the stiffness of the side soil

  2. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for the accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.

  3. Innovation in metrology: fast automated radiochemical separation and measurement for strontium 89 and 90

    Energy Technology Data Exchange (ETDEWEB)

    Augeray, C.; Galliez, K.; Mouton, M.; Tarlette, L.; Loyen, J.; Fayolle, C.; Gleizes, M. [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)

    2014-07-01

    Measuring radioactivity in food and for radiological monitoring of the environment around nuclear facilities or mining sites requires the quantification of the radioactive isotopes present in the different compartments (liquids or solids), especially the beta emitters. Strontium 89 and 90, both pure beta emitters, are radioactive isotopes of interest: because of their toxicity and the similarity of their chemical and physical behavior to that of calcium, these elements may be found throughout the food chain. After the Fukushima accident, the need to quantify radioactive isotopes such as strontium 89 and 90 quickly became apparent. The technique presented here concerns the determination of the activity concentration of strontium 89 and 90 in water, according to the {sup 89}Sr/{sup 90}Sr ratio. It consists of two stages: chemical separation by ionic chromatography and measurement of the activity concentration of strontium 89 and 90 by the Cerenkov effect. The automated separation has been developed to isolate the isotopes of strontium, in particular the radioactive ones, strontium 89 and 90; the separation can be done within one hour. It was realized by adapting existing analytical chemistry equipment with on-line couplings. The separation protocol is based on the use of the ion-exchange columns of ionic chromatography not as a combined separation and measurement technique for the cation but only as a separation technique. At the release time of the ion to be quantified, a fraction collector allows its recovery. The test portion is then analyzed with a liquid scintillation counter (LSC). The activity concentration is measured by the Cerenkov effect on a quenched sample. The quenching is realized by applying a thin colored film to the sample vial. This color quench makes the strontium 90 counts disappear from the LS spectrum, so that only yttrium 90 ingrowth and strontium 89 decay are measured (E({sup 90}Sr) < E({sup 89}Sr) < E({sup 90}Y))
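
The measurement scheme above rests on standard decay arithmetic: {sup 90}Y grows in from the separated {sup 90}Sr (Bateman equation for a parent-daughter pair), while {sup 89}Sr simply decays. A sketch of that arithmetic, using the accepted half-lives (28.79 y for {sup 90}Sr, 64.05 h for {sup 90}Y, 50.56 d for {sup 89}Sr), could look like:

```python
import math

T12_SR90_H = 28.79 * 365.25 * 24   # Sr-90 half-life, hours
T12_Y90_H  = 64.05                 # Y-90 half-life, hours
T12_SR89_H = 50.56 * 24            # Sr-89 half-life, hours

def lam(t12_h):
    """Decay constant (1/h) from a half-life in hours."""
    return math.log(2) / t12_h

def y90_ingrowth(a_sr90_0, t_h):
    """Y-90 activity grown in at time t from an initially pure Sr-90 source
    (two-member Bateman equation)."""
    ls, ly = lam(T12_SR90_H), lam(T12_Y90_H)
    return a_sr90_0 * ly / (ly - ls) * (math.exp(-ls * t_h) - math.exp(-ly * t_h))

def sr89_decay(a_sr89_0, t_h):
    """Remaining Sr-89 activity after t hours of simple exponential decay."""
    return a_sr89_0 * math.exp(-lam(T12_SR89_H) * t_h)
```

Because the {sup 90}Sr half-life dwarfs that of {sup 90}Y, the daughter reaches near-equilibrium with the parent within a few weeks, which is what makes the timed ingrowth measurement workable.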

  4. Vibration-based Energy Harvesting Systems Characterization Using Automated Electronic Equipment

    Directory of Open Access Journals (Sweden)

    Ioannis KOSMADAKIS

    2015-04-01

    Full Text Available A measurement bench has been developed to fully automate the procedure for the characterization of a vibration-based energy scavenging system. The measurement system is capable of monitoring all important characteristics of a vibration harvesting system (input and output voltage, current, frequency and acceleration values, and other parameters). It is composed of a PC, typical digital measuring instruments (oscilloscope, waveform generator, etc.), certain sensors and actuators, along with a microcontroller-based automation module. The automation of the procedure and the manipulation of the acquired data are performed by LabVIEW software. Typical measurements of a system consisting of a vibrating source, a vibration transducer and an active rectifier are presented.

  5. Ontology-Based Device Descriptions and Device Repository for Building Automation Devices

    Directory of Open Access Journals (Sweden)

    Dibowski Henrik

    2011-01-01

    Full Text Available Device descriptions play an important role in the design and commissioning of modern building automation systems and help reduce design time and costs. However, all established device descriptions are specialized for certain purposes and suffer from several weaknesses. This hinders further design automation, which is strongly needed for increasingly complex building automation systems. To overcome these problems, this paper presents novel Ontology-based Device Descriptions (ODDs), along with a layered ontology architecture, a specific ontology view approach with virtual properties, a generic access interface, a triple-store-based database backend, and a generic search mask GUI with an underlying query generation algorithm. It enables a formal, unified, and extensible specification of building automation devices, ensures their comparability, and facilitates computer-enabled retrieval, selection, and interoperability evaluation, which is essential for an automated design. The scalability of the approach to several ten thousand devices is demonstrated.

  6. New Developments in Membrane-Based Chemical Separations

    National Research Council Canada - National Science Library

    Jirage, Kshama

    1998-01-01

    Membrane-based chemical separation is an emerging field of research. This is because membrane-based separations are potentially less energy intensive and more cost effective than competing separation methods...

  7. The Harvard Automated Phone Task: new performance-based activities of daily living tests for early Alzheimer’s disease

    Science.gov (United States)

    Marshall, Gad A.; Dekhtyar, Maria; Bruno, Jonathan M.; Jethwani, Kamal; Amariglio, Rebecca E.; Johnson, Keith A.; Sperling, Reisa A.; Rentz, Dorene M.

    2015-01-01

    Background Impairment in activities of daily living is a major burden for Alzheimer’s disease dementia patients and caregivers. Multiple subjective scales and a few performance-based instruments have been validated and proven reliable in measuring instrumental activities of daily living in Alzheimer’s disease dementia, but less so in amnestic mild cognitive impairment and preclinical Alzheimer’s disease. Objective To validate the Harvard Automated Phone Task, a new performance-based activities of daily living test for early Alzheimer’s disease, which assesses high-level tasks that challenge seniors in daily life. Design In a cross-sectional study, the Harvard Automated Phone Task was associated with demographics and cognitive measures through univariate and multivariate analyses; the ability to discriminate across diagnostic groups was assessed; test-retest reliability with the same and alternate versions was assessed in a subset of participants; and the relationship with regional cortical thickness was assessed in a subset of participants. Setting Academic clinical research center. Participants One hundred and eighty-two participants were recruited from the community (127 clinically normal elderly and 45 young normal participants) and from memory disorders clinics at Brigham and Women’s Hospital and Massachusetts General Hospital (10 participants with mild cognitive impairment). Measurements As part of the Harvard Automated Phone Task, participants navigated an interactive voice response system to refill a prescription (APT-Script), select a new primary care physician (APT-PCP), and make a bank account transfer and payment (APT-Bank). The 3 tasks were scored based on time, errors, and repetitions, from which composite z-scores were derived, as well as a separate report of correct completion of the task. Results We found that the Harvard Automated Phone Task discriminated well between diagnostic groups (APT-Script: p = 0.002; APT-PCP: p < …). Harvard Automated Phone

  8. Home Automation System Based on Intelligent Transducer Enablers

    Science.gov (United States)

    Suárez-Albela, Manuel; Fraga-Lamas, Paula; Fernández-Caramés, Tiago M.; Dapena, Adriana; González-López, Miguel

    2016-01-01

    This paper presents a novel home automation system named HASITE (Home Automation System based on Intelligent Transducer Enablers), which has been specifically designed to identify and configure transducers easily and quickly. These features are especially useful in situations where many transducers are deployed, since their setup becomes a cumbersome task that consumes a significant amount of time and human resources. HASITE simplifies the deployment of a home automation system by using wireless networks and both self-configuration and self-registration protocols. Thanks to the application of these three elements, HASITE is able to add new transducers by just powering them up. According to the tests performed in different realistic scenarios, a transducer is ready to be used in less than 13 s. Moreover, all HASITE functionalities can be accessed through an API, which also allows for the integration of third-party systems. As an example, an Android application based on the API is presented. Remote users can use it to interact with transducers by just using a regular smartphone or a tablet. PMID:27690031

  9. Home Automation System Based on Intelligent Transducer Enablers.

    Science.gov (United States)

    Suárez-Albela, Manuel; Fraga-Lamas, Paula; Fernández-Caramés, Tiago M; Dapena, Adriana; González-López, Miguel

    2016-09-28

    This paper presents a novel home automation system named HASITE (Home Automation System based on Intelligent Transducer Enablers), which has been specifically designed to identify and configure transducers easily and quickly. These features are especially useful in situations where many transducers are deployed, since their setup becomes a cumbersome task that consumes a significant amount of time and human resources. HASITE simplifies the deployment of a home automation system by using wireless networks and both self-configuration and self-registration protocols. Thanks to the application of these three elements, HASITE is able to add new transducers by just powering them up. According to the tests performed in different realistic scenarios, a transducer is ready to be used in less than 13 s. Moreover, all HASITE functionalities can be accessed through an API, which also allows for the integration of third-party systems. As an example, an Android application based on the API is presented. Remote users can use it to interact with transducers by just using a regular smartphone or a tablet.

  10. Home Automation System Based on Intelligent Transducer Enablers

    Directory of Open Access Journals (Sweden)

    Manuel Suárez-Albela

    2016-09-01

    Full Text Available This paper presents a novel home automation system named HASITE (Home Automation System based on Intelligent Transducer Enablers), which has been specifically designed to identify and configure transducers easily and quickly. These features are especially useful in situations where many transducers are deployed, since their setup becomes a cumbersome task that consumes a significant amount of time and human resources. HASITE simplifies the deployment of a home automation system by using wireless networks and both self-configuration and self-registration protocols. Thanks to the application of these three elements, HASITE is able to add new transducers by just powering them up. According to the tests performed in different realistic scenarios, a transducer is ready to be used in less than 13 s. Moreover, all HASITE functionalities can be accessed through an API, which also allows for the integration of third-party systems. As an example, an Android application based on the API is presented. Remote users can use it to interact with transducers by just using a regular smartphone or a tablet.

  11. NASA space station automation: AI-based technology review

    Science.gov (United States)

    Firschein, O.; Georgeff, M. P.; Park, W.; Neumann, P.; Kautz, W. H.; Levitt, K. N.; Rom, R. J.; Poggio, A. A.

    1985-01-01

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

  12. Partial Automated Alignment and Integration System

    Science.gov (United States)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  13. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  14. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  15. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    Science.gov (United States)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  16. Semi-automated separation of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine: preparation of their N-oxides and NMR comparison with diastereoisomeric rinderine and echinatine.

    Science.gov (United States)

    Colegate, Steven M; Gardner, Dale R; Betz, Joseph M; Panter, Kip E

    2014-01-01

    The diversity of structure and, particularly, stereochemical variation of the dehydropyrrolizidine alkaloids can present challenges for analysis and for the isolation of pure compounds for the preparation of analytical standards and for toxicology studies. To investigate methods for the separation of gram-scale quantities of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine and to compare their NMR spectroscopic data with those of their heliotridine-based analogues echinatine and rinderine. Lycopsamine and intermedine were extracted, predominantly as their N-oxides and along with their acetylated derivatives, from commercial samples of comfrey (Symphytum officinale) root. Alkaloid enrichment involved liquid-liquid partitioning of the crude methanol extract between dilute aqueous acid and n-butanol, reduction of N-oxides and subsequent continuous liquid-liquid extraction of free base alkaloids into CHCl3. The alkaloid-rich fraction was further subjected to semi-automated flash chromatography using boronated soda glass beads or boronated quartz sand. Boronated soda glass bead (or quartz sand) chromatography adapted to a Biotage Isolera Flash Chromatography System enabled large-scale separation (at least up to 1-2 g quantities) of lycopsamine and intermedine. The structures were confirmed using one- and two-dimensional 1H- and 13C-NMR spectroscopy. Examination of the NMR data for lycopsamine, intermedine and their heliotridine-based analogues echinatine and rinderine allowed for some amendments of literature data and provided useful comparisons for determining relative configurations in monoester dehydropyrrolizidine alkaloids. A similar NMR comparison of lycopsamine and intermedine with their N-oxides showed the effects of N-oxidation on some key chemical shifts. A levorotatory shift in specific rotation from +3.29° to -1.5° was observed for lycopsamine when dissolved in ethanol or methanol, respectively. A semi-automated flash

  17. USB port compatible virtual instrument based automation for x-ray diffractometer setup

    International Nuclear Information System (INIS)

    Jayapandian, J.; Sheela, O.K.; Mallika, R.; Thiruarul, A.; Purniah, B.

    2004-01-01

    Windows-based virtual instrument (VI) programs in a graphical language simplify design automation in R and D laboratories. With minimal hardware and maximum support from software, automation becomes easier and more user friendly. A novel design approach for the automation of a SIEMENS x-ray diffractometer setup is described in this paper. The automation is achieved with an indigenously developed virtual instrument program in LabVIEW ver. 6.0 and with a simple hardware design using an 89C2051 micro-controller compatible with the PC's USB port for the total automation of the experiment. (author)

  18. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The ongoing transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We believe that current human factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notions of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  19. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for the determination of radionuclides, using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real-time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less.

  20. Fully automated chest wall line segmentation in breast MRI by using context information

    Science.gov (United States)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, making them impractical for processing the large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes the context information of breast MR imaging and the morphological characteristics of breast tissue to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated on a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlap percentage of 89.33%) is observed when the automated segmentation is compared to a manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes per breast MR image set (28 slices).
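
The DTW step mentioned above is a standard dynamic-programming alignment between two curves. The following is a generic 1-D DTW distance sketch, not the paper's implementation (which compares chest-wall-line candidates against a representative curve):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance between
    two 1-D sequences, using absolute difference as the local cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = minimal cumulative cost aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Candidates with a large DTW distance to the representative would then be discarded as inferior, in the spirit of the voting scheme the abstract describes.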

  1. [Quality of buffy-coat-derived platelet concentrates prepared using automated system terumo automated centrifuge and separator integration (TACSI)].

    Science.gov (United States)

    Zebrowska, Agnieszka; Lipska, Alina; Rogowska, Anna; Bujno, Magdalena; Nedzi, Marta; Radziwon, Piotr

    2011-03-01

    Platelet recovery, and viability, and function is strongly dependent on the method of the preparation of platelet concentrate (PC). The glucose consumption, decrease of pH, release of alpha granules during storage in platelet concentrate impair their clinical effectiveness. To compare of the quality of buffy-coat-derieved platelet concentrates prepared using automatic system terumo automated centrifuge and separator integration (TACSI) and stored over 7 days. PCs were prepared from buffy coats using manual method (group I), or automatic system TACSI (group II). Fifteen PCs prepared from the 5 buffy coats each were stored over 7 days in 22-24 degrees C and tested. Samples were taken from the PCs container on days 1 and 7. The following laboratory tests were performed: number of platelets, platelets derived microparticles, CD62P expression, platelet adhesion, pH, glucose, lactate dehydrogenase activity. We have observed higher expression of CD62P in PCs prepared using manual method compared to the PCs produced automatically Platelet recovery was significantly higher in PCs prepared using automatic systems compare to manual method. Compared to manual methods, automatic system for preparation of buffy coats, is more efficient and enable production of platelets concentrates of higher quality.

  2. Track filter on the basis of a cellular automaton

    International Nuclear Information System (INIS)

    Glazov, A.A.; Kisel', I.V.; Konotopskaya, E.V.; Ososkov, G.A.

    1991-01-01

    A filtering method for tracks in discrete detectors based on a cellular automaton is described. Results of applying this method to experimental data (the ARES spectrometer) are quite successful: a threefold reduction of the input information, with the data grouped according to their membership in separate tracks. This raises the percentage of useful events, which considerably simplifies and accelerates their subsequent recognition. The described cellular automaton for track filtering can be applied successfully on parallel computers and also in on-line mode if a hardware implementation is used. 21 refs.; 11 figs
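
    The cellular-automaton idea can be sketched in a few lines. This is a toy illustration, not the ARES implementation: detector hits are placed on hypothetical (plane, cell) sites, and the update rule simply kills hits with no neighbour in an adjacent plane, so isolated noise dies while contiguous track segments survive.

    ```python
    # Toy cellular-automaton hit filter: iterate a local survival rule
    # until the hit set is stable.

    def ca_filter(hits, max_iter=10):
        hits = set(hits)
        for _ in range(max_iter):
            survivors = {
                (p, c) for (p, c) in hits
                if any((p + dp, c + dc) in hits
                       for dp in (-1, 1) for dc in (-1, 0, 1))
            }
            if survivors == hits:
                break
            hits = survivors
        return hits

    track = {(0, 0), (1, 1), (2, 2), (3, 3)}   # a straight track
    noise = {(0, 7), (3, 9)}                   # isolated hits
    print(sorted(ca_filter(track | noise)))    # only the track survives
    ```

    Because the rule is purely local, each site can be updated independently, which is what makes the approach attractive for parallel or hardware implementation.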

  3. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used to automate routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing a Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expanding the radiochemical applications of the FIA methodology, with the ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements
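
    The peak-area and peak-height quantification of the transient detector signal can be sketched as follows. This is illustrative only: the trapezoidal integration is a generic choice, and the signal values are synthetic, not Hanford data.

    ```python
    # Sketch: quantify a transient flow-cell signal by peak area
    # (trapezoidal rule) and peak height above a flat baseline.

    def peak_metrics(signal, dt=1.0, baseline=0.0):
        """Return (area, height) of a transient peak above the baseline."""
        above = [max(s - baseline, 0.0) for s in signal]
        area = sum((above[i] + above[i + 1]) / 2.0 * dt
                   for i in range(len(above) - 1))
        return area, max(above)

    # Hypothetical counts sampled at 1-s intervals as the Sr zone passes.
    signal = [0.1, 0.1, 0.8, 2.5, 4.0, 2.6, 0.9, 0.2, 0.1]
    area, height = peak_metrics(signal, dt=1.0, baseline=0.1)
    ```

    Either `area` or `height` would then be calibrated against standards of known activity; area is generally less sensitive to flow-rate fluctuations that broaden the peak.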

  4. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  5. Performance Analysis on Transfer Platforms in Frame Bridge Based Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    Hongtao Hu

    2013-01-01

    Full Text Available This paper studies a new automated container terminal (ACT) system which utilizes multistory frame bridges and rail-mounted trolleys to transport containers between the quay and the yard. While typical ACT systems use trucks or automated guided vehicles for transporting containers between quay cranes and yard cranes, the new design uses three types of handling machines, namely, ground trolleys (GTs), transfer platforms (TPs), and frame trolleys (FTs). These three types of handling machines collaborate with one another to transport containers. This study decomposes the system into several subsystems. Each subsystem has one TP and several FTs and GTs dedicated to this TP. Then, a Markov chain model is developed to analyze the throughput of TPs. Finally, the performance of the new ACT system is estimated. Sensitivity analyses of the numbers and processing rates of trolleys are conducted through numerical experiments.
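
    The flavor of a Markov-chain throughput analysis can be shown with a minimal sketch. This is not the paper's model: it treats one transfer platform as a simple M/M/1/K-style birth-death chain with invented arrival and service rates, solves for the stationary distribution, and reads off the throughput.

    ```python
    # Sketch: stationary throughput of one transfer platform (TP) modelled
    # as a birth-death chain with a finite buffer of size K.

    def tp_throughput(arrival, service, buffer_size):
        """Stationary throughput of an M/M/1/K-style queue for one TP."""
        rho = arrival / service
        # Unnormalised stationary probabilities pi_k ~ rho**k, k = 0..K.
        weights = [rho ** k for k in range(buffer_size + 1)]
        z = sum(weights)
        pi = [w / z for w in weights]
        return service * (1.0 - pi[0])  # the TP serves whenever non-empty

    # Hypothetical rates: 4 containers/h arriving, 5/h service, buffer of 3.
    print(round(tp_throughput(arrival=4.0, service=5.0, buffer_size=3), 3))
    ```

    The paper's chain would additionally couple the TP to its dedicated FTs and GTs; the sketch only shows the solve-then-read-throughput pattern.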

  6. Automated Air Traffic Control Operations with Weather and Time-Constraints: A First Look at (Simulated) Far-Term Control Room Operations

    Science.gov (United States)

    Prevot, Thomas; Homola, Jeffrey R.; Martin, Lynne H.; Mercer, Joey S.; Cabrall, Christopher C.

    2011-01-01

    In this paper we discuss results from a recent high fidelity simulation of air traffic control operations with automated separation assurance in the presence of weather and time-constraints. We report findings from a human-in-the-loop study conducted in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. During four afternoons in early 2010, fifteen active and recently retired air traffic controllers and supervisors controlled high levels of traffic in a highly automated environment during three-hour-long scenarios. For each scenario, twelve air traffic controllers operated eight sector positions in two air traffic control areas and were supervised by three front line managers. Controllers worked one-hour shifts, were relieved by other controllers, took a 30-minute break, and worked another one-hour shift. On average, twice today's traffic density was simulated with more than 2200 aircraft per traffic scenario. The scenarios were designed to create peaks and valleys in traffic density, growing and decaying convective weather areas, and expose controllers to heavy and light metering conditions. This design enabled an initial look at a broad spectrum of workload, challenge, boredom, and fatigue in an otherwise uncharted territory of future operations. In this paper we report human/system integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can be an effective and acceptable means to routinely provide very high traffic throughput in the en route airspace.

  7. Wavelet-based ground vehicle recognition using acoustic signals

    Science.gov (United States)

    Choe, Howard C.; Karlsen, Robert E.; Gerhart, Grant R.; Meitzler, Thomas J.

    1996-03-01

    We present, in this paper, a wavelet-based acoustic signal analysis to remotely recognize military vehicles using their sound intercepted by acoustic sensors. Since expedited signal recognition is imperative in many military and industrial situations, we developed an algorithm that provides an automated, fast signal recognition once implemented in a real-time hardware system. This algorithm consists of wavelet preprocessing, feature extraction and compact signal representation, and a simple but effective statistical pattern matching. The current status of the algorithm does not require any training. The training is replaced by human selection of reference signals (e.g., squeak or engine exhaust sound) distinctive to each individual vehicle based on human perception. This allows a fast archiving of any new vehicle type in the database once the signal is collected. The wavelet preprocessing provides time-frequency multiresolution analysis using discrete wavelet transform (DWT). Within each resolution level, feature vectors are generated from statistical parameters and energy content of the wavelet coefficients. After applying our algorithm on the intercepted acoustic signals, the resultant feature vectors are compared with the reference vehicle feature vectors in the database using statistical pattern matching to determine the type of vehicle from where the signal originated. Certainly, statistical pattern matching can be replaced by an artificial neural network (ANN); however, the ANN would require training data sets and time to train the net. Unfortunately, this is not always possible for many real world situations, especially collecting data sets from unfriendly ground vehicles to train the ANN. Our methodology using wavelet preprocessing and statistical pattern matching provides robust acoustic signal recognition. We also present an example of vehicle recognition using acoustic signals collected from two different military ground vehicles. In this paper, we will
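
    The pipeline shape described above (DWT decomposition, per-band statistics, nearest-reference matching) can be sketched briefly. This is not the authors' feature set: it uses a single-level Haar DWT, takes only the mean and energy of each band, and matches by Euclidean distance; the signals and vehicle names are invented.

    ```python
    # Sketch: Haar DWT -> per-band statistics -> nearest-reference matching.

    def haar_dwt(x):
        """One level of the Haar DWT: (approximation, detail) coefficients."""
        s = 2 ** -0.5
        approx = [(x[i] + x[i + 1]) * s for i in range(0, len(x) - 1, 2)]
        detail = [(x[i] - x[i + 1]) * s for i in range(0, len(x) - 1, 2)]
        return approx, detail

    def features(x):
        """Mean and energy of each wavelet band as a compact feature vector."""
        vec = []
        for band in haar_dwt(x):
            vec.append(sum(band) / len(band))       # mean
            vec.append(sum(c * c for c in band))    # energy
        return vec

    def classify(signal, references):
        """Nearest reference vehicle by feature-space Euclidean distance."""
        f = features(signal)
        def dist(name):
            r = references[name]
            return sum((a - b) ** 2 for a, b in zip(f, r)) ** 0.5
        return min(references, key=dist)

    sig_a = [1.0, 5.0, 1.0, 5.0, 1.0, 5.0, 1.0, 5.0]   # oscillating signature
    sig_b = [4.0, 4.0, 4.0, 4.0, 4.0, 4.0, 4.0, 4.0]   # flat signature
    refs = {"tracked_vehicle": features(sig_a), "wheeled_vehicle": features(sig_b)}
    print(classify(sig_a, refs))
    ```

    A real system would use several DWT levels and richer band statistics, but the archive-then-match structure is the same.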

  8. Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation

    Science.gov (United States)

    Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, Timothy; Parker, Jeffrey; Bradley, David

    2006-01-01

    NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetime and have a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial Message Oriented Middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and increase the level of interaction within the system. For SMEX, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allow the FOT to further reduce the involvement of the console operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this

  9. Designing of smart home automation system based on Raspberry Pi

    Science.gov (United States)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for the implementation of a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates controlling the operating pins of the Raspberry Pi by pressing the corresponding key for turning any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.
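
    The app-to-pin control logic described above can be sketched without hardware. Everything here is hypothetical: the appliance names and BCM pin numbers are invented, and the real `RPi.GPIO` output call is replaced by a stand-in `set_pin` so the logic runs anywhere.

    ```python
    # Sketch of command handling for a Pi-based home automation system.
    # On real hardware, set_pin would wrap GPIO.output(pin, GPIO.HIGH/LOW).

    PIN_MAP = {"light": 17, "fan": 27, "heater": 22}   # hypothetical BCM pins
    _pin_state = {}

    def set_pin(pin, high):
        """Stand-in for the hardware call; records the pin's logic level."""
        _pin_state[pin] = high

    def handle_command(appliance, action):
        """Handle an 'on'/'off' command for a named appliance."""
        if appliance not in PIN_MAP or action not in ("on", "off"):
            raise ValueError("unknown appliance or action")
        set_pin(PIN_MAP[appliance], action == "on")
        return _pin_state[PIN_MAP[appliance]]

    handle_command("light", "on")
    handle_command("fan", "off")
    ```

    The Android app in the paper would send such commands over the network; validating the appliance/action pair before touching a pin is the part worth keeping regardless of transport.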

  10. Metal–organic frameworks based membranes for liquid separation

    KAUST Repository

    Li, Xin

    2017-11-07

    Metal-organic frameworks (MOFs) represent a fascinating class of solid crystalline materials which can be self-assembled in a straightforward manner by the coordination of metal ions or clusters with organic ligands. Owing to their intrinsic porous characteristics, unique chemical versatility and abundant functionalities, MOFs have received substantial attention for diverse industrial applications, including membrane separation. Exciting research activities ranging from fabrication strategies to separation applications of MOF-based membranes have appeared. Inspired by the marvelous achievements of MOF-based membranes in gas separations, liquid separations are also being explored for the purpose of constructing continuous MOF membranes or MOF-based mixed matrix membranes. Although these are in an emerging stage of vigorous development, most efforts are directed towards improving the liquid separation efficiency with well-designed MOF-based membranes. Therefore, as an increasing trend in membrane separation, the field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes, along with the latest application progress in the area of liquid separations, such as pervaporation, water treatment, and organic solvent nanofiltration. Moreover, some attractive dual-function applications of MOF-based membranes in the removal of micropollutants, degradation, and antibacterial activity are also reviewed. Finally, we define the remaining challenges and future opportunities in this field. This Tutorial Review provides an overview and outlook for MOF-based membranes for liquid separations. Further development of MOF-based membranes for liquid separation must consider the demands of strict separation standards and environmental safety for industrial application.

  11. Metal-organic frameworks based membranes for liquid separation.

    Science.gov (United States)

    Li, Xin; Liu, Yuxin; Wang, Jing; Gascon, Jorge; Li, Jiansheng; Van der Bruggen, Bart

    2017-11-27

    Metal-organic frameworks (MOFs) represent a fascinating class of solid crystalline materials which can be self-assembled in a straightforward manner by the coordination of metal ions or clusters with organic ligands. Owing to their intrinsic porous characteristics, unique chemical versatility and abundant functionalities, MOFs have received substantial attention for diverse industrial applications, including membrane separation. Exciting research activities ranging from fabrication strategies to separation applications of MOF-based membranes have appeared. Inspired by the marvelous achievements of MOF-based membranes in gas separations, liquid separations are also being explored for the purpose of constructing continuous MOF membranes or MOF-based mixed matrix membranes. Although these are in an emerging stage of vigorous development, most efforts are directed towards improving the liquid separation efficiency with well-designed MOF-based membranes. Therefore, as an increasing trend in membrane separation, the field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes, along with the latest application progress in the area of liquid separations, such as pervaporation, water treatment, and organic solvent nanofiltration. Moreover, some attractive dual-function applications of MOF-based membranes in the removal of micropollutants, degradation, and antibacterial activity are also reviewed. Finally, we define the remaining challenges and future opportunities in this field. This Tutorial Review provides an overview and outlook for MOF-based membranes for liquid separations. Further development of MOF-based membranes for liquid separation must consider the demands of strict separation standards and environmental safety for industrial application.

  12. Impact of Tactical and Strategic Weather Avoidance on Separation Assurance

    Science.gov (United States)

    Refai, Mohamad S.; Windhorst, Robert

    2011-01-01

    The ability to keep flights away from weather hazards while maintaining aircraft-to-aircraft separation is critically important. The Advanced Airspace Concept is an automation concept that implements a ground-based strategic conflict resolution algorithm for management of aircraft separation. The impact of dynamic and uncertain weather avoidance on this concept is investigated. A strategic weather rerouting system is integrated with the Advanced Airspace Concept, which also provides a tactical weather avoidance algorithm, in a fast time simulation of the Air Transportation System. Strategic weather rerouting is used to plan routes around weather in the 20 minute to two-hour time horizon. To address forecast uncertainty, flight routes are revised at 15 minute intervals. Tactical weather avoidance is used for short term trajectory adjustments (30 minute planning horizon) that are updated every minute to address any weather conflicts (instances where aircraft are predicted to pass through weather cells) that are left unresolved by strategic weather rerouting. The fast time simulation is used to assess the impact of tactical weather avoidance on the performance of automated conflict resolution as well as the impact of strategic weather rerouting on both conflict resolution and tactical weather avoidance. The results demonstrate that both tactical weather avoidance and strategic weather rerouting increase the algorithm complexity required to find aircraft conflict resolutions. Results also demonstrate that tactical weather avoidance is prone to higher airborne delay than strategic weather rerouting. Adding strategic weather rerouting to tactical weather avoidance reduces total airborne delays for the reported scenario by 18% and reduces the number of remaining weather violations by 13%. Finally, two features are identified that have proven important for strategic weather rerouting to realize these benefits; namely, the ability to revise reroutes and the use of maneuvers

  13. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio]. Volume 5, Field Investigation report

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    An environmental investigation of ground water conditions has been undertaken at Wright-Patterson Air Force Base (WPAFB), Ohio to obtain data to assist in the evaluation of a potential removal action to prevent, to the extent practicable, migration of the contaminated ground water across Base boundaries. Field investigations were limited to the central section of the southwestern boundary of Area C and the Springfield Pike boundary of Area B. Further, the study was limited to a maximum depth of 150 feet below grade. The three primary activities of the field investigation were: (1) installation of 22 monitoring wells, (2) collection and analysis of ground water from 71 locations, and (3) measurement of ground water elevations at 69 locations. Volatile organic compounds including trichloroethylene, perchloroethylene, and/or vinyl chloride were detected in concentrations exceeding Maximum Contaminant Levels (MCL) at three locations within the Area C investigation area. Ground water at the Springfield Pike boundary of Area B occurs in two primary units, separated by a thicker-than-expected clay layer. One well within Area B was determined to exceed the MCL for trichloroethylene.

  14. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  15. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  16. a Threshold-Free Filtering Algorithm for Airborne LIDAR Point Clouds Based on Expectation-Maximization

    Science.gov (United States)

    Hui, Z.; Cheng, P.; Ziggah, Y. Y.; Nie, Y.

    2018-04-01

    Filtering is a key step for most applications of airborne LiDAR point clouds. Although many filtering algorithms have been put forward in recent years, most of them suffer from parameter setting or threshold adjusting, which is time-consuming and reduces the degree of automation of the algorithm. To overcome this problem, this paper proposes a threshold-free filtering algorithm based on expectation-maximization. The proposed algorithm is developed on the assumption that point clouds can be seen as a mixture of Gaussian models. The separation of ground points and non-ground points from the point cloud can thus be recast as the separation of a mixed Gaussian model. Expectation-maximization (EM) is applied to realize the separation. EM is used to calculate maximum likelihood estimates of the mixture parameters. Using the estimated parameters, the likelihood of each point belonging to ground or object can be computed. After several iterations, points can be labelled as the component with the larger likelihood. Furthermore, intensity information was also utilized to optimize the filtering results acquired using the EM method. The proposed algorithm was tested using two different datasets used in practice. Experimental results showed that the proposed method can filter non-ground points effectively. To quantitatively evaluate the proposed method, this paper adopted the dataset provided by the ISPRS for the test. The proposed algorithm obtains a 4.48% total error, which is much lower than most of the eight classical filtering algorithms reported by the ISPRS.
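
    The core EM idea can be shown in miniature. This is a minimal sketch, not the paper's algorithm: it fits a two-component 1-D Gaussian mixture to synthetic point elevations with EM, then labels each point by the nearer component mean (low mean ~ ground, high mean ~ objects).

    ```python
    # Sketch: EM for a two-component 1-D Gaussian mixture over elevations.
    import math

    def em_two_gaussians(xs, iters=50):
        mu = [min(xs), max(xs)]
        var = [1.0, 1.0]
        w = [0.5, 0.5]
        for _ in range(iters):
            # E-step: responsibility of each component for each point.
            resp = []
            for x in xs:
                p = [w[k] / math.sqrt(2 * math.pi * var[k])
                     * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                     for k in range(2)]
                s = p[0] + p[1]
                resp.append([p[0] / s, p[1] / s])
            # M-step: re-estimate weights, means, and variances.
            for k in range(2):
                nk = sum(r[k] for r in resp)
                w[k] = nk / len(xs)
                mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
                var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                                 for r, x in zip(resp, xs)) / nk, 1e-6)
        return mu, var, w

    elev = [0.1, 0.2, 0.0, 0.3, 5.0, 5.2, 4.9]   # ground ~0 m, objects ~5 m
    mu, var, w = em_two_gaussians(elev)
    ground = [x for x in elev if abs(x - mu[0]) < abs(x - mu[1])]
    ```

    The paper works with full point clouds and additionally exploits intensity; the sketch only shows how EM turns the mixture assumption into per-point labels without any user-set threshold.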

  17. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng/mL. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng/mL. The proposed

  18. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as the model method. • SPE columns packed with nonwoven polypropylene fiber were used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng/mL. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng/mL. The proposed method opens a new avenue

  19. Evaluation of Novel Wet Chemistry Separation and Purification Methods to Facilitate Automation of Astatine-211 Isolation

    International Nuclear Information System (INIS)

    Wilbur, Daniel Scott

    2016-01-01

    % extracted; There was some indication that the PEG-Merrifield resins could be saturated (perhaps with Bi), resulting in lower capture percentages, but more studies need to be done to confirm that; A target dissolution chamber, designed and built at PNNL, works well with syringe pumps so it can be used in an automated system; Preliminary semi-automated 211At isolation studies have been conducted with full-scale target dissolution, and 211At isolation using a PEG column on the Hamilton automated system gave low overall recoveries, but HNO3 was used (rather than HCl) for loading the 211At and flow rates were not optimized; Results obtained using PEG columns are high enough to warrant further development on a fully automated system; Results obtained also indicate that additional studies are warranted to evaluate other types of columns for 211At separation from bismuth, which allow use of HNO3/HCl mixtures for loading and NaOH for eluting 211At. Such a column could greatly simplify the overall isolation process and make it easier to automate.

  20. Biometric correspondence between reface computerized facial approximations and CT-derived ground truth skin surface models objectively examined using an automated facial recognition system.

    Science.gov (United States)

    Parks, Connie L; Monson, Keith L

    2018-05-01

    This study employed an automated facial recognition system as a means of objectively evaluating biometric correspondence between a ReFace facial approximation and the computed tomography (CT) derived ground truth skin surface of the same individual. High rates of biometric correspondence were observed, irrespective of rank class (Rk) or demographic cohort examined. Overall, 48% of the test subjects' ReFace approximation probes (n=96) were matched to his or her corresponding ground truth skin surface image at R1, a rank indicating a high degree of biometric correspondence and a potential positive identification. Identification rates improved with each successively broader rank class (R10=85%, R25=96%, and R50=99%), with 100% identification by R57. A sharp increase (39% mean increase) in identification rates was observed between R1 and R10 across most rank classes and demographic cohorts. In contrast, no significant (p > 0.05) performance differences were observed across demographic cohorts or CT scan protocols. Performance measures observed in this research suggest that ReFace approximations are biometrically similar to the actual faces of the approximated individuals and, therefore, may have potential operational utility in contexts in which computerized approximations are utilized as probes in automated facial recognition systems.
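
    The rank-class evaluation used above can be made concrete with a short sketch. This is generic biometric bookkeeping, not the study's software: given a similarity matrix between probes and gallery subjects, it computes the rank-k identification rate (the fraction of probes whose true match appears in the top k). The scores below are invented.

    ```python
    # Sketch: rank-k identification rate from a probe-vs-gallery score matrix.

    def rank_k_rate(scores, true_ids, k):
        """scores[i][j]: similarity of probe i to gallery subject j;
        true_ids[i]: gallery index of probe i's true identity."""
        hits = 0
        for row, truth in zip(scores, true_ids):
            ranked = sorted(range(len(row)), key=lambda j: -row[j])
            if truth in ranked[:k]:
                hits += 1
        return hits / len(scores)

    scores = [
        [0.9, 0.2, 0.1],   # probe 0: best match is subject 0 (correct)
        [0.3, 0.4, 0.8],   # probe 1: true subject 1 ranks second
    ]
    print(rank_k_rate(scores, [0, 1], 1), rank_k_rate(scores, [0, 1], 2))
    ```

    Sweeping k from 1 upward yields the cumulative match characteristic curve that figures such as R1/R10/R25/R50 summarize.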

  1. RISK MANAGEMENT AUTOMATION OF SOFTWARE PROJECTS BASED ОN FUZZY INFERENCE

    Directory of Open Access Journals (Sweden)

    T. M. Zubkova

    2015-09-01

    Full Text Available The suitability of one of the intelligent methods, fuzzy inference, for risk management of software projects has been shown based on a review of existing fuzzy inference algorithms in the field of applied problems. Information sources in the management of software projects are analyzed; major and minor risks are highlighted. The most critical parameters have been singled out, giving the possibility to estimate the occurrence of adverse situations (project duration, the frequency of changes in customer requirements, work deadlines, the experience of developers' participation in such projects, and others). A method of qualitative fuzzy description based on fuzzy logic has been developed for the analysis of these parameters. Evaluation of possible situations and formation of the knowledge base rely on a survey of experts. The main limitations of existing automated systems have been identified in relation to their applicability to risk management in software design. This theoretical research set the stage for a software system that makes it possible to automate the risk management process for software projects. The developed software system automates the process of fuzzy inference in the following stages: rule base formation for the fuzzy inference system, fuzzification of input variables, aggregation of sub-conditions, activation and accumulation of conclusions for fuzzy production rules, and defuzzification of variables. The result of automating the risk management process in software design is a quantitative and qualitative assessment of risks and expert advice for their minimization. The practical significance of the work lies in the fact that implementation of the developed automated system gives the possibility for performance improvement of software projects.
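
    The fuzzy inference stages listed above (fuzzification, rule activation, accumulation, defuzzification) can be illustrated with a toy example. This is not the authors' system: it maps a single hypothetical risk factor, the frequency of requirement changes, to a risk score using triangular membership functions, min-activation, max-accumulation, and a discrete centroid.

    ```python
    # Toy Mamdani-style fuzzy inference with two rules.

    def tri(x, a, b, c):
        """Triangular membership function rising on [a, b], falling on [b, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def risk_score(change_freq):
        # Fuzzify the input (changes per month, hypothetical 0-10 scale).
        low = tri(change_freq, -1, 0, 5)
        high = tri(change_freq, 2, 10, 11)
        # Activate each rule (min), accumulate (max), defuzzify (centroid).
        xs = [i / 10 for i in range(0, 101)]        # risk universe 0..10
        agg = [max(min(low, tri(x, -1, 1, 5)),      # IF changes low THEN risk low
                   min(high, tri(x, 5, 9, 11)))     # IF changes high THEN risk high
               for x in xs]
        total = sum(agg)
        return sum(x * m for x, m in zip(xs, agg)) / total if total else 0.0

    print(round(risk_score(1.0), 2), round(risk_score(9.0), 2))
    ```

    A real risk-management rule base would combine many input parameters and expert-elicited rules; the mechanics per stage are the same.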

  2. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270, January 2018, US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom. Reporting period: 1 October 2016–30 September 2017.

  3. 618-11 Burial Ground USRADS radiological surveys

    International Nuclear Information System (INIS)

    Wendling, M.A.

    1994-01-01

    This report summarizes and documents the results of the radiological surveys conducted from February 4 through February 10, 1993 over the 618-11 Burial Ground, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology using the Ultrasonic Ranging and Data System (USRADS). The 618-11 Burial Ground radiological survey field task consisted of two activities: characterization of the specific background conditions and the radiological survey of the area. The radiological survey of the 618-11 Burial Ground, along with the background study, was conducted by the Site Investigative Surveys Environmental Restoration Health Physics Organization of the Westinghouse Hanford Company. The survey methodology was based on utilization of USRADS for automated recording of the gross gamma radiation levels at or near six (6) inches and at three (3) feet from the surface soil.

  4. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has commonly been addressed by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is also applicable to low altitude unmanned aerial hyperspectral imagery. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are
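The basic idea behind reference-panel illumination compensation can be shown with a simplified empirical-line-style sketch (not the paper's subspace-model method): a panel of known reflectance observed under the current illumination gives a per-band illumination estimate, by which scene radiance is divided to recover reflectance.

```python
import numpy as np

def compensate(radiance, panel_radiance, panel_reflectance=0.99):
    """Per-band correction using a reference panel:
    reflectance ~= measured radiance / (panel radiance / panel reflectance)."""
    illumination = panel_radiance / panel_reflectance  # per-band illumination estimate
    return radiance / illumination

# simulated scene: the same crop pixel under two illumination conditions
true_refl = np.array([0.05, 0.10, 0.40, 0.60])   # e.g. blue/green/red/NIR reflectance
sunny = 1000.0 * np.array([1.0, 0.9, 0.8, 0.7])  # per-band illumination, direct sun
cloudy = 300.0 * np.array([0.9, 1.0, 1.0, 0.9])  # dimmer, different spectral shape

recovered_sunny = compensate(true_refl * sunny, 0.99 * sunny)
recovered_cloudy = compensate(true_refl * cloudy, 0.99 * cloudy)
# recovered reflectance is illumination-invariant in this idealised model
```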

  5. Automated deficiency letter data base

    International Nuclear Information System (INIS)

    Jones, R.D.

    1983-12-01

    An automated data base relevant to the various licensee deficiencies that accrue during the materials licensing application review process of the Nuclear Regulatory Commission (NRC) is described. A data base management system (DBMS) is used for data retrieval, file tending, and examination of the interrelationships among the data types in the data base. Use of word processors to emulate computer terminals for the purpose of data base population (loading) and report generation is discussed. Also described is the technique used to link, for update purposes, the data base (accessed by means of SYSTEM 2000 on a CDC 6600 computer) to the NRC Material License Master File resident on the National Institutes of Health (NIH) IBM System 370 computer. A user's manual that provides easy-to-understand instructions for the nonprogramming user on how to generate ad hoc analytical reports to facilitate management decisions is also included.

  6. Live demonstration: Screen printed, microwave based level sensor for automated drug delivery

    KAUST Repository

    Karimi, Muhammad Akram; Arsalan, Muhammad; Shamim, Atif

    2018-01-01

    Level sensors find numerous applications in many industries to automate the processes involving chemicals. Recently, some commercial ultrasound based level sensors are also being used to automate the drug delivery process [1]. Some of the most

  7. Automation of Hubble Space Telescope Mission Operations

    Science.gov (United States)

    Burley, Richard; Goulet, Gregory; Slater, Mark; Huey, William; Bassford, Lynn; Dunham, Larry

    2012-01-01

    On June 13, 2011, after more than 21 years, 115 thousand orbits, and nearly 1 million exposures taken, the operation of the Hubble Space Telescope successfully transitioned from 24x7x365 staffing to 8x5 staffing. This required the automation of routine mission operations including telemetry and forward link acquisition, data dumping and solid-state recorder management, stored command loading, and health and safety monitoring of both the observatory and the HST Ground System. These changes were driven by budget reductions, and required ground system and onboard spacecraft enhancements across the entire operations spectrum, from planning and scheduling systems to payload flight software. Changes in personnel and staffing were required in order to adapt to the new roles and responsibilities of the new automated operations era. This paper will provide a high level overview of the obstacles to automating nominal HST mission operations, both technical and cultural, and how those obstacles were overcome.

  8. Automated oil spill detection with multispectral imagery

    Science.gov (United States)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
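The processing chain described (band combination, contrast enhancement, segmentation into contiguous oil regions, coverage statistics from the ground sample distance) can be sketched in outline. This is an illustrative reconstruction, not the authors' code; the particular band combination and threshold are invented stand-ins:

```python
import numpy as np

def oil_mask(rgb, threshold=0.25):
    """Crude surface-oil isolation: score pixels by a band combination,
    stretch contrast to [0, 1], then threshold to a binary mask."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    score = (r + g) / 2.0 - b                      # hypothetical band combination
    lo, hi = score.min(), score.max()
    stretched = (score - lo) / (hi - lo + 1e-12)   # contrast stretch
    return stretched > threshold

def region_areas(mask, gsd_m=1.0):
    """Areas (m^2) of 4-connected regions via flood fill; gsd_m is the
    ground sample distance (metres per pixel), largest regions first."""
    seen = np.zeros_like(mask, dtype=bool)
    areas = []
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                stack, n = [(i, j)], 0
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    n += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        v, u = y + dy, x + dx
                        if 0 <= v < h and 0 <= u < w and mask[v, u] and not seen[v, u]:
                            seen[v, u] = True
                            stack.append((v, u))
                areas.append(n * gsd_m * gsd_m)
    return sorted(areas, reverse=True)

# toy mask: one 2-pixel slick and one 1-pixel slick at 2 m GSD
demo_mask = np.array([[1, 1, 0], [0, 0, 0], [0, 0, 1]], dtype=bool)
areas = region_areas(demo_mask, gsd_m=2.0)
```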

  9. Designing of smart home automation system based on Raspberry Pi

    International Nuclear Information System (INIS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-01-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for the implementation of a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates control of the operating pins of the Raspberry Pi by pressing the corresponding key for turning any desired appliance “on” or “off”. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.

  10. Designing of smart home automation system based on Raspberry Pi

    Energy Technology Data Exchange (ETDEWEB)

    Saini, Ravi Prakash; Singh, Bhanu Pratap [B K Birla Institute of Engineering & Technology, Pilani, Rajasthan (India); Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn, E-mail: Dr.N.L@ieee.org [Thammasat University, Rangsit Campus, Pathum Thani (Thailand)

    2016-03-09

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are suitable for academic research. This paper proposes a method for the implementation of a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time worldwide using their Dropbox account. An Android application has been developed to channelize the monitoring and controlling operation of home appliances remotely. This application facilitates control of the operating pins of the Raspberry Pi by pressing the corresponding key for turning any desired appliance “on” or “off”. Systems can range from simple room lighting control to smart microcontroller based hybrid systems incorporating several other additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through the cloud-based data storage protocol, reliability, energy efficiency, etc.
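One core piece of the power consumption profile described in these two records is turning sampled per-load power into the cumulative energy curve a user would graph. A small, hypothetical sketch of that step (the Raspberry Pi GPIO and Dropbox plumbing are omitted):

```python
def cumulative_energy_kwh(timestamps_s, power_w):
    """Trapezoidal integration of power samples (W) over time (s),
    returning the cumulative energy in kWh at each sample."""
    energy_j, out = 0.0, [0.0]
    for k in range(1, len(timestamps_s)):
        dt = timestamps_s[k] - timestamps_s[k - 1]
        energy_j += 0.5 * (power_w[k] + power_w[k - 1]) * dt
        out.append(energy_j / 3.6e6)   # joules -> kWh
    return out

# constant 1 kW load for one hour, sampled every 15 minutes
ts = [0, 900, 1800, 2700, 3600]
pw = [1000.0] * 5
profile = cumulative_energy_kwh(ts, pw)   # ends at ~1.0 kWh
```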

  11. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components; the first part is the server, which presents the system core that manages and controls the user's home. Users and the system administrator can manage and control the system locally (via the Local Area Network) or remotely (via the Internet). The second part is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most of the home automation systems available in the market, the proposed system is scalable in that one server can manage many hardware interface modules as long as they exist within network coverage. The system supports a wide range of home automation devices like appliances, power management components, and security components. The proposed system is better in terms of flexibility and scalability than the commercially available home automation systems.

  12. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    The development of new and the modernization of existing aviation equipment of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the challenges in developing automated control systems for aviation equipment test operations are presented. Such automated control systems are in essence automated data banks. The key role of the flight test automated control system in the creation of automated control systems for aviation equipment test operations is substantiated. The approach of integrating mobile modular measuring complexes and the need for national methodologies and technological standards for database system design concepts are grounded. The database system, as the central element in this scheme, provides collection, storage and updating of the values of the elements described above at the pace and frequency required for monitoring the state of the controlled object. It is the database system that provides the supervisory unit with actual data corresponding to specific moments of time concerning the state processes and assessments of the progress and results of flight experiments, creating the necessary environment for managing and testing aviation equipment as a whole. The basis for the development of subsystems of automated control systems for aviation equipment test operations is the conceptual design process of the respective database system, the implementation effectiveness of which largely determines the level of success and the ability to develop the systems being created. The presented conclusions and suggestions can be used in the

  13. Airborne Tactical Intent-Based Conflict Resolution Capability

    Science.gov (United States)

    Wing, David J.; Vivona, Robert A.; Roscoe, David A.

    2009-01-01

    Trajectory-based operations with self-separation involve the aircraft taking the primary role in the management of its own trajectory in the presence of other traffic. In this role, the flight crew assumes the responsibility for ensuring that the aircraft remains separated from all other aircraft by at least a minimum separation standard. These operations are enabled by cooperative airborne surveillance and by airborne automation systems that provide essential monitoring and decision support functions for the flight crew. An airborne automation system developed and used by NASA for research investigations of required functionality is the Autonomous Operations Planner. It supports the flight crew in managing their trajectory when responsible for self-separation by providing monitoring and decision support functions for both strategic and tactical flight modes. The paper focuses on the latter of these modes by describing a capability for tactical intent-based conflict resolution and its role in a comprehensive suite of automation functions supporting trajectory-based operations with self-separation.
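A common building block for the monitoring functions such airborne automation provides is a closest-point-of-approach (CPA) probe for constant-velocity trajectories: it flags a conflict when the predicted horizontal miss distance falls below the separation standard within a look-ahead horizon. The sketch below is a generic textbook formulation, not the Autonomous Operations Planner's algorithm:

```python
import math

def cpa_conflict(own_pos, own_vel, tfc_pos, tfc_vel, sep_nm=5.0, horizon_s=1200.0):
    """Closest point of approach for two aircraft on constant-velocity tracks.
    Positions in nm, velocities in nm/s. Returns (conflict?, t_cpa, d_cpa)."""
    rx, ry = tfc_pos[0] - own_pos[0], tfc_pos[1] - own_pos[1]   # relative position
    vx, vy = tfc_vel[0] - own_vel[0], tfc_vel[1] - own_vel[1]   # relative velocity
    v2 = vx * vx + vy * vy
    t_cpa = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    t_cpa = min(t_cpa, horizon_s)                               # clamp to horizon
    dx, dy = rx + vx * t_cpa, ry + vy * t_cpa
    d_cpa = math.hypot(dx, dy)
    return d_cpa < sep_nm, t_cpa, d_cpa

# head-on encounter at 480 kt each, 40 nm apart: conflict at t = 150 s
conflict, t_cpa, d_cpa = cpa_conflict((0.0, 0.0), (480/3600, 0.0),
                                      (40.0, 0.0), (-480/3600, 0.0))
# same course and speed, 10 nm lateral offset: no conflict
clear, _, d_far = cpa_conflict((0.0, 0.0), (480/3600, 0.0),
                               (40.0, 10.0), (480/3600, 0.0))
```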

  14. Automated beam placement for breast radiotherapy using a support vector machine based algorithm

    International Nuclear Information System (INIS)

    Zhao Xuan; Kong, Dewen; Jozsef, Gabor; Chang, Jenghwa; Wong, Edward K.; Formenti, Silvia C.; Wang Yao

    2012-01-01

    Purpose: To develop an automated beam placement technique for whole breast radiotherapy using tangential beams. We seek to find optimal parameters for tangential beams to cover the whole ipsilateral breast (WB) and minimize the dose to the organs at risk (OARs). Methods: A support vector machine (SVM) based method is proposed to determine the optimal posterior plane of the tangential beams. Relative significances of including/avoiding the volumes of interest are incorporated into the cost function of the SVM. After finding the optimal 3-D plane that separates the whole breast (WB) and the included clinical target volumes (CTVs) from the OARs, the gantry angle, collimator angle, and posterior jaw size of the tangential beams are derived from the separating plane equation. Dosimetric measures of the treatment plans determined by the automated method are compared with those obtained by applying manual beam placement by the physicians. The method can be further extended to use multileaf collimator (MLC) blocking by optimizing posterior MLC positions. Results: The plans for 36 patients (23 prone- and 13 supine-treated) with left breast cancer were analyzed. Our algorithm reduced the volume of the heart that receives >500 cGy dose (V5) from 2.7 to 1.7 cm³ (p = 0.058) on average and the volume of the ipsilateral lung that receives >1000 cGy dose (V10) from 55.2 to 40.7 cm³ (p = 0.0013). The dose coverage as measured by volume receiving >95% of the prescription dose (V95%) of the WB without a 5 mm superficial layer decreases by only 0.74% (p = 0.0002) and the V95% for the tumor bed with 1.5 cm margin remains unchanged. Conclusions: This study has demonstrated the feasibility of using an SVM-based algorithm to determine optimal beam placement without a physician's intervention. The proposed method reduced the dose to OARs, especially for supine treated patients, without any relevant degradation of dose homogeneity and coverage in general.
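The core geometric idea, fitting a plane that separates the target volumes from the OARs and reading beam parameters off its normal, can be sketched with a toy hinge-loss SVM on invented geometry. The clinical cost weighting, anatomy, and angle conventions of the paper are not reproduced:

```python
import numpy as np

def fit_separating_plane(X, y, lam=0.01, lr=0.05, epochs=400, seed=0):
    """Toy linear SVM via hinge-loss subgradient descent.
    Returns (w, b) with w.x + b > 0 on the y=+1 side, < 0 on the y=-1 side."""
    rng = np.random.default_rng(seed)
    w, b = rng.normal(size=X.shape[1]) * 0.01, 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                       # points violating the margin
        if mask.any():
            gw = lam * w - (y[mask, None] * X[mask]).mean(axis=0)
            gb = -y[mask].mean()
        else:
            gw, gb = lam * w, 0.0
        w, b = w - lr * gw, b - lr * gb
    return w, b

# hypothetical geometry (cm): anterior target points vs posterior OAR points
rng = np.random.default_rng(1)
target = rng.normal([0, 5, 0], 1.0, size=(200, 3))   # breast/CTV cloud, label +1
oars = rng.normal([0, -5, 0], 1.0, size=(200, 3))    # heart/lung cloud, label -1
X = np.vstack([target, oars])
y = np.hstack([np.ones(200), -np.ones(200)])
w, b = fit_separating_plane(X, y)
accuracy = float((np.sign(X @ w + b) == y).mean())
# a tangential beam runs parallel to this plane; one way to read off an angle:
gantry_deg = float(np.degrees(np.arctan2(w[0], w[1])))
```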

  15. Diversity requirements for safety critical software-based automation systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1998-03-01

    System vendors nowadays propose software-based systems even for the most critical safety functions in nuclear power plants. Due to the nature and mechanisms of influence of software faults, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', various safety assessment methods and tools for software-based systems are developed and evaluated. This report first discusses the (common cause) failure mechanisms in software-based systems, then defines fault-tolerant system architectures to avoid common cause failures, then studies the various alternatives to apply diversity and their influence on system reliability. Finally, a method for the assessment of diversity is described. Other recently published reports in the OHA report series handle the statistical reliability assessment of software-based systems (STUK-YTO-TR 119), usage models in reliability assessment of software-based systems (STUK-YTO-TR 128) and handling of programmable automation in plant PSA studies (STUK-YTO-TR 129).
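A standard way to quantify how diversity limits common cause failure is the beta-factor model, used here purely as an illustration (it is not necessarily the report's assessment method): a fraction beta of each channel's failure probability is assumed to be shared, so diversity pays off by lowering that coupling.

```python
def two_channel_failure_prob(p_channel, beta):
    """Beta-factor model for a 1-out-of-2 redundant pair: a fraction `beta`
    of each channel's failure probability is common cause (fails both at
    once); the remainder is independent."""
    p_ccf = beta * p_channel                # shared-cause contribution
    p_ind = (1.0 - beta) * p_channel        # independent contribution
    return p_ccf + p_ind ** 2               # both fail: shared cause OR both independently

# illustrative numbers only: similar software (high coupling) vs diverse designs
identical = two_channel_failure_prob(1e-3, beta=0.10)
diverse = two_channel_failure_prob(1e-3, beta=0.01)
```

With these (invented) figures the common cause term dominates, so reducing beta by an order of magnitude reduces the system failure probability by nearly an order of magnitude, which is the reliability argument for diversity.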

  16. Safety assessment of automated vehicle functions by simulation-based fault injection

    OpenAIRE

    Juez, Garazi; Amparan, Estibaliz; Lattarulo, Ray; Rastelli, Joshue Perez; Ruiz, Alejandra; Espinoza, Huascar

    2017-01-01

    As automated driving vehicles become more sophisticated and pervasive, it is increasingly important to assure their safety even in the presence of faults. This paper presents a simulation-based fault injection approach (Sabotage) aimed at assessing the safety of automated vehicle functions. In particular, we focus on a case study to forecast fault effects during the model-based design of a lateral control function. The goal is to determine the acceptable fault detection interval for pe...

  17. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion: Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  18. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
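One of the data reduction steps listed in these two records, PCA, is easy to sketch on toy spectra. The data below are synthetic (two groups of noisy 1D spectra differing in one peak region), not Automics code or output:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Data reduction: mean-center the spectral matrix (samples x bins)
    and project onto the top principal components via SVD."""
    Xc = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T   # scores: (samples, n_components)

# toy 1D-NMR-like data: both groups share one peak, group B has an extra one
rng = np.random.default_rng(0)
bins = np.arange(100)
base = np.exp(-0.5 * ((bins - 50) / 5.0) ** 2)     # shared peak
extra = np.exp(-0.5 * ((bins - 20) / 3.0) ** 2)    # 'metabolite' peak, group B only
group_a = np.array([base + 0.02 * rng.normal(size=100) for _ in range(10)])
group_b = np.array([base + extra + 0.02 * rng.normal(size=100) for _ in range(10)])
scores = pca_scores(np.vstack([group_a, group_b]))
pc1 = scores[:, 0]
# PC1 captures the dominant between-group variation, so the groups separate
separated = pc1[:10].max() < pc1[10:].min() or pc1[:10].min() > pc1[10:].max()
```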

  19. FPGA-Based Real-Time Motion Detection for Automated Video Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Sanjay Singh

    2016-03-01

    Design of automated video surveillance systems is one of the challenging tasks in the computer vision community because of their ability to automatically select frames of interest in incoming video streams based on motion detection. This research paper focuses on the real-time hardware implementation of a motion detection algorithm for such vision-based automated surveillance systems. A dedicated VLSI architecture has been proposed and designed for a clustering-based motion detection scheme. The working prototype of a complete standalone automated video surveillance system, including the input camera interface, the designed motion detection VLSI architecture, and the output display interface, with real-time relevant motion detection capabilities, has been implemented on a Xilinx ML510 (Virtex-5 FX130T) FPGA platform. The prototyped system robustly detects relevant motion in real time in live PAL (720 × 576) resolution video streams coming directly from the camera.
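The frame-selection behaviour described, flagging only frames that contain relevant motion, can be sketched in software with simple frame differencing. This is a stand-in for the clustering-based scheme realized in the VLSI architecture, not that scheme itself:

```python
import numpy as np

def motion_mask(prev_frame, curr_frame, threshold=25):
    """Per-pixel absolute frame difference, thresholded to a binary
    motion mask (8-bit grayscale frames)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def frame_of_interest(prev_frame, curr_frame, min_fraction=0.001):
    """Select the frame only if enough pixels changed (relevant motion)."""
    return motion_mask(prev_frame, curr_frame).mean() >= min_fraction

# static background with one small moving block, PAL-sized (576 rows x 720 cols)
prev = np.full((576, 720), 100, dtype=np.uint8)
curr = prev.copy()
curr[100:140, 200:240] = 200            # 40 x 40 moving object
moving = frame_of_interest(prev, curr)  # frame selected
static = frame_of_interest(prev, prev)  # frame discarded
changed = int(motion_mask(prev, curr).sum())
```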

  20. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  1. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    Science.gov (United States)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based workstations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  2. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  3. Separators - Technology review: Ceramic based separators for secondary batteries

    Energy Technology Data Exchange (ETDEWEB)

    Nestler, Tina; Schmid, Robert; Münchgesang, Wolfram; Bazhenov, Vasilii; Meyer, Dirk C. [Technische Universität Bergakademie Freiberg, Institut für Experimentelle Physik, Leipziger Str. 23, 09596 Freiberg (Germany); Schilm, Jochen [Fraunhofer-Institut für Keramische Technologien und Systeme IKTS, Winterbergstraße 28, 01277 Dresden (Germany); Leisegang, Tilmann [Fraunhofer-Technologiezentrum Halbleitermaterialien THM, Am St.-Niclas-Schacht 13, 09599 Freiberg (Germany)

    2014-06-16

    Besides a continuous increase of the worldwide use of electricity, the electric energy storage technology market is a growing sector. At the latest since the German energy transition ('Energiewende') was announced, technological solutions for the storage of renewable energy have been intensively studied. Storage technologies in various forms are commercially available. A widespread technology is the electrochemical cell. Here the cost per kWh, e. g. determined by energy density, production process and cycle life, is of main interest. Commonly, an electrochemical cell consists of an anode and a cathode that are separated by an ion permeable or ion conductive membrane - the separator - as one of the main components. Many applications use polymeric separators whose pores are filled with liquid electrolyte, providing high power densities. However, problems arise from different failure mechanisms during cell operation, which can affect the integrity and functionality of these separators. In the case of excessive heating or mechanical damage, the polymeric separators become an incalculable security risk. Furthermore, the growth of metallic dendrites between the electrodes leads to unwanted short circuits. In order to minimize these risks, temperature stable and non-flammable ceramic particles can be added, forming so-called composite separators. Full ceramic separators, in turn, are currently commercially used only for high-temperature operation systems, due to their comparably low ion conductivity at room temperature. However, as security and lifetime demands increase, these materials turn into focus also for future room temperature applications. Hence, growing research effort is being spent on the improvement of the ion conductivity of these ceramic solid electrolyte materials, acting as separator and electrolyte at the same time. Starting with a short overview of available separator technologies and the separator market, this review focuses on ceramic-based

  4. Separators - Technology review: Ceramic based separators for secondary batteries

    Science.gov (United States)

    Nestler, Tina; Schmid, Robert; Münchgesang, Wolfram; Bazhenov, Vasilii; Schilm, Jochen; Leisegang, Tilmann; Meyer, Dirk C.

    2014-06-01

    Alongside the continuous increase in worldwide electricity use, the market for electric energy storage technology is a growing sector. At the latest since the German energy transition ("Energiewende") was announced, technological solutions for the storage of renewable energy have been studied intensively. Storage technologies in various forms are commercially available. A widespread technology is the electrochemical cell. Here the cost per kWh, determined, e.g., by energy density, production process and cycle life, is of main interest. Commonly, an electrochemical cell consists of an anode and a cathode that are separated by an ion-permeable or ion-conductive membrane, the separator, as one of the main components. Many applications use polymeric separators whose pores are filled with liquid electrolyte, providing high power densities. However, problems arise from different failure mechanisms during cell operation, which can affect the integrity and functionality of these separators. In the case of excessive heating or mechanical damage, polymeric separators become an incalculable safety risk. Furthermore, the growth of metallic dendrites between the electrodes leads to unwanted short circuits. In order to minimize these risks, temperature-stable and non-flammable ceramic particles can be added, forming so-called composite separators. Full ceramic separators, in turn, are currently used commercially only in high-temperature systems, owing to their comparably low ion conductivity at room temperature. However, as safety and lifetime demands increase, these materials are also coming into focus for future room-temperature applications. Hence, growing research effort is being spent on improving the ion conductivity of these ceramic solid electrolyte materials, which act as separator and electrolyte at the same time. 
Starting with a short overview of available separator technologies and the separator market, this review focuses on ceramic-based separators

  5. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach inspired by typical home automation components such as switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)
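
    The abstract describes components such as switches and timers wired together under a common model. A minimal sketch of that idea, assuming nothing about the actual system (class names, message format, and the switch/lamp pairing are all hypothetical), could look like this: components are connected explicitly and communicate only by passing messages.

```python
# Hypothetical sketch of a message-passing component model for home
# automation; not the implementation from the record above.

class Component:
    def __init__(self):
        self.listeners = []

    def connect(self, other):
        # Model "wiring": messages from this component reach `other`.
        self.listeners.append(other)

    def send(self, message):
        for listener in self.listeners:
            listener.receive(message)

class Switch(Component):
    def toggle(self, on):
        # A switch publishes its new state to all connected components.
        self.send({"type": "switch", "on": on})

class Lamp(Component):
    def __init__(self):
        super().__init__()
        self.on = False

    def receive(self, message):
        if message["type"] == "switch":
            self.on = message["on"]

switch = Switch()
lamp = Lamp()
switch.connect(lamp)   # the switch now controls the lamp
switch.toggle(True)
print(lamp.on)         # True
```

    Because components interact only through messages, vendor-specific devices could in principle be wrapped behind the same `receive` interface, which is the kind of integration the record aims at.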

  6. Isotope separation

    International Nuclear Information System (INIS)

    Bartlett, R.J.; Morrey, J.R.

    1978-01-01

    A method and apparatus are described for separating gas molecules containing one isotope of an element from gas molecules containing other isotopes of the same element, in which all of the molecules of the gas are in the same electronic ground state. Gas molecules in a gas stream containing one of the isotopes are selectively excited to a different electronic state while leaving the other gas molecules in their original ground state. Gas molecules containing one of the isotopes are then deflected from the other gas molecules in the stream and thus physically separated.

  7. Neural Networks for Segregation of Multiple Objects: Visual Figure-Ground Separation and Auditory Pitch Perception.

    Science.gov (United States)

    Wyse, Lonce

    An important component of perceptual object recognition is the segmentation into coherent perceptual units of the "blooming buzzing confusion" that bombards the senses. The work presented herein develops neural network models of some key processes of pre-attentive vision and audition that serve this goal. A neural network model, called an FBF (Feature-Boundary-Feature) network, is proposed for automatic parallel separation of multiple figures from each other and their backgrounds in noisy images. Figure-ground separation is accomplished by iterating operations of a Boundary Contour System (BCS) that generates a boundary segmentation of a scene, and a Feature Contour System (FCS) that compensates for variable illumination and fills in surface properties using boundary signals. A key new feature is the use of the FBF filling-in process for the figure-ground separation of connected regions, which are subsequently more easily recognized. The new CORT-X 2 model is a feed-forward version of the BCS that is designed to detect, regularize, and complete boundaries in up to 50 percent noise. It also exploits the complementary properties of on-cells and off-cells to generate boundary segmentations and to compensate for boundary gaps during filling-in. In the realm of audition, many sounds are dominated by energy at integer multiples, or "harmonics", of a fundamental frequency. For such sounds (e.g., vowels in speech), the individual frequency components fuse, so that they are perceived as one sound source with a pitch at the fundamental frequency. Pitch is integral to separating auditory sources, as well as to speaker identification and speech understanding. A neural network model of pitch perception called SPINET (SPatial PItch NETwork) is developed and used to simulate a broader range of perceptual data than previous spectral models. 
The model employs a bank of narrowband filters as a simple model of basilar membrane mechanics, spectral on-center off-surround competitive
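
    The harmonic-fusion idea in the abstract (components at integer multiples of a fundamental are perceived at that fundamental) can be illustrated without any neural machinery. The sketch below is not SPINET; it is a deliberately naive scorer, with made-up peak frequencies and a restricted candidate range to sidestep the classic subharmonic ambiguity that real pitch models must handle.

```python
# Naive illustration of harmonic fusion: score each candidate fundamental
# by how well it explains the observed spectral peaks. NOT the SPINET model.

def estimate_pitch(freqs, candidates):
    def score(f0):
        # Sum of squared deviations of each peak from its nearest
        # harmonic (integer multiple) of the candidate fundamental f0.
        return sum((f - round(f / f0) * f0) ** 2 for f in freqs)
    return min(candidates, key=score)

# Peaks of a vowel-like sound: harmonics 2..5 of 220 Hz, fundamental absent
# (the "missing fundamental" case the abstract alludes to).
peaks = [440.0, 660.0, 880.0, 1100.0]
# Candidates restricted to one octave; subharmonics (e.g. 110 Hz) would
# otherwise tie, which is why real models need further constraints.
candidates = [200.0, 210.0, 220.0, 230.0, 240.0]

print(estimate_pitch(peaks, candidates))  # 220.0
```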

  8. An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description

    Science.gov (United States)

    2013-10-01

    14. ABSTRACT A Web service /Web interface software package has been engineered to address the need for an automated means to run the Weather Research...An Automated Weather Research and Forecasting (WRF)- Based Nowcasting System: Software Description by Stephen F. Kirby, Brian P. Reen, and...Based Nowcasting System: Software Description Stephen F. Kirby, Brian P. Reen, and Robert E. Dumais Jr. Computational and Information Sciences

  9. Automation of route identification and optimisation based on data-mining and chemical intuition.

    Science.gov (United States)

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reaction set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of the data provides a rich knowledge base for generating the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and for generating environmental and other performance indicators, such as cost indicators. A further challenge identified, however, is to automate model generation so as to evolve optimal multi-step chemical routes and optimal process configurations.
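
    The route-generation step above is, at its core, path-finding over a reaction network. A minimal sketch of that idea, assuming a toy network (the intermediate names are placeholders, not compounds mined from Reaxys), uses breadth-first search to find a shortest reaction route:

```python
# Toy reaction network: substrate -> products reachable in one reaction.
# Compound names are invented placeholders, not the paper's actual routes.
from collections import deque

reactions = {
    "limonene": ["intermediate_A", "intermediate_B"],
    "intermediate_A": ["intermediate_C"],
    "intermediate_B": ["intermediate_C", "paracetamol"],
    "intermediate_C": ["paracetamol"],
}

def shortest_route(network, start, target):
    # Breadth-first search: the first route reaching `target` is shortest
    # in number of reaction steps.
    queue = deque([[start]])
    seen = {start}
    while queue:
        route = queue.popleft()
        if route[-1] == target:
            return route
        for product in network.get(route[-1], []):
            if product not in seen:
                seen.add(product)
                queue.append(route + [product])
    return None

print(shortest_route(reactions, "limonene", "paracetamol"))
# ['limonene', 'intermediate_B', 'paracetamol']
```

    Real route selection would of course weight edges by yield, cost, or environmental indicators rather than counting steps, which is where the optimisation discussed in the abstract comes in.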

  10. AUTOMATION DESIGN FOR MONORAIL - BASED SYSTEM PROCESSES

    Directory of Open Access Journals (Sweden)

    Bunda BESA

    2016-12-01

    Currently, conventional methods of decline development put enormous cost pressure on the profitability of mining operations. This is the case with narrow-vein ore bodies, where current methods and mine designs for decline development may be too expensive to support economic extraction of the ore. According to studies, the time it takes to drill, clean and blast an end in conventional decline development can be up to 224 minutes. This is because once an end is blasted, cleaning must first be completed before drilling can commence, resulting in low advance rates per shift. Improvements in advance rates during decline development can be achieved by application of a drilling system based on the Electric Monorail Transport System (EMTS). The system consists of drilling and loading components that use monorail technology to drill and clean the face during decline development. The two systems work simultaneously at the face in such a way that while the top part of the face is being drilled, the pneumatic loading system cleans the face. However, to improve the efficiency of the two systems, the critical processes they perform during mining operations must be automated. Automation increases safety and productivity, reduces operator fatigue and also reduces the labour costs of the system. The aim of this paper is, therefore, to describe automation designs for the two processes performed by the monorail drilling and loading systems during operations. During automation design, the critical processes performed by the two systems, and the control requirements necessary to allow the two systems to execute such processes automatically, have also been identified.
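
    The gain from working simultaneously at the face is simple arithmetic: overlapping drilling and cleaning replaces their sum with the longer of the two. The 224-minute total comes from the abstract, but the split between activities below is hypothetical, chosen only to make the comparison concrete.

```python
# Illustrative cycle-time arithmetic. The 224-minute conventional total is
# from the record; the per-activity split is a hypothetical assumption.

drill, clean, blast = 90, 104, 30   # hypothetical minutes; sum is 224

sequential = drill + clean + blast       # conventional: clean, then drill
overlapped = max(drill, clean) + blast   # EMTS: drill and clean in parallel

print(sequential, overlapped)  # 224 134
```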

  11. Ground water '89

    International Nuclear Information System (INIS)

    1989-01-01

    The proceedings of the 5th biennial symposium of the Ground Water Division of the Geological Society of South Africa are presented. The theme of the symposium was ground water and mining. Papers were presented on the following topics: ground water resources; ground water contamination; chemical analyses of ground water; and mining and its influence on ground water. Separate abstracts were prepared for 5 of the papers presented. The remaining papers were considered outside the subject scope of INIS.

  12. AUTOMATED INSPECTION OF POWER LINE CORRIDORS TO MEASURE VEGETATION UNDERCUT USING UAV-BASED IMAGES

    Directory of Open Access Journals (Sweden)

    M. Maurer

    2017-08-01

    Power line corridor inspection is a time-consuming task that is performed mostly manually. As UAV development has made huge progress in recent years, and photogrammetric computer vision systems have become well established, it is time to further automate inspection tasks. In this paper we present an automated processing pipeline to inspect vegetation undercuts of power line corridors. For this, the area of inspection is reconstructed, geo-referenced, semantically segmented, and inter-class distance measurements are calculated. The presented pipeline performs an automated selection of the proper 3D reconstruction method for wiry objects (power lines) on the one hand and solid objects (surroundings) on the other. The automated selection is realized by performing pixel-wise semantic segmentation of the input images using a Fully Convolutional Neural Network. Due to the geo-referenced semantic 3D reconstructions, a documentation of areas where maintenance work has to be performed is inherently included in the distance measurements and can be extracted easily. We evaluate the influence of the semantic segmentation on the 3D reconstruction and show that the automated semantic separation of the 3D reconstruction routine into wiry and dense objects improves the quality of the vegetation undercut inspection. We show the generalization of the semantic segmentation to datasets acquired using different acquisition routines and at different seasons.

  13. Automated Inspection of Power Line Corridors to Measure Vegetation Undercut Using Uav-Based Images

    Science.gov (United States)

    Maurer, M.; Hofer, M.; Fraundorfer, F.; Bischof, H.

    2017-08-01

    Power line corridor inspection is a time-consuming task that is performed mostly manually. As UAV development has made huge progress in recent years, and photogrammetric computer vision systems have become well established, it is time to further automate inspection tasks. In this paper we present an automated processing pipeline to inspect vegetation undercuts of power line corridors. For this, the area of inspection is reconstructed, geo-referenced, semantically segmented, and inter-class distance measurements are calculated. The presented pipeline performs an automated selection of the proper 3D reconstruction method for wiry objects (power lines) on the one hand and solid objects (surroundings) on the other. The automated selection is realized by performing pixel-wise semantic segmentation of the input images using a Fully Convolutional Neural Network. Due to the geo-referenced semantic 3D reconstructions, a documentation of areas where maintenance work has to be performed is inherently included in the distance measurements and can be extracted easily. We evaluate the influence of the semantic segmentation on the 3D reconstruction and show that the automated semantic separation of the 3D reconstruction routine into wiry and dense objects improves the quality of the vegetation undercut inspection. We show the generalization of the semantic segmentation to datasets acquired using different acquisition routines and at different seasons.
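
    The "inter-class distance measurement" in these two records reduces, once points are labelled wire or vegetation, to finding the minimum distance between the two point sets. A brute-force sketch with toy coordinates (the points below are invented, and real pipelines would use spatial indexing rather than an all-pairs scan):

```python
# Minimum wire-to-vegetation clearance over labelled 3D points.
# Toy coordinates in metres; brute force for clarity only.
import math

wire = [(0.0, 0.0, 20.0), (5.0, 0.0, 20.0), (10.0, 0.0, 20.0)]
vegetation = [(5.0, 0.0, 12.0), (20.0, 5.0, 3.0)]

def min_clearance(class_a, class_b):
    # Smallest Euclidean distance between any point pair across classes.
    return min(math.dist(p, q) for p in class_a for q in class_b)

clearance = min_clearance(wire, vegetation)
print(round(clearance, 2))  # 8.0
```

    Flagging maintenance areas then amounts to thresholding this clearance against the required vegetation undercut.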

  14. Experimental Research of Machineless Energy Separation Effect Influenced by Shock Waves

    Directory of Open Access Journals (Sweden)

    S. S. Popovich

    2016-01-01

    The paper presents experimental results on the machineless energy separation effect with transversal ribs in a supersonic channel. The energy separation effect is a physical division of the inlet flow into two or more flows, each having a different stagnation temperature. Well-known energy separation effects include Ranque-Hilsch vortex tubes, Hartmann-Sprenger resonance tubes, pulsating tubes and some others. The working principle of the device under study is based on thermal interaction between subsonic and supersonic gas flows through a heat-conducting dividing wall. This energy separation method was proposed by academician Leontiev and was patented in 1998. A number of PhD theses, articles, and conference proceedings devoted to research on the "Leontiev tube" are cited in the paper. Efficiency factors for the performance of the energy separation device have been analyzed in detail. The main attention was focused on the phenomenon of shock wave generation in the supersonic channel of the Leontiev tube. The experiment was carried out in an air prototype of the energy separation device with supersonic flow Mach numbers of 1.9 and 2.5, stagnation temperatures of 40°C and 70°C, and uni-flow and counter-flow air directions in the subsonic and supersonic channels. Shock waves were generated by means of circular ribs in the supersonic channel of the energy separation device. The research was carried out by means of infrared thermal imaging, thermocouples, total and static pressure probes, and modern National Instruments automation equipment. The work shows that shock waves have no negative influence on the energy separation effect. It is concluded that unexpected shock wave generation in the supersonic channel will not cause loss of operability. It was found that the counter-flow regime is more efficient than the uni-flow regime. The energy separation effect also increases with Mach number and initial flow stagnation temperature.
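
    As a grounded aside on why Mach number matters here: by the standard compressible-flow relation (textbook gas dynamics, not data from this experiment), the stagnation-to-static temperature ratio grows with Mach number, so the supersonic stream's recovery temperature diverges further from its static temperature at the Mach numbers tested.

```python
# Standard adiabatic relation: T0/T = 1 + (gamma - 1)/2 * M^2.
# Evaluated at the two Mach numbers mentioned in the record.

gamma = 1.4  # ratio of specific heats for air

def stagnation_ratio(mach):
    return 1.0 + (gamma - 1.0) / 2.0 * mach ** 2

for mach in (1.9, 2.5):
    print(mach, round(stagnation_ratio(mach), 3))
# 1.9 1.722
# 2.5 2.25
```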

  15. A rule-based smart automated fertilization and irrigation systems

    Science.gov (United States)

    Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    Smart automation in industry has become very important, as it can improve the reliability and efficiency of systems. The use of smart technologies in agriculture has increased over the years to ensure and control crop production and address food security. However, it is important to use proper irrigation systems to avoid water wastage and overfeeding of the plants. In this paper, a Smart Rule-based Automated Fertilization and Irrigation System is proposed and evaluated. We propose a rule-based decision-making algorithm to monitor and control the food supply to the plant and the soil quality. A built-in alert system is also used to update the farmer via text message. The system is developed and evaluated using real hardware.
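
    A rule-based controller of the kind described maps sensor readings to actions through fixed if-then rules. The sketch below is a minimal, hypothetical version: the thresholds, sensor names, and actions are invented for illustration and are not the paper's rule set.

```python
# Hypothetical rule set for a fertilization/irrigation controller.
# Thresholds and action names are assumptions, not the paper's values.

def decide(moisture, nutrient):
    """Map sensor readings (arbitrary 0-100 scales) to actions."""
    actions = []
    if moisture < 30:
        actions.append("irrigate")
    if nutrient < 20:
        actions.append("fertilize")
    if moisture < 10:
        actions.append("alert_farmer")  # the paper's text-message alert
    return actions

print(decide(moisture=8, nutrient=25))   # ['irrigate', 'alert_farmer']
print(decide(moisture=50, nutrient=15))  # ['fertilize']
```

    Keeping the rules as plain data-to-action mappings is what makes such controllers easy to audit and to run on modest embedded hardware.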

  16. Oak ridge national laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples, currently performed by overburdened chemists, with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods reduced the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns, as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs

  17. Electrostatic separation of paper and plastics; Separacion por medios electrostaticos de papel, carton y plastico en diferentes calidades

    Energy Technology Data Exchange (ETDEWEB)

    Larrauri, E.; Miguel, R.; Arnaiz, S.; Robertson, C.; Smallwood, J.; Coit, J.; Kohnlecher, R.; Ufer, R.; Evangelou, M.; Karapatakis, S.; Kasi, M.

    1998-12-31

    Development of automated separation technology is essential for increasing recovery rates, particularly from highly mixed and dirty sources such as municipal solid wastes, and for reducing recycling costs. In this framework, the Gaiker Technological Centre (Spain), Era Technology Ltd. (United Kingdom), Hamos GmbH (Germany) and Komotini Paper Mill (Greece) became involved and collaborated with several European partners in the development of generic automated separation and grading of solid materials based on electrostatic techniques. Results derived from this original work are now being successfully applied by industry to the pilot-scale separation and grading of paper and plastic from mixed input streams. The electrostatic separation devices developed are protected under European patents. The European Commission has financed this work under the Brite Euram Programme. (Author) 5 refs.

  18. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human error and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is well suited to an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic-bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts), with no evidence of loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Safeguards through secure automated fabrication

    International Nuclear Information System (INIS)

    DeMerschman, A.W.; Carlson, R.L.

    1982-01-01

    Westinghouse Hanford Company, a prime contractor for the U.S. Department of Energy, is constructing the Secure Automated Fabrication (SAF) line for fabrication of mixed-oxide breeder fuel pins. Automated fuel processing, which separates personnel from fuel handling, will provide a means of introducing advanced safeguards concepts. Remote operations and the inter-tie between the process computer and the safeguards computer are discussed.

  20. Summer planetary-scale oscillations: aura MLS temperature compared with ground-based radar wind

    Directory of Open Access Journals (Sweden)

    C. E. Meek

    2009-04-01

    The advent of satellite-based sampling brings with it the opportunity to examine virtually any part of the globe. Aura MLS mesospheric temperature data are analysed in a wavelet format for easy identification of possible planetary waves (PW) and aliases masquerading as PW. A calendar year, 2005, of eastward, stationary, and westward waves at a selected latitude is shown in separate panels for the wave number range −3 to +3 and the period range 8 h to 30 days (d). Such a wavelet analysis is made possible by Aura's continuous sampling at all latitudes 82° S–82° N. The data presentation is suitable for examination of years of data. However, this paper focuses on the striking feature of a "dish-shaped" upper limit to periods near 2 d in mid-summer, with longer periods appearing towards spring and fall, a feature also commonly seen in radar winds. The most probable cause is suggested to be filtering by the summer jet at 70–80 km, the latter being available from ground-based medium-frequency radar (MFR). Classically, the phase velocity of a wave must be greater than that of the jet in order to propagate through it. As an attempt to directly relate satellite and ground-based sampling, a PW event with a period of 8 d and wave number 2, which appears to be a genuine wave rather than an alias, is compared with ground-based radar wind data. An appendix discusses characteristics of satellite data aliases with regard to their periods and amplitudes.
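
    The filtering criterion quoted above is easy to make quantitative: the zonal phase speed of a planetary wave is the circumference of the latitude circle divided by (wave number × period), and propagation through the jet classically requires this speed to exceed the jet speed. The latitude and jet speed below are illustrative assumptions, not values from the paper.

```python
# Zonal phase speed of a planetary wave vs. a hypothetical jet speed.
# Latitude (52 deg) and jet speed (40 m/s) are illustrative assumptions.
import math

R_EARTH = 6.371e6  # mean Earth radius, m

def phase_speed(wavenumber, period_days, lat_deg):
    # Circumference of the latitude circle divided by (s * T).
    circumference = 2 * math.pi * R_EARTH * math.cos(math.radians(lat_deg))
    return circumference / (wavenumber * period_days * 86400.0)

c = phase_speed(wavenumber=2, period_days=8, lat_deg=52)
print(round(c, 1))   # 17.8 (m/s)
print(c > 40.0)      # False: blocked by a 40 m/s jet under this criterion
```

    This is consistent with the "dish-shaped" picture: short-period (fast) waves pass the mid-summer jet, while long-period (slow) waves such as this 8 d example would be filtered out.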

  1. Separations Science Data Base: an abstractor's manual

    International Nuclear Information System (INIS)

    Roddy, J.W.; McDowell, W.J.; Michelson, D.C.

    1981-07-01

    The Separations Science Data Base, designed specifically for the retrieval of information needed in chemical separations problems (i.e., how to perform a given separation under given conditions), is described. The procedure for entering records into the data base is given. The initial entries are concerned primarily with liquid-liquid extraction and liquid-solid ion exchange methods for metal ions and salts; however, the data base is constructed so that almost any separations process can be accommodated. Each record is indexed with information provided under the following fields: author; title; publication source; date of publication; organization performing and/or sponsoring the work; brief abstract of the work; abstract number if the work has been so referenced, and/or abstractor's initials; type of separation system used (e.g., flotation); specific or generic name of the separation agent used (e.g., acetylacetone); list of substances separated (e.g., gold, copper); qualitative description of the supporting medium or matrix containing the substances before separation (e.g., nitrate); type of literature where the article was printed (e.g., book); and type of information that the article contains. Each of these fields may be searched independently of the others (or in combination), and the last six fields contain specific key words that are listed on the input form. Definitions are provided for the 39 information terms
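
    The record describes a set of independently searchable, combinable fields. A minimal sketch of such field-indexed retrieval (the two records below are invented examples, though the field names follow the abstract):

```python
# Field-indexed search over separations records; records are invented
# illustrations, field names follow the abstract above.

records = [
    {"system": "flotation", "agent": "acetylacetone",
     "separated": ["gold", "copper"], "matrix": "nitrate"},
    {"system": "ion exchange", "agent": "Dowex 50",
     "separated": ["cesium"], "matrix": "chloride"},
]

def search(records, **criteria):
    # Each field may be searched independently or in combination.
    def matches(record, field, value):
        stored = record.get(field)
        return value in stored if isinstance(stored, list) else stored == value
    return [r for r in records
            if all(matches(r, f, v) for f, v in criteria.items())]

print(len(search(records, system="flotation", separated="gold")))  # 1
print(len(search(records, matrix="chloride")))                     # 1
```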

  2. Automated voxel-based analysis of brain perfusion SPECT for vasospasm after subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Iwabuchi, S.; Yokouchi, T.; Hayashi, M.; Kimura, H.; Tomiyama, A.; Hirata, Y.; Saito, N.; Harashina, J.; Nakayama, H.; Sato, K.; Aoki, K.; Samejima, H.; Ueda, M.; Terada, H.; Hamazaki, K.

    2008-01-01

    We evaluated regional cerebral blood flow (rCBF) during vasospasm after subarachnoid haemorrhage (SAH) using automated voxel-based analysis of brain perfusion single-photon emission computed tomography (SPECT). Brain perfusion SPECT was performed 7 to 10 days after the onset of SAH. The automated voxel-based analysis of SPECT used a Z-score map that was calculated by comparing the patient's data with a control database. In cases where computed tomography (CT) scans detected an ischemic region due to vasospasm, automated voxel-based analysis of brain perfusion SPECT revealed dramatically reduced rCBF (Z-score ≤ -4). No patients with mildly or moderately diminished rCBF (Z-score > -3) progressed to cerebral infarction. Some patients with a Z-score < -4 did not progress to cerebral infarction after active treatment with angioplasty. Three-dimensional images provided detailed anatomical information and helped us to distinguish surgical sequelae from vasospasm. In conclusion, automated voxel-based analysis of brain perfusion SPECT using a Z-score map is helpful in evaluating decreased rCBF due to vasospasm. (author)
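
    The Z-score map amounts to a per-voxel standardization of the patient's perfusion values against a control database. A toy per-voxel sketch (the numbers are invented; real pipelines first spatially normalize the images to a common template):

```python
# Per-voxel Z-score against a control database; toy numbers only.

def z_score(patient, control_mean, control_sd):
    return (patient - control_mean) / control_sd

# (patient rCBF, control mean, control SD) for two illustrative voxels.
voxels = [(20.0, 50.0, 7.0), (45.0, 50.0, 7.0)]

for patient, mean, sd in voxels:
    z = z_score(patient, mean, sd)
    flagged = z <= -4  # threshold the record associates with infarction risk
    print(round(z, 2), flagged)
```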

  3. Automated rapid chemistry in heavy element research

    International Nuclear Information System (INIS)

    Schaedel, M.

    1994-01-01

    With the increasingly short half-lives of the heavy element isotopes in the transition region from the heaviest actinides to the transactinide elements, the demand for automated rapid chemistry techniques is also increasing. Separation times of significantly less than one minute, high chemical yields, high repetition rates, and an adequate detection system are prerequisites for many successful experiments in this field. The development of techniques for separations in the gas phase and in the aqueous phase for chemical and nuclear studies of the heaviest elements is briefly outlined. Typical examples of results obtained with automated techniques are presented for studies up to element 105, especially those obtained with the Automated Rapid Chemistry Apparatus, ARCA. The prospects for investigating the properties of even heavier elements with chemical techniques are discussed.

  4. Evaluation of Novel Wet Chemistry Separation and Purification Methods to Facilitate Automation of Astatine-­211 Isolation

    Energy Technology Data Exchange (ETDEWEB)

    Wilbur, Daniel Scott [Univ. of Washington, Seattle, WA (United States)

    2016-07-19

    211At solutions did not appear to change the percent capture, but may have an effect on the % extracted; There was some indication that the PEG-­Merrifield resins could be saturated (perhaps with Bi) resulting in lower capture percentages, but more studies need to be done to confirm that; A target dissolution chamber, designed and built at PNNL, works well with syringe pumps so it can be used in an automated system; Preliminary semi-­automated 211At isolation studies have been conducted with full-scale target dissolution and 211At isolation using a PEG column on the Hamilton automated system gave low overall recoveries, but HNO3 was used (rather than HCl) for loading the 211At and flow rates were not optimized; Results obtained using PEG columns are high enough to warrant further development on a fully automated system; Results obtained also indicate that additional studies are warranted to evaluate other types of columns for 211At separation from bismuth, which allow use of HNO3/HCl mixtures for loading and NaOH for eluting 211At. Such a column could greatly simplify the overall isolation process and make it easier to automate.

  5. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent throughout the aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, in the case of EA-DLLME the addition of a dispersive solvent, as well as the time-consuming centrifugation step for disruption of the cloudy state, is avoided. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on the stepwise injection analysis manifold in a flow batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.

  6. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    International Nuclear Information System (INIS)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid–liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into a stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton-donor solution (1 mol L⁻¹ CH₃COOH). Carbon dioxide microbubbles generated in situ disperse the extraction solvent throughout the aqueous sample and extract the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step otherwise needed to disrupt the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at 345 nm obeys Beer's law in the range 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate effervescence-assisted dispersive liquid–liquid microextraction. • Automation based on a stepwise injection analysis manifold in a flow-batch system. • Counterflow injection of the extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.
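
The calibration arithmetic implied by the abstract — a Beer's-law line over the working range and an LOD taken as 3σ of the blank divided by the calibration slope — can be sketched as below. The function names and example numbers are illustrative assumptions, not values from the paper.

```python
import statistics

def calibration_slope(concentrations, absorbances):
    """Least-squares slope through the origin for a Beer's-law line A = m * C."""
    numerator = sum(c * a for c, a in zip(concentrations, absorbances))
    denominator = sum(c * c for c in concentrations)
    return numerator / denominator

def limit_of_detection(blank_absorbances, slope):
    """LOD from the 3-sigma criterion: 3 * s(blank) / slope."""
    s_blank = statistics.stdev(blank_absorbances)
    return 3.0 * s_blank / slope
```

With a slope estimated from standards and repeated blank readings, `limit_of_detection` returns the smallest concentration distinguishable from the blank under the 3σ convention.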

  7. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, far fewer crystal structures are available than biological sequences, which makes structure-based drug discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening, followed by pharmacology-based pathway prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding-site identification, docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps characterize ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. A judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  8. Automated knowledge base development from CAD/CAE databases

    Science.gov (United States)

    Wright, R. Glenn; Blanchard, Mary

    1988-01-01

    Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.
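A toy illustration of the idea of deriving knowledge-base content from structured CAD/CAE data is a direct mapping of attribute records into production rules; the record fields and rule shape below are assumptions for illustration, not the paper's actual representation.

```python
def rules_from_records(records):
    """Turn CAD/CAE attribute records into simple lookup rules
    of the form IF (part, attribute) THEN value."""
    rules = {}
    for rec in records:
        rules[(rec["part"], rec["attribute"])] = rec["value"]
    return rules

def infer(rules, part, attribute):
    """Answer a query against the generated rule base; None if no rule fires."""
    return rules.get((part, attribute))
```

Because the rules are generated mechanically from the design database rather than hand-encoded, the resulting knowledge base is consistent with the source data by construction, which is the error-freedom argument the abstract makes.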

  9. A peristaltic pump driven 89Zr separation module

    DEFF Research Database (Denmark)

    Siikanen, J.; Peterson, M.; Tran, T.

    2012-01-01

    To facilitate the separation of 89Zr produced in yttrium foils, an automated separation module was designed and assembled. The module separates more than 85% of the produced 89Zr activity in 3 g foils in less than 90 min. About 10% remains in the dissolving vial. The quality of the separated 89Zr...

  10. Practical automation for mature producing areas

    International Nuclear Information System (INIS)

    Luppens, J.C.

    1995-01-01

    Successful installation and operation of supervisory control and data acquisition (SCADA) systems on two US gulf coast platforms prompted the installation of the first SCADA, or automation, system in Oklahoma in 1989. The initial installation consisted of four remote terminal units (RTU's) at four beam-pumped leases and a PC-based control system communicating by means of a 900-MHz data repeater. This first installation was a building block for additional wells to be automated, and additional systems, consisting of RTU's, a PC, and a data repeater, were then installed. By the end of 1992 there were 98 RTU's operating on five separation systems, and additional RTU's are being installed on a regular basis. This paper outlines the logical development of automation systems on properties in Oklahoma operated by Phillips Petroleum Co. The factors critical to the success of the effort are (1) designing data-gathering and control capability in conjunction with the field operations staff to meet and not exceed their needs; (2) selection of a computer operating system and automation software package; (3) selection of computer, RTU, and end-device hardware; and (4) continuous involvement of the field operations staff in the installation, operation, and maintenance of the systems. Additionally, specific tangible and intangible results are discussed

  11. Estimation of Separation Buffers for Wind-Prediction Error in an Airborne Separation Assistance System

    Science.gov (United States)

    Consiglio, Maria C.; Hoadley, Sherwood T.; Allen, B. Danette

    2009-01-01

    Wind prediction errors are known to affect the performance of automated air traffic management tools that rely on aircraft trajectory predictions. In particular, automated separation assurance tools, planned as part of the NextGen concept of operations, must be designed to account for and compensate for the impact of wind prediction errors and other system uncertainties. In this paper we describe a high-fidelity batch simulation study designed to estimate the separation distance required to compensate for the effects of wind-prediction errors in an airborne separation assistance system under increasing traffic density. These experimental runs are part of the Safety Performance of Airborne Separation experiment suite that examines the safety implications of prediction errors and system uncertainties on airborne separation assurance systems. In this experiment, wind-prediction errors were varied between zero and forty knots while traffic density was increased to several times current traffic levels. In order to accurately measure the full unmitigated impact of wind-prediction errors, no uncertainty buffers were added to the separation minima. The goal of the study was to measure the impact of wind-prediction errors in order to estimate the additional separation buffers necessary to preserve separation and to provide a baseline for future analyses. Buffer estimations from this study will be used and verified in upcoming safety evaluation experiments under similar simulation conditions. Results suggest that the strategic airborne separation functions exercised in this experiment can sustain wind prediction errors of up to 40 kt at current-day air traffic density with no additional separation distance buffer, and at eight times current-day density with no more than a 60% increase in the separation distance buffer.
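
One simple way to turn batch-simulation results of this kind into a buffer estimate is to take a high percentile of the shortfall between predicted and realized miss distances. This is an illustrative sketch under assumed data shapes, not the experiment's actual analysis.

```python
def estimate_buffer(predicted_miss_nm, actual_miss_nm, percentile=99.0):
    """Empirical separation buffer: the given percentile of the shortfall
    (predicted minus realized miss distance, floored at zero) over all
    simulated encounters. Inflating the separation minima by this amount
    covers that fraction of the wind-induced prediction errors."""
    shortfalls = sorted(
        max(0.0, p - a) for p, a in zip(predicted_miss_nm, actual_miss_nm)
    )
    # nearest-rank percentile over the sorted shortfalls
    k = min(len(shortfalls) - 1, round(percentile / 100.0 * (len(shortfalls) - 1)))
    return shortfalls[k]
```

Fed with the unbuffered runs described in the abstract, such a statistic would directly yield the "additional separation buffer" the study sets out to estimate.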

  12. Monitoring geospace disturbances through coordinated space-borne and ground-based magnetometer observations

    Science.gov (United States)

    Balasis, Georgios

    2014-05-01

    Recently, automated methods for deriving the characteristics of ultra-low-frequency (ULF) waves in the magnetosphere have been developed (Balasis et al., 2012, 2013); these can be effectively applied to the huge datasets from the new ESA Swarm mission in order to retrieve, on an operational basis, new information about the near-Earth electromagnetic environment. Processing Swarm measurements with these methods will help to elucidate the processes influencing the generation and propagation of ULF waves, which in turn play a crucial role in magnetospheric dynamics. Moreover, a platform based on a combination of wavelet transforms and artificial neural networks has been developed to monitor the wave evolution from the outer boundaries of Earth's magnetosphere through the topside ionosphere down to the surface. Data from a Low Earth Orbit (LEO) satellite (CHAMP) and two magnetospheric missions (Cluster and Geotail), along with three ground-based magnetic networks (CARISMA, GIMA and IMAGE), during the Halloween 2003 magnetic superstorm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, are used to demonstrate the potential of the analysis technique for studying wave evolution in detail.
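
The wavelet side of such a monitoring platform can be illustrated with a direct-convolution continuous wavelet transform. This minimal sketch (a complex Morlet wavelet with arbitrary parameter choices) is an assumption for illustration, not the cited authors' implementation.

```python
import cmath, math

def morlet(t, scale, w0=6.0):
    """Complex Morlet wavelet sample at time t for a given scale,
    with 1/sqrt(scale) amplitude normalization."""
    x = t / scale
    return (math.pi ** -0.25) * cmath.exp(1j * w0 * x) * math.exp(-0.5 * x * x) / math.sqrt(scale)

def cwt_power(signal, dt, scales):
    """Wavelet power |W(scale, t)|^2 by direct convolution, one row per scale."""
    n = len(signal)
    power = []
    for s in scales:
        half = int(4 * s / dt)  # truncate the wavelet support at ~4 scales
        row = []
        for i in range(n):
            acc = 0j
            for k in range(max(0, i - half), min(n, i + half + 1)):
                acc += signal[k] * morlet((k - i) * dt, s).conjugate() * dt
            row.append(abs(acc) ** 2)
        power.append(row)
    return power
```

A narrowband ULF pulsation shows up as a ridge of power at the scale matching its frequency (roughly w0 / (2π · scale) for the Morlet wavelet), which is what an automated detector would track through time.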

  13. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    Science.gov (United States)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In support of this effort, problem-definition activities, including operational familiarization exercises, were conducted to characterize the terminal area navigation problem. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed from reviews of relevant literature and from experience gained in the design effort.

  14. Characterizing the Vertical Distribution of Aerosols using Ground-based Multiwavelength Lidar Data

    Science.gov (United States)

    Ferrare, R. A.; Thorsen, T. J.; Clayton, M.; Mueller, D.; Chemyakin, E.; Burton, S. P.; Goldsmith, J.; Holz, R.; Kuehn, R.; Eloranta, E. W.; Marais, W.; Newsom, R. K.; Liu, X.; Sawamura, P.; Holben, B. N.; Hostetler, C. A.

    2016-12-01

    Observations of aerosol optical and microphysical properties are critical for developing and evaluating aerosol transport model parameterizations and assessing global aerosol-radiation impacts on climate. During the Combined HSRL And Raman lidar Measurement Study (CHARMS), we investigated the synergistic use of ground-based Raman lidar and High Spectral Resolution Lidar (HSRL) measurements to retrieve aerosol properties aloft. Continuous (24/7) operation of these co-located lidars during the ten-week CHARMS mission (mid-July through September 2015) allowed the acquisition of a unique, multiwavelength ground-based lidar dataset for studying aerosol properties above the Southern Great Plains (SGP) site. The ARM Raman lidar measured profiles of aerosol backscatter, extinction and depolarization at 355 nm as well as profiles of water vapor mixing ratio and temperature. The University of Wisconsin HSRL simultaneously measured profiles of aerosol backscatter, extinction and depolarization at 532 nm and aerosol backscatter at 1064 nm. Recent advances in both lidar retrieval theory and algorithm development demonstrate that vertically-resolved retrievals using such multiwavelength lidar measurements of aerosol backscatter and extinction can help constrain both the aerosol optical properties (e.g. complex refractive index, scattering) and microphysical properties (e.g. effective radius, concentrations) as well as provide qualitative aerosol classification. Based on this work, the NASA Langley Research Center (LaRC) HSRL group developed automated algorithms for classifying and retrieving aerosol optical and microphysical properties, demonstrated these retrievals using data from the unique NASA/LaRC airborne multiwavelength HSRL-2 system, and validated the results using coincident airborne in situ data. We apply these algorithms to the CHARMS multiwavelength (Raman+HSRL) lidar dataset to retrieve aerosol properties above the SGP site. We present some profiles of aerosol effective

  15. Triple-channel portable capillary electrophoresis instrument with individual background electrolytes for the concurrent separations of anionic and cationic species

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Thanh Duc; Le, Minh Duc [Centre for Environmental Technology and Sustainable Development (CETASD), Hanoi University of Science, Nguyen Trai Street 334, Hanoi (Viet Nam); Sáiz, Jorge [Department of Analytical Chemistry, Physical Chemistry and Chemical Engineering, University of Alcalá, Ctra. Madrid-Barcelona Km 33.6, Alcalá de Henares, Madrid (Spain); Duong, Hong Anh [Centre for Environmental Technology and Sustainable Development (CETASD), Hanoi University of Science, Nguyen Trai Street 334, Hanoi (Viet Nam); Koenka, Israel Joel [University of Basel, Department of Chemistry, Spitalstrasse 51, 4056 Basel (Switzerland); Pham, Hung Viet, E-mail: phamhungviet@hus.edu.vn [Centre for Environmental Technology and Sustainable Development (CETASD), Hanoi University of Science, Nguyen Trai Street 334, Hanoi (Viet Nam); Hauser, Peter C., E-mail: Peter.Hauser@unibas.ch [University of Basel, Department of Chemistry, Spitalstrasse 51, 4056 Basel (Switzerland)

    2016-03-10

    The portable capillary electrophoresis instrument is automated and features three independent channels with different background electrolytes to allow the concurrent optimized determination of three different categories of charged analytes. The fluidic system is based on a miniature manifold with mechanically milled channels for the injection of samples and buffers. The planar manifold pattern was designed to minimize the number of electronic valves required for each channel. The system uses pneumatic pressurization to transport solutions at the grounded as well as the high-voltage side of the separation capillaries. The instrument has a compact design, with all components arranged in a briefcase with dimensions of 45 (w) × 35 (d) × 15 cm (h) and a weight of about 15 kg. It can operate continuously for 8 h in battery-powered mode if only one electrophoresis channel is in use, or for about 2.5 h if all three channels are employed simultaneously. The different operations, i.e. capillary flushing, rinsing of the interfaces at both capillary ends, sample injection and electrophoretic separation, are activated automatically by a control program featuring a graphical user interface. For demonstration, the system was employed successfully for the concurrent separation of different inorganic cations and anions, organic preservatives, additives and artificial sweeteners in various beverage and food matrices. - Highlights: • The use of parallel channels allows the concurrent separation of different classes of analytes. • Separate background electrolytes allow individual optimization. • The instrument is compact and field portable.

  16. Triple-channel portable capillary electrophoresis instrument with individual background electrolytes for the concurrent separations of anionic and cationic species

    International Nuclear Information System (INIS)

    Mai, Thanh Duc; Le, Minh Duc; Sáiz, Jorge; Duong, Hong Anh; Koenka, Israel Joel; Pham, Hung Viet; Hauser, Peter C.

    2016-01-01

    The portable capillary electrophoresis instrument is automated and features three independent channels with different background electrolytes to allow the concurrent optimized determination of three different categories of charged analytes. The fluidic system is based on a miniature manifold with mechanically milled channels for the injection of samples and buffers. The planar manifold pattern was designed to minimize the number of electronic valves required for each channel. The system uses pneumatic pressurization to transport solutions at the grounded as well as the high-voltage side of the separation capillaries. The instrument has a compact design, with all components arranged in a briefcase with dimensions of 45 (w) × 35 (d) × 15 cm (h) and a weight of about 15 kg. It can operate continuously for 8 h in battery-powered mode if only one electrophoresis channel is in use, or for about 2.5 h if all three channels are employed simultaneously. The different operations, i.e. capillary flushing, rinsing of the interfaces at both capillary ends, sample injection and electrophoretic separation, are activated automatically by a control program featuring a graphical user interface. For demonstration, the system was employed successfully for the concurrent separation of different inorganic cations and anions, organic preservatives, additives and artificial sweeteners in various beverage and food matrices. - Highlights: • The use of parallel channels allows the concurrent separation of different classes of analytes. • Separate background electrolytes allow individual optimization. • The instrument is compact and field portable.

  17. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as shown by the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and quantitative network metrics, this agent-based modelling paper shows how the driver remains an integral part of the driving system, implying that designers must provide drivers with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate control to automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command-and-control system of automation, using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does keep the driver in the control loop.
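
A minimal version of the kind of quantitative network metric used in such an analysis is normalized degree centrality over the agent and task links; the toy nodes and links below are hypothetical, not the paper's actual Cruise Assist networks.

```python
def degree_centrality(edges):
    """Normalized degree centrality for an undirected network given as
    (node, node) links: each node's degree divided by the maximum
    possible degree, n - 1."""
    nodes = set()
    for a, b in edges:
        nodes.update((a, b))
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    norm = max(len(nodes) - 1, 1)
    return {n: d / norm for n, d in degree.items()}
```

In a task/social network of this kind, a driver node whose centrality stays high relative to the automated subsystems is the quantitative signature of the driver remaining "actively in-the-loop".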

  18. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) in model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance: in routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented in three model-based tools that predict operator performance in different systems: a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance under the same conditions. The third validation compared model-predicted and empirically determined times to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems, including the effects of different automation designs.

  19. Automated waste canister docking and emplacement using a sensor-based intelligent controller

    International Nuclear Information System (INIS)

    Drotning, W.D.

    1992-08-01

    A sensor-based intelligent control system is described that utilizes a multiple degree-of-freedom robotic system for the automated remote manipulation and precision docking of large payloads such as waste canisters. Computer vision and ultrasonic proximity sensing are used to control the automated precision docking of a large object with a passive target cavity. Real-time sensor processing and model-based analysis are used to control payload position to a precision of ± 0.5 millimeter
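
The precision-docking control loop can be caricatured as a proportional correction of the sensed offset, iterated until the ±0.5 mm tolerance is met; the gain, units, and simple kinematics below are illustrative assumptions, not the paper's controller.

```python
def docking_step(position, target, gain=0.5, tolerance=0.0005):
    """One control step toward a docking target (units: meters).

    Computes the offset sensed between the payload and the target cavity,
    reports success if every axis is within the +/-0.5 mm tolerance, and
    otherwise moves the payload a fraction (gain) of the offset."""
    offset = [t - p for p, t in zip(position, target)]
    if all(abs(o) <= tolerance for o in offset):
        return position, True
    return [p + gain * o for p, o in zip(position, offset)], False
```

Because each step removes a fixed fraction of the remaining offset, the loop converges geometrically; in practice the "sensed" offset would come from the vision and ultrasonic proximity measurements the abstract describes.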

  20. Development of a web-based CANDU core management procedures automation system

    International Nuclear Information System (INIS)

    Lee, S.; Park, D.; Yeom, C.; Suh, H.

    2007-01-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application that semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant-flow estimation and thermal-power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes recent .NET computing technologies such as ASP.NET, smart clients and web services. Since almost all functions are abstracted from the experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  1. Development of a web-based CANDU core management procedures automation system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.; Park, D.; Yeom, C. [Inst. for Advanced Engineering (IAE), Yongin (Korea, Republic of); Suh, H. [Korea Hydro and Nuclear Power (KHNP), Wolsong (Korea, Republic of)

    2007-07-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application that semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant-flow estimation and thermal-power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes recent .NET computing technologies such as ASP.NET, smart clients and web services. Since almost all functions are abstracted from the experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  2. Hydrogeology, simulated ground-water flow, and ground-water quality, Wright-Patterson Air Force Base, Ohio

    Science.gov (United States)

    Dumouchelle, D.H.; Schalk, C.W.; Rowe, G.L.; De Roche, J.T.

    1993-01-01

    Ground water is the primary source of water in the Wright-Patterson Air Force Base area. The aquifer consists of glacial sands and gravels that fill a buried bedrock-valley system. Consolidated rocks in the area consist of poorly permeable Ordovician shale of the Richmondian stage and, in the upland areas, the Brassfield Limestone of Silurian age. The valleys are filled with glacial sediments of Wisconsinan age consisting of clay-rich tills and coarse-grained outwash deposits. Estimates of the hydraulic conductivity of the shales, based on results of displacement/recovery tests, range from 0.0016 to 12 feet per day; estimates for the glacial sediments range from less than 1 foot per day to more than 1,000 feet per day. Ground water flows from the uplands toward the valleys and the major rivers in the region, the Great Miami and the Mad Rivers. Hydraulic-head data indicate that ground water flows between the bedrock and unconsolidated deposits. Data from a gain/loss study of the Mad River system and hydrographs from nearby wells reveal that the reach of the river next to Wright-Patterson Air Force Base is a ground-water discharge area. A steady-state, three-dimensional ground-water-flow model was developed to simulate ground-water flow in the region. The model contains three layers and encompasses about 100 square miles centered on Wright-Patterson Air Force Base. Ground water enters the modeled area primarily by river leakage and underflow at the model boundary. Ground water exits the modeled area primarily by flow through the valleys at the model boundaries and through production wells. A model sensitivity analysis involving systematic changes in values of hydrologic parameters indicates that the model is most sensitive to decreases in riverbed conductance and vertical conductance between the upper two layers. The analysis also indicates that the contribution of water to the buried-valley aquifer from the bedrock that forms the valley walls is about 2 to 4

  3. Data Mining for Understanding and Improving Decision-making Affecting Ground Delay Programs

    Science.gov (United States)

    Kulkarni, Deepak; Wang, Yao; Sridhar, Banavar

    2013-01-01

    The continuous growth in the demand for air transportation results in an imbalance between airspace capacity and traffic demand. The airspace capacity of a region depends on the ability of the system to maintain safe separation between aircraft in the region. In addition to growing demand, the airspace capacity is severely limited by convective weather. During such conditions, traffic managers at the FAA's Air Traffic Control System Command Center (ATCSCC) and dispatchers at various Airlines' Operations Center (AOC) collaborate to mitigate the demand-capacity imbalance caused by weather. The end result is the implementation of a set of Traffic Flow Management (TFM) initiatives such as ground delay programs, reroute advisories, flow metering, and ground stops. Data Mining is the automated process of analyzing large sets of data and then extracting patterns in the data. Data mining tools are capable of predicting behaviors and future trends, allowing an organization to benefit from past experience in making knowledge-driven decisions.

  4. Modal-pushover-based ground-motion scaling procedure

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2011-01-01

    Earthquake engineering increasingly uses nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure for scaling ground motions for use in nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-mode pushover analysis. Appropriate for first-mode-dominated structures, this approach is extended to structures with significant contributions of higher modes by considering the elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
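
For a linear-elastic SDF stand-in, the response scales linearly with record amplitude, so a scale factor matching a target deformation follows in closed form (the MPS method itself matches *inelastic* deformation, which requires iteration). This sketch, with a simple semi-implicit Euler integrator, is an illustration under those stated assumptions, not the paper's algorithm.

```python
import math

def sdf_peak_deformation(ground_accel, dt, period, damping=0.05):
    """Peak deformation of a linear, mass-normalized SDF oscillator under a
    base-acceleration record, integrated with a semi-implicit Euler scheme."""
    wn = 2.0 * math.pi / period
    u = v = peak = 0.0
    for ag in ground_accel:
        a = -ag - 2.0 * damping * wn * v - wn * wn * u
        v += a * dt
        u += v * dt
        peak = max(peak, abs(u))
    return peak

def scale_factor_to_target(ground_accel, dt, period, target_deformation):
    """Factor that scales the record so the elastic SDF peak deformation
    matches the target value (exact for a linear system)."""
    return target_deformation / sdf_peak_deformation(ground_accel, dt, period)
```

The linearity that makes this one-shot calculation work is precisely what breaks down for the inelastic first-mode system used in MPS, which is why the method scales records to match the target only to a specified tolerance.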

  5. Altering user acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  6. HAUTO: Automated composition of convergent services based in HTN planning

    Directory of Open Access Journals (Sweden)

    Armando Ordoñez

    2014-01-01

    Full Text Available This paper presents HAUTO, a framework able to compose convergent services automatically. HAUTO is based on HTN (hierarchical task network) automated planning and comprises three modules: a request-processing module that transforms natural language and context information into a planning instance, the automated composition module based on HTN planning, and the execution environment for convergent (web and telecom) services. The integration of a planning component provides two basic functionalities: the possibility of customizing the composition of services using the user's context information, and a middleware level that integrates the execution of services in high-performance telecom environments. Finally, a prototype in environmental early-warning management is presented as a test case.
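
The core of HTN planning, on which a composition module of this kind rests, is recursive task decomposition: compound tasks are replaced by the subtasks of an applicable method until only primitive, executable services remain. The tiny SHOP-style planner and notification-flavored domain below are a generic sketch under assumed state and operator representations, not HAUTO's actual code.

```python
def htn_plan(state, tasks, operators, methods):
    """SHOP-style depth-first HTN decomposition.

    operators: name -> (preconditions, add effects, delete effects), all sets
    methods:   name -> list of (preconditions, subtask list)
    Returns an ordered list of primitive tasks, or None if no plan exists."""
    if not tasks:
        return []
    head, rest = tasks[0], list(tasks[1:])
    if head in operators:                        # primitive task: apply it
        pre, add, delete = operators[head]
        if pre <= state:
            tail = htn_plan((state - delete) | add, rest, operators, methods)
            if tail is not None:
                return [head] + tail
        return None
    for pre, subtasks in methods.get(head, []):  # compound task: decompose
        if pre <= state:
            plan = htn_plan(state, list(subtasks) + rest, operators, methods)
            if plan is not None:
                return plan
    return None
```

The context-dependence the abstract highlights corresponds to method preconditions: the same compound task decomposes into different concrete services depending on the user's context captured in the state.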

  7. Automated pathologies detection in retina digital images based on complex continuous wavelet transform phase angles.

    Science.gov (United States)

    Lahmiri, Salim; Gargour, Christian S; Gabrea, Marcel

    2014-10-01

    An automated diagnosis system that uses complex continuous wavelet transform (CWT) to process retina digital images and support vector machines (SVMs) for classification purposes is presented. In particular, each retina image is transformed into two one-dimensional signals by concatenating image rows and columns separately. The mathematical norm of phase angles found in each one-dimensional signal at each level of CWT decomposition are relied on to characterise the texture of normal images against abnormal images affected by exudates, drusen and microaneurysms. The leave-one-out cross-validation method was adopted to conduct experiments and the results from the SVM show that the proposed approach gives better results than those obtained by other methods based on the correct classification rate, sensitivity and specificity.
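    The feature-extraction step described above can be sketched as follows. This is an assumption-laden reimplementation, not the authors' code: it concatenates image rows and columns into two one-dimensional signals, convolves each with a hand-rolled complex Morlet wavelet at a few hypothetical scales, and takes the norm of the resulting phase angles; the feature vector would then feed an SVM classifier.

```python
import numpy as np

def cmor_cwt_phase_features(img, scales=(2, 4, 8, 16)):
    """Norms of CWT phase angles for the row- and column-concatenated signals."""
    feats = []
    for sig in (img.reshape(-1), img.T.reshape(-1)):   # rows, then columns
        sig = sig.astype(float) - sig.mean()
        for s in scales:
            t = np.arange(-4 * s, 4 * s + 1)
            # complex Morlet: oscillation under a Gaussian envelope
            wavelet = np.exp(2j * np.pi * t / s) * np.exp(-t**2 / (2 * s**2))
            coeffs = np.convolve(sig, wavelet, mode="same") / np.sqrt(s)
            feats.append(np.linalg.norm(np.angle(coeffs)))
    return np.array(feats)

rng = np.random.default_rng(0)
image = rng.random((32, 32))         # stand-in for a retina image patch
features = cmor_cwt_phase_features(image)
print(features.shape)                # one feature per (signal, scale) pair
```

    With two signals and four scales this yields an 8-dimensional texture descriptor per image; the paper's actual decomposition levels and wavelet parameters may differ.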

  8. Generic Challenges and Automation Solutions in Manufacturing SMEs

    DEFF Research Database (Denmark)

    Grube Hansen, David; Malik, Ali Ahmad; Bilberg, Arne

    2017-01-01

    Ever more research is conducted on smart manufacturing, digital manufacturing and other aspects of what is expected from the fourth industrial revolution, known as Industry 4.0. Most Industry 4.0 research is currently a better fit for large corporations than for SMEs, which in Europe, however, represent 98% of the manufacturing industry. In general, SMEs produce high-mix low-volume products, which require a high degree of flexibility. Historically, flexibility and automation have been contradictory, but as automation becomes smarter, digitalized and less expensive, this may change. As part of the project, a correlation is identified between the challenges, age and size of the companies. The identified correlation lays the ground for an Industry 4.0 light concept targeting the identified generic challenges of companies employing 10-50 people. The solutions presented are based on cloud computing, Internet...

  9. State-based modeling of continuous human-integrated systems: An application to air traffic separation assurance

    International Nuclear Information System (INIS)

    Landry, Steven J.; Lagu, Amit; Kinnari, Jouko

    2010-01-01

    A method for modeling the safety of human-integrated systems that have continuous dynamics is introduced. The method is intended to supplement more detailed reliability-based methods. Assumptions for the model are defined such that the model is demonstrably complete, enabling it to yield a set of key agent characteristics. These key characteristics identify a sufficient set of characteristics that can be used to establish the safety of particular system configurations. The method is applied to the analysis of the safety of strategic and tactical separation assurance algorithms for the next generation air transportation system. It is shown that the key characteristics for this problem include the ability of agents (human or automated) to identify configurations that can enable intense transitions from a safe to an unsafe state. However, the most technologically advanced algorithm for separation assurance does not currently attempt to identify such configurations. It is also discussed how, although the model is in a form that lends itself to quantitative evaluations, such evaluations are complicated by the difficulty of accurately quantifying human error probabilities.

  10. PHOTOGRAMMETRY-BASED AUTOMATED MEASUREMENTS FOR TOOTH SHAPE AND OCCLUSION ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurements have always been regarded as time-consuming research, even more so with the use of new methods due to their wider opportunities. This is where automation becomes essential for the further development and implementation of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that was analysed to suggest recommendations for tooth preparation – one of the most demanding clinical procedures in prosthetic dentistry – within a comparatively short period of time. The original photogrammetric 3D reconstruction system makes it possible to generate 3D models of dental arches, reproduce their closure, or occlusion, and perform a set of standard measurements in automated mode.

  11. Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis

    Science.gov (United States)

    Knyaz, V. A.; Gaboutchian, A. V.

    2016-06-01

    Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurements have always been regarded as time-consuming research, even more so with the use of new methods due to their wider opportunities. This is where automation becomes essential for the further development and implementation of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that was analysed to suggest recommendations for tooth preparation – one of the most demanding clinical procedures in prosthetic dentistry – within a comparatively short period of time. The original photogrammetric 3D reconstruction system makes it possible to generate 3D models of dental arches, reproduce their closure, or occlusion, and perform a set of standard measurements in automated mode.

  12. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    Many studies have sought to detect infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travel. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.
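    The smooth linear acceleration profile mentioned above can be approximated by recomputing the step delay from a velocity that grows by a·dt on every step. The sketch below is a hypothetical model of such a driver, not the cited open library; all parameter values are invented.

```python
def ramp_delays(total_steps, start_rate, max_rate, accel):
    """Per-step delays (seconds) for a stepper motor ramping from start_rate
    to max_rate (steps/s) under a linear acceleration accel (steps/s^2)."""
    delays, rate = [], float(start_rate)
    for _ in range(total_steps):
        dt = 1.0 / rate                 # time spent on the current step
        delays.append(dt)
        rate = min(max_rate, rate + accel * dt)   # v += a * dt, capped
    return delays

# Hypothetical move: 300 steps, 100 -> 1000 steps/s at 2000 steps/s^2
delays = ramp_delays(total_steps=300, start_rate=100, max_rate=1000, accel=2000)
print(delays[0], delays[-1])   # ramps from 10 ms/step down to 1 ms/step
```

    A production firmware would also generate the mirror-image deceleration ramp before the target position and abort the ramp when a hard limit input trips.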

  13. Design of automation tools for management of descent traffic

    Science.gov (United States)

    Erzberger, Heinz; Nedell, William

    1988-01-01

    The design of an automated air traffic control system based on a hierarchy of advisory tools for controllers is described. Compatibility of the tools with the human controller, a key objective of the design, is achieved by a judicious selection of tasks to be automated and careful attention to the design of the controller system interface. The design comprises three interconnected subsystems referred to as the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. Each of these subsystems provides a collection of tools for specific controller positions and tasks. This paper focuses primarily on the Descent Advisor, which provides automation tools for managing descent traffic. The algorithms, automation modes, and graphical interfaces incorporated in the design are described. Information generated by the Descent Advisor tools is integrated into a plan-view traffic display presented on a high-resolution color monitor. Estimated arrival times of aircraft are presented graphically on a time line, which is also used interactively, in combination with a mouse input device, to select and schedule arrival times. Other graphical markers indicate the location of the fuel-optimum top-of-descent point and the predicted separation distances of aircraft at a designated time-control point. Computer-generated advisories provide speed and descent clearances which the controller can issue to aircraft to help them arrive at the feeder gate at the scheduled times or with specified separation distances. Two types of horizontal guidance modes, selectable by the controller, provide markers for managing the horizontal flightpaths of aircraft under various conditions. The entire system, consisting of the Descent Advisor algorithm, a library of aircraft performance models, national airspace system databases, and interactive display software, has been implemented on a workstation made by Sun Microsystems, Inc. It is planned to use this configuration in operational evaluations.

  14. Fingerprint separation: an application of ICA

    Science.gov (United States)

    Singh, Meenakshi; Singh, Deepak Kumar; Kalra, Prem Kumar

    2008-04-01

    Among all existing biometric techniques, fingerprint-based identification is the oldest method, which has been successfully used in numerous applications. Fingerprint-based identification is the most recognized tool in biometrics because of its reliability and accuracy. Fingerprint identification is done by matching questioned and known friction skin ridge impressions from fingers, palms, and toes to determine if the impressions are from the same finger (or palm, toe, etc.). There are many fingerprint matching algorithms which automate and facilitate the job of fingerprint matching, but for any of these algorithms matching can be difficult if the fingerprints are overlapped or mixed. In this paper, we have proposed a new algorithm for separating overlapped or mixed fingerprints so that the performance of the matching algorithms will improve when they are fed with these inputs. Independent Component Analysis (ICA) has been used as a tool to separate the overlapped or mixed fingerprints.
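    The blind-separation idea can be demonstrated on synthetic data. The sketch below is an illustrative reconstruction under stated assumptions, not the authors' method: two invented "fingerprint" textures are linearly mixed (standing in for overlapped prints), and a minimal numpy-only FastICA with a tanh nonlinearity and symmetric decorrelation recovers them up to permutation and sign.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical flattened fingerprint textures (real inputs would be
# pixel intensities of the overlapped impressions).
n = 4096
s1 = np.sign(np.sin(np.linspace(0, 60, n)))    # periodic, ridge-like
s2 = rng.laplace(size=n)                       # sharp, noise-like
S = np.vstack([s1, s2])

A = np.array([[0.7, 0.3],                      # unknown mixing: two observed
              [0.4, 0.6]])                     # overlapped images
X = A @ S

# Whitening
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA with tanh nonlinearity
W = rng.normal(size=(2, 2))
for _ in range(200):
    g = np.tanh(W @ Xw)
    W = g @ Xw.T / n - np.diag((1 - g ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                                 # decorrelate: (W W^T)^(-1/2) W

recovered = W @ Xw
corr = np.abs(np.corrcoef(np.vstack([S, recovered]))[:2, 2:])
match = corr.max(axis=1).min()                 # worst-case source recovery
print(round(float(match), 2))
```

    On real overlapped prints the mixing is not perfectly linear, so preprocessing and more than two observations are typically needed; the sketch only conveys the statistical-independence principle ICA exploits.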

  15. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
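    Method (1) reduces to a single amplitude factor per record pair. The worked example below uses hypothetical spectral ordinates (they are not taken from the study) to show the arithmetic:

```python
import numpy as np

def geomean_scale_factor(sa_x, sa_y, target_sa):
    """One amplitude factor applied to both horizontal components so the
    geometric mean of their spectral accelerations matches the target."""
    return target_sa / np.sqrt(sa_x * sa_y)

# Hypothetical 5%-damped spectral accelerations (in g) of one record pair
# at the scaling period, and a hypothetical target spectrum ordinate.
sa_x, sa_y, target = 0.30, 0.48, 0.45
k = geomean_scale_factor(sa_x, sa_y, target)
print(round(k, 3))   # -> 1.186
# after scaling, sqrt((k*sa_x) * (k*sa_y)) equals the target by construction
```

    The same factor k would multiply both acceleration time series; spectrum matching (method 2), by contrast, modifies the records frequency by frequency rather than by a single scalar.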

  16. Methods of physical experiment and installation automation on the base of computers

    International Nuclear Information System (INIS)

    Stupin, Yu.V.

    1983-01-01

    Peculiarities of using computers for the automation of physical experiments and installations are considered. Systems for data acquisition and processing based on microprocessors, micro- and mini-computers, CAMAC equipment and real-time operating systems are described, as well as systems intended for the automation of physical experiments on accelerators, laser thermonuclear fusion installations and installations for plasma investigation. The problems of multimachine complexes and multi-user systems, the development of automated systems for collective use, the arrangement of intermachine data exchange and the control of experimental databases are discussed. Data on software systems used for complex experimental data processing are presented. It is concluded that the application of new computers, combined with the new possibilities that universal operating systems provide to users, essentially increases the efficiency of a scientist's work.

  17. Automated procedure execution for space vehicle autonomous control

    Science.gov (United States)

    Broten, Thomas A.; Brown, David A.

    1990-01-01

    Increased operational autonomy and reduced operating costs have become critical design objectives in next-generation NASA and DoD space programs. The objective is to develop a semi-automated system for intelligent spacecraft operations support. The Spacecraft Operations and Anomaly Resolution System (SOARS) is presented as a standardized, model-based architecture for performing High-Level Tasking, Status Monitoring and automated Procedure Execution Control for a variety of spacecraft. The particular focus is on the Procedure Execution Control module. A hierarchical procedure network is proposed as the fundamental means for specifying and representing arbitrary operational procedures. A separate procedure interpreter controls automatic execution of the procedure, taking into account the current status of the spacecraft as maintained in an object-oriented spacecraft model.

  18. Hanford Ground-Water Data Base management guide

    International Nuclear Information System (INIS)

    Rieger, J.T.; Mitchell, P.J.; Muffett, D.M.; Fruland, R.M.; Moore, S.B.; Marshall, S.M.

    1990-02-01

    This guide describes the Hanford Ground-Water Data Base (HGWDB), a computerized data base used to store hydraulic head, sample analytical, temperature, geologic, and well-structure information for ground-water monitoring wells on the Hanford Site. These data are stored for the purpose of data retrieval for report generation and also for historical purposes. This guide is intended as an aid to the data base manager and the various staff authorized to enter and verify data, maintain the data base, and maintain the supporting software. This guide focuses on the structure of the HGWDB, providing a fairly detailed description of the programs, files, and parameters. Data-retrieval instructions for the general user of the HGWDB will be found in the HGWDB User's Manual. 6 figs

  19. Automation technology for aerospace power management

    Science.gov (United States)

    Larsen, R. L.

    1982-01-01

    The growing size and complexity of spacecraft power systems coupled with limited space/ground communications necessitate increasingly automated onboard control systems. Research in computer science, particularly artificial intelligence has developed methods and techniques for constructing man-machine systems with problem-solving expertise in limited domains which may contribute to the automation of power systems. Since these systems perform tasks which are typically performed by human experts they have become known as Expert Systems. A review of the current state of the art in expert systems technology is presented, and potential applications in power systems management are considered. It is concluded that expert systems appear to have significant potential for improving the productivity of operations personnel in aerospace applications, and in automating the control of many aerospace systems.

  20. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  1. Human-automation collaboration in manufacturing: identifying key implementation factors

    OpenAIRE

    Charalambous, George; Fletcher, Sarah; Webb, Philip

    2013-01-01

    Human-automation collaboration refers to the concept of human operators and intelligent automation working together interactively within the same workspace without conventional physical separation. This concept has commanded significant attention in manufacturing because of the potential applications, such as the installation of large sub-assemblies. However, the key human factors relevant to human-automation collaboration have not yet been fully investigated. To maximise effective implement...

  2. Automation of radioimmunoassays

    International Nuclear Information System (INIS)

    Goldie, D.J.; West, P.M.; Ismail, A.A.A.

    1979-01-01

    A short account is given of recent developments in automation of the RIA technique. Difficulties encountered in the incubation, separation and quantitation steps are summarized. Published references are given to a number of systems, both discrete and continuous flow, and details are given of a system developed by the present authors. (U.K.)

  3. Aviation System Capacity Program Terminal Area Productivity Project: Ground and Airborne Technologies

    Science.gov (United States)

    Giulianetti, Demo J.

    2001-01-01

    Ground and airborne technologies were developed in the Terminal Area Productivity (TAP) project for increasing throughput at major airports by safely maintaining good-weather operating capacity during bad weather. Methods were demonstrated for accurately predicting vortices to prevent wake-turbulence encounters and to reduce in-trail separation requirements for aircraft approaching the same runway for landing. Technology was demonstrated that safely enabled independent simultaneous approaches in poor weather conditions to parallel runways spaced less than 3,400 ft apart. Guidance, control, and situation-awareness systems were developed to reduce congestion in airport surface operations resulting from the increased throughput, particularly during night and instrument meteorological conditions (IMC). These systems decreased runway occupancy time by safely and smoothly decelerating the aircraft, increasing taxi speed, and safely steering the aircraft off the runway. Simulations were performed in which optimal trajectories were determined by air traffic control (ATC) and communicated to flight crews by means of Center TRACON Automation System/Flight Management System (CTAS/FMS) automation to reduce flight delays, increase throughput, and ensure flight safety.

  4. A LabVIEW based template for user created experiment automation.

    Science.gov (United States)

    Kim, D J; Fisk, Z

    2012-12-01

    We have developed an expandable software template to automate user-created experiments. The LabVIEW-based template is easily modified to combine user-created measurements, controls, and data logging with virtually any type of laboratory equipment. We use reentrant sequential selection to implement a sequence script, making it possible to wrap a long series of user-created experiments and execute them in sequence. Details of the software structure and application examples for a scanning probe microscope and automated transport experiments using custom-built laboratory electronics and a cryostat are described.

  5. SONG-China Project: A Global Automated Observation Network

    Science.gov (United States)

    Yang, Z. Z.; Lu, X. M.; Tian, J. F.; Zhuang, C. G.; Wang, K.; Deng, L. C.

    2017-09-01

    Driven by advancements in technology and scientific objectives, data acquisition in observational astronomy has changed greatly in recent years. Fully automated or even autonomous ground-based networks of telescopes have now become a trend for time-domain observational projects. The Stellar Observations Network Group (SONG) is an international collaboration with the participation and contribution of the Chinese astronomy community. The scientific goal of SONG is time-domain astrophysics such as asteroseismology and open cluster research. The SONG project aims to build a global network of 1 m telescopes equipped with high-precision and high-resolution spectrographs, and two-channel lucky-imaging cameras. It is the Chinese initiative to install a 50 cm binocular photometry telescope at each SONG node, sharing the network platform and infrastructure. This work focuses on the design and implementation, in technology and methodology, of SONG/50BiN, a typical ground-based network composed of multiple sites and a variety of instruments.

  6. Development of an automated asbestos counting software based on fluorescence microscopy.

    Science.gov (United States)

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
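    The core of fiber recognition, separating elongated objects from particles, can be sketched with connected-component labelling and a bounding-box aspect-ratio test. This is an illustrative toy, not the developed software; the aspect threshold and the synthetic image are invented.

```python
import numpy as np

def count_fibers(binary, min_aspect=3.0):
    """Count connected components whose bounding-box aspect ratio suggests
    a fiber (elongated) rather than a particle. 4-connectivity flood fill."""
    visited = np.zeros_like(binary, dtype=bool)
    h, w = binary.shape
    fibers = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not visited[i, j]:
                stack, pix = [(i, j)], []
                visited[i, j] = True
                while stack:                      # flood-fill one component
                    y, x = stack.pop()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys = [p[0] for p in pix]; xs = [p[1] for p in pix]
                length = max(max(ys) - min(ys), max(xs) - min(xs)) + 1
                width = min(max(ys) - min(ys), max(xs) - min(xs)) + 1
                if length / width >= min_aspect:  # elongated -> count as fiber
                    fibers += 1
    return fibers

img = np.zeros((20, 20), dtype=bool)
img[2, 2:14] = True        # horizontal fiber
img[5:15, 17] = True       # vertical fiber
img[10:13, 3:6] = True     # roundish particle: not counted
print(count_fibers(img))   # -> 2
```

    A real counter would also measure fiber length and width in calibrated units and handle touching or curved fibers, which a bounding-box test alone cannot.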

  7. Separation Systems Data Base: a users' manual

    International Nuclear Information System (INIS)

    Roddy, J.W.; McDowell, W.J.

    1978-11-01

    A data base designed specifically for the retrieval of information needed in chemical separations problems is described. Included are descriptions of the basic methods of searching and retrieving information from the data base, the procedure for entering records into the data base, a listing of additional references concerning the computer information process and an example of a typical record. The initial entries are concerned primarily with liquid-liquid extraction methods for metal ions and salts. However, the data base is constructed so that almost any separation process can be accommodated

  8. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in less time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
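    The spreadsheet-to-IP-XACT translation step can be sketched as a small generator. The sketch below is a hypothetical illustration: the column names are invented, and the XML is a bare-bones IP-XACT-style register list with namespaces and many mandatory IP-XACT elements omitted.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet export: name, offset, width, access per register.
SHEET = """name,offset,width,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

def sheet_to_ipxact(text):
    """Translate spreadsheet rows into an IP-XACT-style register list."""
    root = ET.Element("addressBlock")
    for row in csv.DictReader(io.StringIO(text)):
        reg = ET.SubElement(root, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["width"]
        ET.SubElement(reg, "access").text = row["access"]
    return ET.tostring(root, encoding="unicode")

xml_out = sheet_to_ipxact(SHEET)
print(xml_out)
```

    A conforming IP-XACT document would additionally carry the IEEE 1685 namespace, component and memory-map wrappers, and reset values; commercial register-model generators consume that richer description.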

  9. Automated Extraction of 3D Trees from Mobile LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    Y. Yu

    2014-06-01

    Full Text Available This paper presents an automated algorithm for extracting 3D trees directly from 3D mobile light detection and ranging (LiDAR) data. To reduce both computational and spatial complexity, ground points are first filtered out from a raw 3D point cloud via block-based elevation filtering. Off-ground points are then grouped into clusters representing individual objects through Euclidean distance clustering and voxel-based normalized cut segmentation. Finally, a model-driven method is proposed to achieve the extraction of 3D trees based on a pairwise 3D shape descriptor. The proposed algorithm is tested using a set of mobile LiDAR point clouds acquired by a RIEGL VMX-450 system. The results demonstrate the feasibility and effectiveness of the proposed algorithm.
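    The first stage, block-based elevation filtering, can be sketched as: grid the points in XY, find each cell's minimum elevation, and drop points close to that minimum. This is an illustrative reconstruction with invented cell size and height threshold, not the paper's implementation.

```python
import numpy as np

def filter_ground(points, cell=1.0, height_thresh=0.3):
    """Block-based elevation filtering: within each XY grid cell, points close
    to the cell's minimum elevation are labelled ground and removed."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    keys = ij[:, 0] * 100000 + ij[:, 1]     # cell id (assumes |index| < 50000)
    zmin = {}
    for k, z in zip(keys, points[:, 2]):
        zmin[k] = min(zmin.get(k, np.inf), z)
    off_ground = np.array(
        [z - zmin[k] > height_thresh for k, z in zip(keys, points[:, 2])]
    )
    return points[off_ground]

pts = np.array([
    [0.2, 0.3, 0.00],   # ground
    [0.4, 0.6, 0.05],   # ground
    [0.5, 0.5, 2.40],   # tree trunk point
    [3.1, 3.2, 0.10],   # ground in another cell
    [3.3, 3.4, 1.80],   # canopy point
])
print(filter_ground(pts))   # keeps only the two elevated points
```

    The surviving off-ground points would then go to Euclidean distance clustering and normalized-cut segmentation; on sloped terrain the per-cell minimum is what keeps this simple filter usable.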

  10. Feasibility evaluation of 3 automated cellular drug screening assays on a robotic workstation.

    Science.gov (United States)

    Soikkeli, Anne; Sempio, Cristina; Kaukonen, Ann Marie; Urtti, Arto; Hirvonen, Jouni; Yliperttula, Marjo

    2010-01-01

    This study presents the implementation and optimization of 3 cell-based assays on a TECAN Genesis workstation, namely the Caspase-Glo 3/7 and sulforhodamine B (SRB) screening assays and the mechanistic Caco-2 permeability protocol, and evaluates their feasibility for automation. During implementation, the dispensing speed used to add drug solutions and the fixative trichloroacetic acid, and the aspiration speed used to remove the supernatant immediately after fixation, were optimized. Decontamination steps for cleaning the tips and pipetting tubing were also added. The automated Caspase-Glo 3/7 screen was successfully optimized with Caco-2 cells (Z' 0.7, signal-to-background ratio [S/B] 1.7) but not with DU-145 cells. In contrast, the automated SRB screen was successfully optimized with the DU-145 cells (Z' 0.8, S/B 2.4) but not with the Caco-2 cells (Z' -0.8, S/B 1.4). The automated bidirectional Caco-2 permeability experiments successfully separated low- and high-permeability compounds (Z' 0.8, S/B 84.2) and passive drug permeation from efflux-mediated transport (Z' 0.5, S/B 8.6). Of the assays, the homogeneous Caspase-Glo 3/7 assay benefits the most from automation, but the heterogeneous SRB assay and the Caco-2 permeability experiments also gain advantages from automation.

  11. Miniaturized protein separation using a liquid chromatography column on a flexible substrate

    International Nuclear Information System (INIS)

    Yang Yongmo; Chae, Junseok

    2008-01-01

    We report a prototype protein separator that successfully miniaturizes existing technology for potential use in biocompatible health monitoring implants. The prototype is a liquid chromatography (LC) column (LC mini-column) fabricated on an inexpensive, flexible, biocompatible polydimethylsiloxane (PDMS) enclosure. The LC mini-column separates a mixture of proteins using size exclusion chromatography (SEC) with polydivinylbenzene beads (5–20 µm in diameter with 10 nm pore size). The LC mini-column is smaller than any commercially available LC column by a factor of ∼11 000 and successfully separates denatured and native protein mixtures at ∼71 psi of the applied fluidic pressure. Separated proteins are analyzed using NuPAGE-gel electrophoresis, high-performance liquid chromatography (HPLC) and an automated electrophoresis system. Quantitative HPLC results demonstrate successful separation based on intensity change: within 12 min, the intensity between large and small protein peaks changed by a factor of ∼20. In further evaluation using the automated electrophoresis system, the plate height of the LC mini-column is between 36 µm and 100 µm. The prototype LC mini-column shows the potential for real-time health monitoring in applications that require inexpensive, flexible implant technology that can function effectively under non-laboratory conditions

  12. Optimizing human-system interface automation design based on a skill-rule-knowledge framework

    International Nuclear Information System (INIS)

    Lin, Chiuhsiang Joe; Yenn, T.-C.; Yang, C.-W.

    2010-01-01

    This study considers the technological change that has occurred in complex systems within the past 30 years, and the role of human operators in controlling and interacting with complex systems following that change. Modernization of instrumentation and control systems and components leads to a new issue of human-automation interaction, in which human operational performance must be considered in automated systems. Human-automation interaction can differ in its types and levels. A system design issue is usually posed: given these technical capabilities, which system functions should be automated, and to what extent? A good automation design can be achieved by making an appropriate human-automation function allocation. To our knowledge, only a few studies have been published on how to achieve an appropriate automation design with a systematic procedure. Further, there is a surprising lack of information on examining and validating the influence of levels of automation (LOAs) on instrumentation and control systems in the advanced control room (ACR). The study presented in this paper proposes a systematic framework to help in making appropriate decisions on types of automation (TOAs) and LOAs, based on a 'Skill-Rule-Knowledge' (SRK) model. The evaluation results show that the use of either the automatic mode or the semiautomatic mode alone is insufficient to prevent human errors. To prevent the occurrence of human errors and to ensure safety in the ACR, the proposed framework can be valuable for making decisions on human-automation allocation.

  13. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    Science.gov (United States)

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per patch basis and 1.000 and 0.975, respectively, on a per image basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
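    The dual-classification idea can be sketched as two independent decisions, one per segmentation-derived feature vector, fused into a final verdict. The stand-in threshold "classifiers", feature names, and the both-must-agree fusion rule below are assumptions for illustration; the paper uses SVMs on 4-D morphology features.

```python
# Two independent classifiers, each fed by features from a different vessel
# segmentation, vote on whether an image patch contains abnormal new vessels.
# Thresholds, feature names, and the AND-fusion rule are illustrative only.

def classifier_a(features):            # stands in for SVM on vessel map A features
    return features["vessel_density"] > 0.4

def classifier_b(features):            # stands in for SVM on vessel map B features
    return features["tortuosity"] > 0.6

def dual_classify(features):
    """Flag a patch as pathological only when both classifiers agree."""
    return classifier_a(features) and classifier_b(features)

patch = {"vessel_density": 0.55, "tortuosity": 0.72}
print(dual_classify(patch))  # prints True
```

    Requiring agreement trades sensitivity for specificity; an OR-fusion rule would do the opposite. Which fusion the system actually uses is a design choice the abstract does not spell out.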

  14. Conceptual design of distillation-based hybrid separation processes.

    Science.gov (United States)

    Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang

    2013-01-01

    Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.

  15. Adaptive Automation Based on Air Traffic Controller Decision-Making

    NARCIS (Netherlands)

    IJtsma (Student TU Delft), Martijn; Borst, C.; Mercado Velasco, G.A.; Mulder, M.; van Paassen, M.M.; Tsang, P.S.; Vidulich, M.A.

    2017-01-01

    Through smart scheduling and triggering of automation support, adaptive automation has the potential to balance air traffic controller workload. The challenge in the design of adaptive automation systems is to decide how and when the automation should provide support. This paper describes the design

  16. Drilling Automation Demonstrations in Subsurface Exploration for Astrobiology

    Science.gov (United States)

    Glass, Brian; Cannon, H.; Lee, P.; Hanagud, S.; Davis, K.

    2006-01-01

    This project proposes to study subsurface permafrost microbial habitats at a relevant Arctic Mars-analog site (Haughton Crater, Devon Island, Canada) while developing and maturing the subsurface drilling and drilling automation technologies that will be required by post-2010 missions. It builds on earlier drilling technology projects to add permafrost and ice-drilling capabilities to 5 m with a lightweight drill that will be automatically monitored and controlled in-situ. Frozen cores obtained with this drill under sterilized protocols will be used in testing three hypotheses pertaining to near-surface physical geology and ground H2O ice distribution, viewed as a habitat for microbial life in subsurface ice and ice-consolidated sediments. Automation technologies employed will demonstrate hands-off diagnostics and drill control, using novel vibrational dynamical analysis methods and model-based reasoning to monitor and identify drilling fault states before and during faults. Three field deployments, to a Mars-analog site with frozen impact crater fallback breccia, will support science goals, provide a rigorous test of drilling automation and lightweight permafrost drilling, and leverage past experience with the field site's particular logistics.

  17. An open-source automated continuous condition-based maintenance platform for commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    2016-09-09

    This paper describes a reference process that can be deployed to provide continuous automated condition-based maintenance management for buildings that have BIM, a building automation system (BAS) and computerized maintenance management system (CMMS) software. The process can be deployed using VOLTTRON™, an open-source transactional network platform designed for distributed sensing and control that supports both energy efficiency and grid services.

  18. Automated synthesis of photovoltaic-quality colloidal quantum dots using separate nucleation and growth stages

    KAUST Repository

    Pan, Jun

    2013-11-26

    As colloidal quantum dot (CQD) optoelectronic devices continue to improve, interest grows in the scaled-up and automated synthesis of high-quality materials. Unfortunately, all reports of record-performance CQD photovoltaics have been based on small-scale batch syntheses. Here we report a strategy for flow reactor synthesis of PbS CQDs and prove that it leads to solar cells having performance similar to that of comparable batch-synthesized nanoparticles. Specifically, we find that only when using the dual-temperature-stage flow reactor synthesis reported herein are the CQDs of sufficient quality to achieve high performance. We use a kinetic model to explain and optimize the nucleation and growth processes in the reactor. Compared to conventional single-stage flow-synthesized CQDs, we achieve superior quality nanocrystals via the optimized dual-stage reactor, with high photoluminescence quantum yield (50%) and narrow full width at half-maximum. The dual-stage flow reactor approach, with its versatility and rapid screening of multiple parameters, combined with its efficient materials utilization, offers an attractive path to automated synthesis of CQDs for photovoltaics and, more broadly, active optoelectronics. © 2013 American Chemical Society.
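    The benefit of separating nucleation from growth can be illustrated with a toy kinetic model: a short hot stage fixes the particle number, then a cooler stage only grows the existing particles. Everything below (rate constants, stage durations, the forward-Euler scheme) is an illustrative assumption, not the authors' kinetic model.

```python
# Toy two-stage nucleation/growth model for a dual-temperature flow reactor.
# Stage 1 (hot): nucleation consumes monomer and sets the particle count.
# Stage 2 (cooler): no new nucleation; remaining monomer only grows particles.
# All parameters are made up for illustration.

def run_reactor(monomer=1.0, k_nuc=50.0, k_gro=0.5, t_nuc=0.1, t_gro=5.0, dt=0.01):
    n_particles, radius, t = 0.0, 1.0, 0.0
    while t < t_nuc:                               # hot stage: nucleation burst
        n_particles += k_nuc * monomer * dt
        monomer -= k_nuc * monomer * dt * 0.01     # small monomer consumption
        t += dt
    while t < t_nuc + t_gro:                       # cooler stage: growth only
        radius += k_gro * monomer * dt
        monomer = max(0.0, monomer - k_gro * monomer * dt * 0.01)
        t += dt
    return n_particles, radius, monomer

n, r, m = run_reactor()
print(n > 0 and r > 1.0 and m < 1.0)  # prints True
```

    Because no particles nucleate during the growth stage, all particles share nearly the same growth history, which is the mechanism behind the narrow size distribution the authors report.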

  19. An automated approach for segmentation of intravascular ultrasound images based on parametric active contour models

    International Nuclear Information System (INIS)

    Vard, Alireza; Jamshidi, Kamal; Movahhedinia, Naser

    2012-01-01

    This paper presents a fully automated approach to detect the intima and media-adventitia borders in intravascular ultrasound images based on parametric active contour models. To detect the intima border, we compute a new image feature applying a combination of short-term autocorrelations calculated for the contour pixels. These feature values are employed to define an energy function of the active contour called normalized cumulative short-term autocorrelation. Exploiting this energy function, the intima border is separated accurately from the blood region contaminated by high speckle noise. To extract media-adventitia boundary, we define a new form of energy function based on edge, texture and spring forces for the active contour. Utilizing this active contour, the media-adventitia border is identified correctly even in presence of branch openings and calcifications. Experimental results indicate accuracy of the proposed methods. In addition, statistical analysis demonstrates high conformity between manual tracing and the results obtained by the proposed approaches.
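    The intuition behind the autocorrelation-based energy term above is that speckle-dominated blood gives low short-term autocorrelation of pixel intensities along the contour, while tissue gives high autocorrelation. The sketch below computes a plain lag-1 autocorrelation on two toy intensity sequences; the window, lag, and normalization are illustrative assumptions, not the paper's exact formulation.

```python
# Lag-k autocorrelation of a 1-D intensity sequence, normalized to [-1, 1].
# Smooth (tissue-like) sequences score high; noisy (blood/speckle-like)
# sequences score low or negative. Toy data, illustrative only.

def short_term_autocorr(signal, lag=1):
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    if var == 0:
        return 0.0
    cov = sum((signal[i] - mean) * (signal[i + lag] - mean)
              for i in range(n - lag)) / n
    return cov / var

tissue = [10, 11, 12, 13, 14, 15, 16, 17]   # smooth intensities
speckle = [10, 2, 14, 1, 13, 3, 15, 2]      # noisy, blood-like intensities
print(short_term_autocorr(tissue) > short_term_autocorr(speckle))  # prints True
```

    An active-contour energy built on such a feature pulls the intima contour toward the boundary where autocorrelation jumps from the speckle regime to the tissue regime.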

  20. Management by Trajectory Trade Study of Roles and Responsibilities Between Participants and Automation Report

    Science.gov (United States)

    Fernandes, Alicia D.; Kaler, Curt; Leiden, Kenneth; Atkins, Stephen; Bell, Alan; Kilbourne, Todd; Evans, Mark

    2017-01-01

    This report describes a trade study of roles and responsibilities associated with the Management by Trajectory (MBT) concept. The MBT concept describes roles, responsibilities, and information and automation requirements for providing air traffic controllers and managers the ability to quickly generate, evaluate and implement changes to an aircraft's trajectory. In addition, the MBT concept describes mechanisms for imposing constraints on flight operator preferred trajectories only to the extent necessary to maintain safe and efficient traffic flows, and the concept provides a method for the exchange of trajectory information between ground automation systems and the aircraft that allows for trajectory synchronization and trajectory negotiation. The participant roles considered in this trade study include: airline dispatcher, flight crew, radar controller, traffic manager, and Air Traffic Control System Command Center (ATCSCC) traffic management specialists. The proposed allocation of roles and responsibilities was based on analysis of several use cases that were developed for this purpose as well as for walking through concept elements. The resulting allocation of roles and responsibilities reflects both increased automation capability to support many aviation functions, as well as increased flexibility to assign responsibilities to different participants - in many cases afforded by the increased automation capabilities. Note that the selection of participants to consider for allocation of each function is necessarily rooted in the current environment, in that MBT is envisioned as an evolution of the National Airspace System (NAS), and not a revolution. A key feature of the MBT allocations is a vision for the traffic management specialist to take on a greater role. This is facilitated by the vision that separation management functions, in addition to traffic management functions, will be carried out as trajectory management functions. 
This creates an opportunity

  1. PINE-SPARKY.2 for automated NMR-based protein structure research.

    Science.gov (United States)

    Lee, Woonghee; Markley, John L

    2018-05-01

    Nuclear magnetic resonance (NMR) spectroscopy, along with X-ray crystallography and cryoelectron microscopy, is one of the three major tools that enable the determination of atomic-level structural models of biological macromolecules. Of these, NMR has the unique ability to follow important processes in solution, including conformational changes, internal dynamics and protein-ligand interactions. As a means for facilitating the handling and analysis of spectra involved in these types of NMR studies, we have developed PINE-SPARKY.2, a software package that integrates and automates discrete tasks that previously required interaction with separate software packages. The graphical user interface of PINE-SPARKY.2 simplifies chemical shift assignment and verification, automated detection of secondary structural elements, predictions of flexibility and hydrophobic cores, and calculation of three-dimensional structural models. PINE-SPARKY.2 is available in the latest version of NMRFAM-SPARKY from the National Magnetic Resonance Facility at Madison (http://pine.nmrfam.wisc.edu/download_packages.html), the NMRbox Project (https://nmrbox.org) and to SBGrid subscribers (https://sbgrid.org). For a detailed description of the program, see http://www.nmrfam.wisc.edu/pine-sparky2.htm. Contact: whlee@nmrfam.wisc.edu or markley@nmrfam.wisc.edu. Supplementary data are available at Bioinformatics online.

  2. AUTOMATED FEATURE BASED TLS DATA REGISTRATION FOR 3D BUILDING MODELING

    OpenAIRE

    K. Kitamura; N. Kochi; S. Kaneko

    2012-01-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, the individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature based registration procedure. Our proposed method does not re...

  3. Foundation Investigation for Ground Based Radar Project-Kwajalein Island, Marshall Islands

    Science.gov (United States)

    1990-04-01

    Miscellaneous Paper GL-90-5: Foundation Investigation for Ground Based Radar Project -- Kwajalein Island, Marshall Islands, by Donald E. Yule. Results of the foundation investigation for the Ground Based Radar Project -- Kwajalein Island, Marshall Islands, are presented. Geophysical tests comprised of surface refraction

  4. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2004-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles, which entails integration of sample treatment and separation chemistries and radiometric detection within a single functional analytical instrument. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the area of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will

  5. Coordinated Ground-Based Observations and the New Horizons Fly-by of Pluto

    Science.gov (United States)

    Young, Eliot; Young, Leslie; Parker, Joel; Binzel, Richard

    2015-04-01

    The New Horizons (NH) spacecraft is scheduled to make its closest approach to Pluto on July 14, 2015. NH carries seven scientific instruments, including separate UV and Visible-IR spectrographs, a long-focal-length imager, two plasma-sensing instruments and a dust counter. There are three arenas in particular in which ground-based observations should augment the NH instrument suite in synergistic ways: IR spectra at wavelengths longer than 2.5 µm (i.e., longer than the NH Ralph spectrograph), stellar occultation observations near the time of the fly-by, and thermal surface maps and atmospheric CO abundances based on ALMA observations - we discuss the first two of these. IR spectra in the 3 - 5 µm range cover the CH4 absorption band near 3.3 µm. This band can be an important constraint on the state and areal extent of nitrogen frost on Pluto's surface. If this band depth is close to zero (as was observed by Olkin et al. 2007), it limits the area of nitrogen frost, which is bright at that wavelength. Combined with the NH observations of nitrogen frost at 2.15 µm, the ground-based spectra will determine how much nitrogen frost is diluted with methane, which is a basic constraint on the seasonal cycle of sublimation and condensation that takes place on Pluto (and similar objects like Triton and Eris). There is a fortuitous stellar occultation by Pluto on 29-JUN-2015, only two weeks before the NH closest approach. The occulted star will be the brightest ever observed in a Pluto event, about 2 magnitudes brighter than Pluto itself. The track of the event is predicted to cover parts of Australia and New Zealand. Thanks to HST and ground-based campaigns to find a TNO target reachable by NH, the position of the shadow path will be known at the +/-100 km level, allowing SOFIA and mobile ground-based observers to reliably cover the central flash region. Ground-based & SOFIA observations in visible and IR wavelengths will characterize the haze opacity and vertical

  6. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2003-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the area of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will provide the basis for designing effective instrumentation for radioanalytical process monitoring. Specific analytical targets include 99Tc, 90Sr and

  7. Magnetic separations in biotechnology.

    Science.gov (United States)

    Borlido, L; Azevedo, A M; Roque, A C A; Aires-Barros, M R

    2013-12-01

    Magnetic separations are probably one of the most versatile separation processes in biotechnology as they are able to purify cells, viruses, proteins and nucleic acids directly from crude samples. The fast and gentle process in combination with its easy scale-up and automation provide unique advantages over other separation techniques. In the midst of this process are the magnetic adsorbents tailored for the envisioned target and whose complex synthesis spans over multiple fields of science. In this context, this article reviews both the synthesis and tailoring of magnetic adsorbents for bioseparations as well as their ultimate application. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status.

  9. Orbital transfer vehicle launch operations study: Automated technology knowledge base, volume 4

    Science.gov (United States)

    1986-01-01

    A simplified retrieval strategy for compiling automation-related bibliographies from NASA/RECON is presented. Two subsets of NASA Thesaurus subject terms were extracted: a primary list, which is used to obtain an initial set of citations; and a secondary list, which is used to limit or further specify a large initial set of citations. These subject term lists are presented in Appendix A as the Automated Technology Knowledge Base (ATKB) Thesaurus.

  10. Possibilities for automating coal sampling

    Energy Technology Data Exchange (ETDEWEB)

    Helekal, J; Vankova, J

    1987-11-01

    Outlines sampling equipment in use (AVR-, AVP-, AVN- and AVK-series samplers and RDK- and RDH-series separators produced by the Coal Research Institute, Ostrava; extractors, crushers and separators produced by ORGREZ). The Ostrava equipment covers bituminous coal needs while ORGREZ provides equipment for energy coal requirements. This equipment is designed to handle coal up to 200 mm in size at a throughput of up to 1200 t/h. Automation of sampling equipment is foreseen.

  11. The Environmental Control and Life Support System (ECLSS) advanced automation project

    Science.gov (United States)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

    The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is to capture ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge based system tool community, and ensure proper visibility of our efforts.

  12. Digital coal mine integrated automation system based on Controlnet

    Energy Technology Data Exchange (ETDEWEB)

    Jin-yun Chen; Shen Zhang; Wei-ran Zuo [China University of Mining and Technology, Xuzhou (China). School of Chemical Engineering and Technology

    2007-06-15

    A three-layer model for digital communication in a mine is proposed. Two basic platforms are discussed: a uniform transmission network and a uniform data warehouse. An actual, ControlNet based, transmission network platform suitable for the Jining No.3 coal mine in China is presented. This network is an information superhighway intended to integrate all existing and new automation subsystems. Its standard interface can be used with future subsystems. The network, data structure and management decision-making all employ this uniform hardware and software. This effectively avoids the problems of system and information islands seen in traditional mine-automation systems. The construction of the network provides a stable foundation for digital communication in the Jining No.3 coal mine. 9 refs., 5 figs.

  13. Ground Robotic Hand Applications for the Space Program study (GRASP)

    Science.gov (United States)

    Grissom, William A.; Rafla, Nader I. (Editor)

    1992-01-01

    This document reports on a NASA-STDP effort to address research interests of the NASA Kennedy Space Center (KSC) through a study entitled, Ground Robotic-Hand Applications for the Space Program (GRASP). The primary objective of the GRASP study was to identify beneficial applications of specialized end-effectors and robotic hand devices for automating any ground operations which are performed at the Kennedy Space Center. Thus, operations for expendable vehicles, the Space Shuttle and its components, and all payloads were included in the study. Typical benefits of automating operations, or augmenting human operators performing physical tasks, include: reduced costs; enhanced safety and reliability; and reduced processing turnaround time.

  14. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  15. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  16. WIRELESS HOME AUTOMATION SYSTEM BASED ON MICROCONTROLLER

    Directory of Open Access Journals (Sweden)

    MUNA H. SALEH

    2017-11-01

    This paper presents the development of a Global System for Mobile Communications (GSM)-based controller for home air-conditioners as part of a home automation system. The main aim of the prototype is to reduce electricity wastage. A GSM module receives Short Message Service (SMS) messages from the user's mobile phone, which enable the controller to take further action such as switching the home air-conditioner ON or OFF. The system also controls the air-conditioner based on temperature readings from a sensor: at regular intervals, the temperature sensor sends its reading to the microcontroller unit (MCU) through ZigBee, and based on that reading the MCU sends an ON or OFF signal to the switch. Additionally, the system allows the user to operate or shut down the air-conditioner remotely through SMS.
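    The control logic described above reduces to a small decision rule: compare each periodic sensor reading with a set-point, and let an SMS command override the sensor. The sketch below illustrates that rule; the 25 °C threshold and function names are assumptions for illustration, not taken from the paper.

```python
# Illustrative MCU decision rule for the GSM/ZigBee air-conditioner controller.
# The set-point and the override convention are assumed for this sketch.

SET_POINT_C = 25.0

def mcu_decide(temperature_c, sms_override=None):
    """Return 'ON' or 'OFF' for the air-conditioner switch."""
    if sms_override in ("ON", "OFF"):     # user's SMS command wins over the sensor
        return sms_override
    return "ON" if temperature_c > SET_POINT_C else "OFF"

print(mcu_decide(28.4))            # hot room -> prints ON
print(mcu_decide(21.0))            # cool room -> prints OFF
print(mcu_decide(28.4, "OFF"))     # SMS override -> prints OFF
```

    On the actual hardware this rule would run in the MCU firmware loop, with the GSM module supplying `sms_override` and ZigBee supplying `temperature_c`.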

  17. Personal computer based home automation system

    OpenAIRE

    Hellmuth, George F.

    1993-01-01

    The systems engineering process is applied in the development of the preliminary design of a home automation communication protocol. The objective of the communication protocol is to provide a means for a personal computer to communicate with adapted appliances in the home. A needs analysis is used to ascertain that a need exists for a home automation system. Numerous design alternatives are suggested and evaluated to determine the best possible protocol design. Coaxial cable...

  18. The environmental control and life support system advanced automation project

    Science.gov (United States)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing its usage by key personnel.

  19. Android based security and home automation system

    OpenAIRE

    Khan, Sadeque Reza; Dristy, Farzana Sultana

    2015-01-01

    The smart mobile terminal operator platform Android is getting popular all over the world with its wide variety of applications and enormous use in numerous spheres of our daily life. Considering the increasing demand for home security and automation, an Android-based control system is presented in this paper, where the proposed system can maintain the security of the home main entrance and also the car door lock. Another important feature of the designed system is that it can control the o...

  20. Traveling around Cape Horn: Otolith chemistry reveals a mixed stock of Patagonian hoki with separate Atlantic and Pacific spawning grounds

    Science.gov (United States)

    Schuchert, P.C.; Arkhipkin, A.I.; Koenig, A.E.

    2010-01-01

    Trace element fingerprints of edge and core regions in otoliths from 260 specimens of Patagonian hoki, Macruronus magellanicus Lönnberg, 1907, were analyzed by LA-ICPMS to reveal whether this species forms one or more population units (stocks) in the Southern Oceans. Fish were caught on their spawning grounds in Chile and feeding grounds in Chile and the Falkland Islands. Univariate and multivariate analyses of trace element concentrations in the otolith edges, which relate to the adult life of fish, could not distinguish between Atlantic (Falkland) and Pacific (Chile) hoki. Cluster analyses of element concentrations in the otolith edges produced three different clusters in all sample areas indicating high mixture of the stocks. Cluster analysis of trace element concentrations in the otolith cores, relating to juvenile and larval life stages, produced two separate clusters mainly distinguished by 137Ba concentrations. The results suggest that Patagonian hoki is a highly mixed fish stock with at least two spawning grounds around South America. © 2009 Elsevier B.V.

  1. Enabling Advanced Automation in Spacecraft Operations with the Spacecraft Emergency Response System

    Science.gov (United States)

    Breed, Julie; Fox, Jeffrey A.; Powers, Edward I. (Technical Monitor)

    2001-01-01

    True autonomy is the Holy Grail of spacecraft mission operations. The goal of launching a satellite and letting it manage itself throughout its useful life is a worthy one. With true autonomy, the cost of mission operations would be reduced to a negligible amount. Under full autonomy, any problems (no matter the severity or type) that may arise with the spacecraft would be handled without any human intervention via some combination of smart sensors, on-board intelligence, and/or a smart automated ground system. Until the day that complete autonomy is practical and affordable to deploy, incremental steps of deploying ever-increasing levels of automation (computerization of once-manual tasks) on the ground and on the spacecraft are gradually decreasing the cost of mission operations. For example, NASA's Goddard Space Flight Center (NASA-GSFC) has been flying spacecraft with low-cost operations for several years. NASA-GSFC's SMEX (Small Explorer) and MIDEX (Middle Explorer) missions have effectively deployed significant amounts of automation to enable the missions to fly predominantly in 'lights-out' mode. Under lights-out operations the ground system is run without human intervention; various tools perform many of the tasks previously performed by the human operators. One of the major issues in reducing human staff in favor of automation is the perceived increase in the risk of losing data, or even losing a spacecraft, because of anomalous conditions that may occur when there is no one in the control center. When things go wrong, missions deploying advanced automation need to be sure that anomalous conditions are detected and that key personnel are notified in a timely manner so that on-call team members can react to those conditions. To ensure the health and safety of its lights-out missions, NASA-GSFC's Advanced Automation and Autonomy branch (Code 588) developed the Spacecraft Emergency Response System (SERS). 
The SERS is a Web-based collaborative environment that enables

  2. Automated dating of the world’s language families based on lexical similarity

    OpenAIRE

    Holman, E.; Brown, C.; Wichmann, S.; Müller, A.; Velupillai, V.; Hammarström, H.; Sauppe, S.; Jung, H.; Bakker, D.; Brown, P.; Belyaev, O.; Urban, M.; Mailhammer, R.; List, J.; Egorov, D.

    2011-01-01

    This paper describes a computerized alternative to glottochronology for estimating elapsed time since parent languages diverged into daughter languages. The method, developed by the Automated Similarity Judgment Program (ASJP) consortium, is different from glottochronology in four major respects: (1) it is automated and thus is more objective, (2) it applies a uniform analytical approach to a single database of worldwide languages, (3) it is based on lexical similarity as determined from Leve...

  3. Automated Bug Assignment: Ensemble-based Machine Learning in Large Scale Industrial Contexts

    OpenAIRE

    Jonsson, Leif; Borg, Markus; Broman, David; Sandahl, Kristian; Eldh, Sigrid; Runeson, Per

    2016-01-01

    Bug report assignment is an important part of software maintenance. In particular, incorrect assignments of bug reports to development teams can be very expensive in large software development projects. Several studies propose automating bug assignment techniques using machine learning in open source software contexts, but no study exists for large-scale proprietary projects in industry. The goal of this study is to evaluate automated bug assignment techniques that are based on machine learni...

  4. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift space distortions in the distribution of galaxies and hydrogen gas spanning redshifts from 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO with the lower risk and cost typical of a ground-based experiment.

  5. Becoming Earth Independent: Human-Automation-Robotics Integration Challenges for Future Space Exploration

    Science.gov (United States)

    Marquez, Jessica J.

    2016-01-01

    Future exploration missions will require NASA to integrate more automation and robotics in order to accomplish mission objectives. This presentation will describe the future challenges facing the human operator (astronaut, ground controllers) as the amount of automation and robotics in spaceflight operations increases. It will describe how future exploration missions will have to adapt and evolve in order to deal with more complex missions and communication latencies. This presentation will outline future human-automation-robotic integration challenges.

  6. Smart GSM Based Home Automation System

    OpenAIRE

    Teymourzadeh, Rozita

    2013-01-01

    This research work investigates the potential of ‘Full Home Control’, which is the aim of Home Automation Systems in the near future. The analysis and implementation of home automation technology using a Global System for Mobile Communication (GSM) modem to control home appliances such as lighting, air-conditioning, and security systems via Short Message Service (SMS) text messages is presented in this paper. The proposed research work is focused on the functionality of the GSM protocol, whic...

  7. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA).

    Science.gov (United States)

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder; Chong, Ving Ching

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with an accuracy higher than 80%. The purpose of the current study is to develop a fully-automated model, based on the otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. Short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant analysis (DA), as a classification technique, was used to train and test the model based on the extracted features. Results. Performance of the model was demonstrated using species from the three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% for all cases. In addition, effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. Short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model codes can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has the flexibility to be used for more species and families in future studies.
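The STFT-DA idea, windowed Fourier features of a contour signature fed to a classifier, can be illustrated with a minimal sketch. The window and hop sizes are arbitrary assumptions, and a nearest-centroid rule stands in for the paper's discriminant analysis:

```python
# Toy illustration of STFT-style feature extraction from a 1-D contour
# signature (e.g., otolith radius sampled around the outline), followed by
# a simple classifier. Window/hop/bin counts are illustrative, and
# nearest-centroid is a stand-in for the paper's discriminant analysis.
import cmath
import math

def stft_features(signal, win=16, hop=8, keep=4):
    """Magnitudes of the first `keep` DFT bins of each sliding window."""
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        for k in range(keep):
            coeff = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                        for n in range(win))
            feats.append(abs(coeff) / win)
    return feats

def nearest_centroid(train, labels, x):
    """Classify x by distance to per-class mean feature vectors."""
    groups = {}
    for f, y in zip(train, labels):
        groups.setdefault(y, []).append(f)
    means = {y: [sum(col) / len(col) for col in zip(*fs)]
             for y, fs in groups.items()}
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(means, key=lambda y: dist(means[y], x))
```

Two synthetic "contours" with different dominant frequencies produce clearly separable feature vectors under this scheme.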

  8. Ground-water contamination at Wurtsmith Air Force Base, Michigan

    Science.gov (United States)

    Stark, J.R.; Cummings, T.R.; Twenter, F.R.

    1983-01-01

    A sand and gravel aquifer of glacial origin underlies Wurtsmith Air Force Base in northeastern lower Michigan. The aquifer overlies a thick clay layer at an average depth of 65 feet. The water table is about 10 feet below land surface in the western part of the Base and about 25 feet below land surface in the eastern part. A ground-water divide cuts diagonally across the Base from northwest to southeast. South of the divide, ground water flows to the Au Sable River; north of the divide, it flows to Van Etten Creek and Van Etten Lake. Mathematical models were used to aid in calculating rates of groundwater flow. Rates range from about 0.8 feet per day in the eastern part of the Base to about 0.3 feet per day in the western part. Models also were used as an aid in making decisions regarding purging of contaminated water from the aquifer. In 1977, trichloroethylene was detected in the Air Force Base water-supply system. It had leaked from a buried storage tank near Building 43 in the southeastern part of the Base and moved northeastward under the influence of the natural ground-water gradient and the pumping of Base water-supply wells. In the most highly contaminated part of the plume, concentrations are greater than 1,000 micrograms per liter. Current purge pumping is removing some of the trichloroethylene, and seems to have arrested its eastward movement. Pumping of additional purge wells could increase the rate of removal. Trichloroethylene has also been detected in ground water in the vicinity of the Base alert apron, where a plume from an unknown source extends northeastward off Base. A smaller, less well-defined area of contamination also occurs just north of the larger plume. Trichloroethylene, identified near the waste-treatment plant, seepage lagoons, and the northern landfill area, is related to activities and operations in these areas. 
Dichloroethylene and trichloroethylene occur in significant quantities westward of Building 43, upgradient from the major

  9. Microfluidic separation of viruses from blood cells based on intrinsic transport processes.

    Science.gov (United States)

    Zhao, Chao; Cheng, Xuanhong

    2011-09-01

    Clinical analysis of acute viral infection in blood requires the separation of viral particles from blood cells, since the cytoplasmic enzyme inhibits the subsequent viral detection. To facilitate this procedure in settings without access to a centrifuge, we present a microfluidic device to continuously purify bionanoparticles from cells based on their different intrinsic movements on the microscale. In this device, a biological sample is layered on top of a physiological buffer, and both fluids are transported horizontally at the same flow rate in a straight channel under laminar flow. While the micron-sized particles such as cells sediment to the bottom layer with a predictable terminal velocity, the nanoparticles move vertically by diffusion. As their vertical travel distances have a different dependence on time, the micro- and nanoparticles can preferentially reside in the bottom and top layers respectively after a certain residence time, yielding purified viruses. We first performed numerical analysis to predict the particle separation and then tested the theory using suspensions of synthetic particles and biological samples. The experimental results using dilute synthetic particles closely matched the numerical analysis of a two-layer flow system containing different sized particles. Similar purification was achieved using diluted blood spiked with human immunodeficiency virus. However, viral purification in whole blood is compromised due to extensive bioparticle collisions. With the parallelization and automation potential offered by microfluidics, this device has the potential to function as an upstream sample preparation module to continuously provide cell-depleted bio-nanoparticles for downstream analysis.
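The separation principle, ballistic sedimentation for cells versus diffusive spreading for virions, can be checked with an order-of-magnitude calculation: settling distance grows as v·t while diffusion grows only as √t. All parameter values below (particle sizes, densities, residence time) are assumptions for illustration, not taken from the paper:

```python
import math

# Order-of-magnitude check of the separation principle described above:
# micron-sized cells settle at a Stokes terminal velocity (distance ~ v*t),
# while ~100 nm virions only spread diffusively (distance ~ sqrt(2*D*t)).
# All parameter values are illustrative assumptions.

G = 9.81                # m/s^2, gravitational acceleration
KB = 1.38e-23           # J/K, Boltzmann constant
T = 293.0               # K, room temperature
MU = 1.0e-3             # Pa*s, viscosity of a water-like buffer
RHO_FLUID = 1000.0      # kg/m^3
RHO_CELL = 1100.0       # kg/m^3, assumed blood-cell-like density

def stokes_settling_velocity(diameter_m, rho_particle):
    """Terminal velocity of a small sphere in a viscous fluid (Stokes law)."""
    return (rho_particle - RHO_FLUID) * G * diameter_m ** 2 / (18 * MU)

def diffusion_distance(diameter_m, t_s):
    """RMS 1-D diffusion distance after time t (Stokes-Einstein D)."""
    d_coeff = KB * T / (3 * math.pi * MU * diameter_m)
    return math.sqrt(2 * d_coeff * t_s)

t = 600.0  # s, assumed ten-minute residence time
cell_drop = stokes_settling_velocity(6e-6, RHO_CELL) * t   # ~6 um cell
virus_spread = diffusion_distance(100e-9, t)               # ~100 nm virion
# The cell settles on the order of a millimetre while the virion diffuses
# only tens of micrometres, so cells reach the bottom layer and viruses
# remain in the top layer.
```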

  10. Multisensor Equipped Uav/ugv for Automated Exploration

    Science.gov (United States)

    Batzdorfer, S.; Bobbe, M.; Becker, M.; Harms, H.; Bestmann, U.

    2017-08-01

    The usage of unmanned systems for exploring disaster scenarios has become more and more important in recent times as a supporting system for action forces. These systems have to offer a well-balanced relationship between the quality of support and additional workload. Therefore, within the joint research project ANKommEn (German acronym for Automated Navigation and Communication for Exploration) a system for exploration of disaster scenarios is built up using multiple UAVs and UGVs controlled via a central ground station. The ground station serves as the user interface for defining missions and tasks conducted by the unmanned systems, which are equipped with different environmental sensors such as cameras (RGB as well as IR) or LiDAR. Depending on the exploration task, results in the form of pictures, 2D stitched orthophotos or LiDAR point clouds will be transmitted via datalinks and displayed online at the ground station, or will be processed shortly after a mission, e.g. by 3D photogrammetry. For mission planning and its execution, UAV/UGV monitoring and georeferencing of environmental sensor data, reliable positioning and attitude information is required. This is gathered using an integrated GNSS/IMU positioning system. In order to increase the availability of positioning information in GNSS-challenged scenarios, a GNSS multi-constellation based approach is used, amongst others. The present paper focuses on the overall system design, including the ground station and the sensor setups on the UAVs and UGVs, the underlying positioning techniques, and 2D and 3D exploration based on an RGB camera mounted on board the UAV, evaluated in real-world field tests.

  12. “The Naming of Cats”: Automated Genre Classification

    Directory of Open Access Journals (Sweden)

    Yunhyong Kim

    2007-07-01

    Full Text Available This paper builds on the work presented at ECDL 2006 on automated genre classification as a step toward automating metadata extraction from digital documents for ingest into digital repositories such as those run by archives, libraries and eprint services (Kim & Ross, 2006b). We have previously proposed dividing the features of a document into five types (features for visual layout, language model features, stylometric features, features for semantic structure, and contextual features, i.e., an object's links to previously classified objects and other external sources) and have examined visual and language model features. The current paper compares results from testing classifiers based on image and stylometric features in a binary classification to show that certain genres have strong image features which enable effective separation of documents belonging to the genre from a large pool of other documents.

  13. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared on geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  14. Visual Servoing-Based Nanorobotic System for Automated Electrical Characterization of Nanotubes inside SEM.

    Science.gov (United States)

    Ding, Huiyang; Shi, Chaoyang; Ma, Li; Yang, Zhan; Wang, Mingyu; Wang, Yaqiong; Chen, Tao; Sun, Lining; Toshio, Fukuda

    2018-04-08

    The maneuvering and electrical characterization of nanotubes inside a scanning electron microscope (SEM) has historically been time-consuming and laborious for operators. Before the development of automated nanomanipulation-enabled techniques for pick-and-place and characterization of nano-objects, these functions were incomplete and largely performed manually. In this paper, a dual-probe nanomanipulation system with vision-based feedback was demonstrated to automatically perform 3D nanomanipulation tasks to investigate the electrical characterization of nanotubes. The XY-positions of Atomic Force Microscope (AFM) cantilevers and individual carbon nanotubes (CNTs) were precisely recognized via a series of image processing operations. A coarse-to-fine positioning strategy in the Z-direction was applied through the combination of a sharpness-based depth estimation method and a contact-detection method. The use of nanorobotic magnification-regulated speed aided in improving working efficiency and reliability. Additionally, we proposed automated alignment of the manipulator axes by visually tracking the movement trajectory of the end effector. The experimental results indicate the system's capability for automated measurement of the electrical characteristics of CNTs. Furthermore, the automated nanomanipulation system has the potential to be extended to other nanomanipulation tasks.

  15. An Automated System for Incubation of Pelagic Fish Eggs

    Directory of Open Access Journals (Sweden)

    Leif Jørgensen

    1987-01-01

    Full Text Available An automated system for incubation of pelagic fish eggs is described. The system has an internal air-driven water circulation which separates healthy eggs from dead or strongly infected eggs. A processor-controlled, pulsed water exchange provides a strongly reduced water requirement. The equipment also has automated temperature and salinity control and adjustment.

  16. Automated Individual Prescription of Exercise with an XML-based Expert System.

    Science.gov (United States)

    Jang, S; Park, S R; Jang, Y; Park, J; Yoon, Y; Park, S

    2005-01-01

    Continuously motivating people to exercise regularly is more important than identifying barriers such as lack of time, cost of equipment or gym membership, lack of nearby facilities, and poor weather or night-time lighting. Our proposed system provides practicable methods of motivation through a web-based exercise prescription service. Users are instructed to exercise according to their physical ability by means of an automated individual exercise prescription, checked and approved by a personal trainer or exercise specialist after the user has been tested with HIMS, a fitness assessment system. Furthermore, utilizing BIOFIT exercise prescriptions scheduled by an expert system can help users exercise systematically. Automated individual prescriptions are built as XML-based documents because the data need flexible, expansible, and convertible structures to process diverse exercise templates. The web-based exercise prescription service keeps users interested in exercise even if they live in many different environments.
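An XML-based prescription document of the kind described here might look like the following sketch, built with Python's standard `xml.etree.ElementTree`. The element and attribute names are hypothetical, since the record does not give the actual schema:

```python
# Hypothetical sketch of an XML-based exercise prescription document.
# Element/attribute names are illustrative; the paper's schema is not given.
import xml.etree.ElementTree as ET

def build_prescription(user_id, sessions):
    """Serialize a list of exercise sessions as an XML prescription."""
    root = ET.Element("prescription", attrib={"user": user_id})
    for s in sessions:
        item = ET.SubElement(root, "session", attrib={"day": s["day"]})
        ET.SubElement(item, "exercise").text = s["exercise"]
        ET.SubElement(item, "minutes").text = str(s["minutes"])
    return ET.tostring(root, encoding="unicode")

xml_doc = build_prescription("u001", [
    {"day": "Mon", "exercise": "treadmill walking", "minutes": 30},
])
# A flexible, expansible structure: new exercise templates can add elements
# without breaking existing consumers of the document.
parsed = ET.fromstring(xml_doc)
```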

  17. Automated Identification of Northern Leaf Blight-Infected Maize Plants from Field Imagery Using Deep Learning.

    Science.gov (United States)

    DeChant, Chad; Wiesner-Hanks, Tyr; Chen, Siyuan; Stewart, Ethan L; Yosinski, Jason; Gore, Michael A; Nelson, Rebecca J; Lipson, Hod

    2017-11-01

    Northern leaf blight (NLB) can cause severe yield loss in maize; however, scouting large areas to accurately diagnose the disease is time consuming and difficult. We demonstrate a system capable of automatically identifying NLB lesions in field-acquired images of maize plants with high reliability. This approach uses a computational pipeline of convolutional neural networks (CNNs) that addresses the challenges of limited data and the myriad irregularities that appear in images of field-grown plants. Several CNNs were trained to classify small regions of images as containing NLB lesions or not; their predictions were combined into separate heat maps, then fed into a final CNN trained to classify the entire image as containing diseased plants or not. The system achieved 96.7% accuracy on test set images not used in training. We suggest that such systems mounted on aerial- or ground-based vehicles can help in automated high-throughput plant phenotyping, precision breeding for disease resistance, and reduced pesticide use through targeted application across a variety of plant and disease categories.
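The two-stage heat-map pipeline can be sketched independently of any deep learning framework: a patch-level scorer fills a grid, and a second stage decides from that grid. Here a mean-intensity stub stands in for both CNNs; in the paper, both stages are trained convolutional networks:

```python
# Minimal sketch of the two-stage pipeline described above: a patch-level
# classifier scores small image regions, the scores form a heat map, and a
# second stage classifies the whole image from that map. The scorer here is
# a stub; the paper uses CNNs for both stages.

def heat_map(image, patch_scorer, patch=2):
    """Score every non-overlapping patch of a 2-D grayscale image."""
    rows = len(image) // patch
    cols = len(image[0]) // patch
    hm = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            region = [row[c * patch:(c + 1) * patch]
                      for row in image[r * patch:(r + 1) * patch]]
            hm[r][c] = patch_scorer(region)
    return hm

def image_decision(hm, threshold=0.5, min_hot=1):
    """Stage two (stub): flag the image if enough patches look lesioned."""
    hot = sum(1 for row in hm for v in row if v > threshold)
    return hot >= min_hot

# Stub scorer: mean patch intensity stands in for the lesion CNN.
def mean_score(region):
    return sum(sum(r) for r in region) / (len(region) * len(region[0]))
```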

  18. Separations systems data base: a users' manual. Revision I

    International Nuclear Information System (INIS)

    Roddy, J.W.; McDowell, W.J.

    1981-01-01

    A separations systems data base (SEPSYS), designed specifically for the retrieval of information needed in chemical separations problems (i.e., how to perform a given separation under given conditions), is described. Included are descriptions of the basic methods of searching and retrieving information from the data base, the procedure for entering records into the data base, a listing of additional references concerning the computer information process, and an example of a typical record. The initial entries are concerned primarily with liquid-liquid extraction and liquid-solid ion exchange methods for metal ions and salts; however, the data base is constructed so that almost any separation process can be accommodated. Each record is indexed with information provided under the following fields: author; title; publication source; date of publication; organization sponsoring the work; brief abstract of the work; abstract number if the work has been so referenced, and/or abstractor's initials; type of separation system used (e.g., flotation); specific or generic name of the separation agent used (e.g., acetylacetone); list of substances separated (e.g., gold, copper); qualitative description of the supporting medium or matrix containing the substances before separation (e.g., nitrate); type of literature where the record was printed (e.g., book); and type of information that the article contains. Each of these fields may be searched independently of the others (or in combination), and the last six fields contain specific key words that are listed in the report. Definitions are provided for the 36 information terms
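A SEPSYS-style record and the independent-or-combined field search described above can be sketched as follows; the field names and the example record are illustrative, not taken from the actual data base:

```python
# Hypothetical sketch of a SEPSYS-style record and per-field search, based on
# the fields listed above (author, title, separation system type, agent,
# substances separated, matrix, literature type). Field names and the sample
# record are illustrative assumptions.

RECORDS = [
    {
        "author": "Doe, J.",
        "title": "Extraction of gold from nitrate media",
        "year": 1980,
        "system_type": "liquid-liquid extraction",
        "agent": "acetylacetone",
        "substances": ["gold", "copper"],
        "matrix": "nitrate",
        "literature_type": "journal article",
    },
]

def search(records, **criteria):
    """Match fields independently or in combination, as SEPSYS allows."""
    def matches(rec):
        for field, wanted in criteria.items():
            value = rec.get(field)
            if isinstance(value, list):
                if wanted not in value:   # list fields: membership match
                    return False
            elif value != wanted:         # scalar fields: exact match
                return False
        return True
    return [r for r in records if matches(r)]
```

For example, `search(RECORDS, substances="gold", system_type="liquid-liquid extraction")` combines two fields in one query.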

  19. Space Station Initial Operational Concept (IOC) operations and safety view - Automation and robotics for Space Station

    Science.gov (United States)

    Bates, William V., Jr.

    1989-01-01

    The automation and robotics requirements for the Space Station Initial Operational Concept (IOC) are discussed. The number of tasks to be performed by an eight-person crew, the need for an automated or directed fault-analysis capability, and ground support requirements are considered. Issues important in determining the role of automation for the IOC are listed.

  20. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the results of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement

  1. Workload Capacity: A Response Time-Based Measure of Automation Dependence.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2016-05-01

    An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, C_OR(t) and C_AND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
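The capacity coefficient can be estimated from empirical RT distributions via cumulative hazard functions, H(t) = -ln S(t), where S(t) is the survivor function. The sketch below follows the standard OR form C_OR(t) = H_team(t) / (H_a(t) + H_b(t)); mapping the two baseline channels onto unaided-human and aid-alone RTs is our assumption about how the measure would be applied here, not a detail stated in the abstract:

```python
import math

# Sketch of the OR workload-capacity coefficient,
#   C_OR(t) = H_team(t) / (H_a(t) + H_b(t)),
# with H(t) = -ln S(t) estimated from empirical RT samples. Treating the
# two baselines as unaided-human and aid-alone conditions is an assumption.

def cumulative_hazard(rts, t):
    """H(t) = -ln(1 - F(t)) from the empirical survivor function."""
    survivors = sum(1 for rt in rts if rt > t)
    s = survivors / len(rts)
    if s <= 0.0:
        return float("inf")  # all responses finished by time t
    return -math.log(s)

def c_or(team_rts, a_rts, b_rts, t):
    """Capacity coefficient at time t; > 1 indicates super capacity."""
    denom = cumulative_hazard(a_rts, t) + cumulative_hazard(b_rts, t)
    if denom == 0.0:
        return float("nan")  # no baseline responses yet at time t
    return cumulative_hazard(team_rts, t) / denom
```

Values of C_OR(t) above 1 indicate that the aided team responds faster than an unlimited-capacity parallel combination of the baselines would predict.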

  2. A Ground Systems Template for Remote Sensing Systems

    Science.gov (United States)

    McClanahan, Timothy P.; Trombka, Jacob I.; Floyd, Samuel R.; Truskowski, Walter; Starr, Richard D.; Clark, Pamela E.; Evans, Larry G.

    2002-10-01

    Spaceborne remote sensing using gamma and X-ray spectrometers requires particular attention to the design and development of reliable systems. These systems must ensure the scientific requirements of the mission within the challenging technical constraints of operating instrumentation in space. The Near Earth Asteroid Rendezvous (NEAR) spacecraft included X-ray and gamma-ray spectrometers (XGRS), whose mission was to map the elemental chemistry of the 433 Eros asteroid. A remote sensing system template, similar to a blackboard systems approach used in artificial intelligence, was identified in which the spacecraft, instrument, and ground system was designed and developed to monitor and adapt to evolving mission requirements in a complicated operational setting. Systems were developed for ground tracking of instrument calibration, instrument health, data quality, orbital geometry, solar flux as well as models of the asteroid's surface characteristics, requiring an intensive human effort. In the future, missions such as the Autonomous Nano-Technology Swarm (ANTS) program will have to rely heavily on automation to collectively encounter and sample asteroids in the outer asteroid belt. Using similar instrumentation, ANTS will require information similar to data collected by the NEAR X-ray/Gamma-Ray Spectrometer (XGRS) ground system for science and operations management. The NEAR XGRS systems will be studied to identify the equivalent subsystems that may be automated for ANTS. The effort will also investigate the possibility of applying blackboard style approaches to automated decision making required for ANTS.

  4. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption of, and lowered investor interest in, green-certified buildings because of their higher initial costs. It is therefore essential to attract investors towards further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation in green building rating tools, an essential gap that motivates the development of an automated computerized programming tool. This paper presents proposed research that aims to develop an integrated, web-based automated programming tool that applies a green building rating assessment tool, green technology, and life cycle cost analysis. It also aims to identify the variables of MyCrest and LCC to be integrated and developed into a framework, then transformed into automated software. A mixed methodology of qualitative and quantitative surveys is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review enriches the understanding of Green Building Rating Tools (GBRT) integration with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.

  5. NextGen Technologies on the FAA's Standard Terminal Automation Replacement System

    Science.gov (United States)

    Witzberger, Kevin; Swenson, Harry; Martin, Lynne; Lin, Melody; Cheng, Jinn-Hwei

    2014-01-01

    This paper describes the integration, evaluation, and results from a high-fidelity human-in-the-loop (HITL) simulation of key NASA Air Traffic Management Technology Demonstration-1 (ATD-1) technologies implemented in an enhanced version of the FAA's Standard Terminal Automation Replacement System (STARS) platform. These ATD-1 technologies include: (1) a NASA-enhanced version of the FAA's Time-Based Flow Management, (2) a NASA ground-based automation technology known as controller-managed spacing (CMS), and (3) a NASA advanced avionics airborne technology known as flight-deck interval management (FIM). These ATD-1 technologies have been extensively tested in large-scale HITL simulations using general-purpose workstations to study air transportation technologies. These general-purpose workstations perform multiple functions and are collectively referred to as the Multi-Aircraft Control System (MACS). Researchers at NASA Ames Research Center and Raytheon collaborated to augment the STARS platform by including CMS and FIM advisory tools to validate the feasibility of integrating these automation enhancements into the current FAA automation infrastructure. NASA Ames acquired three STARS terminal controller workstations, and then integrated the ATD-1 technologies. HITL simulations were conducted to evaluate the ATD-1 technologies when using the STARS platform. These results were compared with the results obtained when the ATD-1 technologies were tested in the MACS environment. Results collected from the numerical data show acceptably minor differences, and, together with the subjective controller questionnaires showing a trend towards preferring STARS, validate the ATD-1/STARS integration.

  6. The Automated Threaded Fastening Based on On-line Identification

    Directory of Open Access Journals (Sweden)

    Nicolas Ivan Giannoccaro

    2008-11-01

    Full Text Available The principle of threaded fastenings has been known and used for decades for joining one component to another. Threaded fastenings are popular because they permit easy disassembly for maintenance, repair, relocation and recycling. Screw insertions are typically carried out manually, as the process is difficult to automate. As a result there is very little published research on automating threaded fastenings, and most research on automated assembly focuses on the peg-in-hole assembly problem. This paper investigates the problem of automated monitoring of the screw insertion process. The monitoring problem deals with predicting the integrity of a threaded insertion, based on the torque vs. insertion depth curve generated during the insertion. The authors have developed an analytical model to predict the torque signature signals during self-tapping screw insertions. However, the model requires parameters on the screw dimensions and plate material properties that are difficult to measure. This paper presents a study on on-line identification during screw fastenings. An identification methodology for estimating two unknown parameters during a self-tapping screw insertion process is presented. It is shown that the friction and screw properties required by the model can be reliably estimated on-line. Experimental results are presented to validate the identification procedure.
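The on-line identification the abstract describes can be sketched with recursive least squares (RLS). The quadratic torque-vs-depth model and all numeric values below are illustrative assumptions, not the authors' actual screw-insertion model:

```python
import numpy as np

# Hypothetical linear-in-parameters torque signature:
#   tau(d) = theta1 * d + theta2 * d**2
# theta1/theta2 stand in for the friction and screw properties
# the paper estimates on-line during insertion.

def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive least-squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    K = (P @ phi) / (lam + (phi.T @ P @ phi).item())   # gain vector
    err = y - float(phi.T @ theta.reshape(-1, 1))      # prediction error
    theta = theta + K.flatten() * err
    P = (P - K @ (phi.T @ P)) / lam
    return theta, P

# Synthetic insertion: recover the true parameters from torque-vs-depth samples.
true_theta = np.array([0.8, 0.05])
depths = np.linspace(0.1, 10.0, 200)
torques = true_theta[0] * depths + true_theta[1] * depths**2

theta, P = np.zeros(2), np.eye(2) * 1e3
for d, tau in zip(depths, torques):
    theta, P = rls_update(theta, P, np.array([d, d**2]), tau)
```

With noiseless synthetic data the estimate converges to the true parameter pair; in practice a forgetting factor lam < 1 would let the filter track slowly varying friction.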

  7. Automated 2D peptide separation on a 1D nano-LC-MS system

    DEFF Research Database (Denmark)

    Taylor, Paul; Nielsen, Peter A; Trelle, Morten Beck

    2009-01-01

    the on-line separation of highly complex peptide mixtures directly coupled with mass spectrometry-based identification. Here, we present a variation of the traditional MudPIT protocol, combining highly sensitive chromatography using a nanoflow liquid chromatography system (nano-LC) with a two...

  8. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades, further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared-memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system throughput can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.

  9. Implementation and design of a communication system of an agent-based automated substation

    Institute of Scientific and Technical Information of China (English)

    LIN Yong-jun; LIU Yu-tao; ZHANG Dan-hui

    2006-01-01

    A substation system requires that communication be transmitted reliably, accurately and in real time. Aimed at solving problems such as flow conflicts and sensitive data transmission, a model of the communication system of an agent-based automated substation is introduced. The running principle is discussed in detail, and each type of agent is discussed further. Finally, the realization of the agent system applied to the substation is presented. The outcome shows that the communication system of an agent-based automated substation improves the accuracy and reliability of data transfer and presents it in real time.

  10. An open trial of Acceptance-based Separated Family Treatment (ASFT) for adolescents with anorexia nervosa.

    Science.gov (United States)

    Timko, C Alix; Zucker, Nancy L; Herbert, James D; Rodriguez, Daniel; Merwin, Rhonda M

    2015-06-01

    Family-based treatments have the most empirical support in the treatment of adolescent anorexia nervosa; yet a significant percentage of adolescents and their families do not respond to manualized family-based treatment (FBT). The aim of this open trial was to conduct a preliminary evaluation of an innovative family-based approach to the treatment of anorexia: Acceptance-based Separated Family Treatment (ASFT). Treatment was grounded in Acceptance and Commitment Therapy (ACT), delivered in a separated format, and included an ACT-informed skills program. Adolescents (ages 12-18) with anorexia or sub-threshold anorexia and their families received 20 treatment sessions over 24 weeks. Outcome indices included eating disorder symptomatology reported by the parent and adolescent, percentage of expected body weight achieved, and changes in psychological acceptance/avoidance. Half of the adolescents (48.0%) met criteria for full remission at the end of treatment, 29.8% met criteria for partial remission, and 21.3% did not improve. Overall, adolescents had a significant reduction in eating disorder symptoms and reached expected body weight. Treatment resulted in changes in psychological acceptance in the expected direction for both parents and adolescents. This open trial provides preliminary evidence for the feasibility, acceptability, and efficacy of ASFT for adolescents with anorexia. Directions for future research are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices

    Science.gov (United States)

    2018-01-01

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario. PMID:29748468

  12. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices.

    Science.gov (United States)

    Muñoz, Sergio; Araque, Oscar; Sánchez-Rada, J Fernando; Iglesias, Carlos A

    2018-05-10

    The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.

  13. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices

    Directory of Open Access Journals (Sweden)

    Sergio Muñoz

    2018-05-01

    Full Text Available The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.
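The semantic event-driven rules these records describe can be reduced to a minimal condition/action sketch. The emotion labels, threshold, and office actions below are hypothetical, and a real deployment would express the rules in an ontology rather than Python lambdas:

```python
# Minimal event-driven rule engine sketch: each rule pairs a condition
# on an incoming emotion event with an office-adaptation action.
actions = []

rules = [
    (lambda e: e["emotion"] == "stress" and e["level"] > 0.7,
     lambda e: actions.append("dim_lights")),
    (lambda e: e["emotion"] == "fatigue",
     lambda e: actions.append("suggest_break")),
]

def handle(event):
    """Fire every rule whose condition matches the incoming event."""
    for condition, action in rules:
        if condition(event):
            action(event)

handle({"emotion": "stress", "level": 0.9})
handle({"emotion": "fatigue", "level": 0.2})
```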

  14. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    Science.gov (United States)

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economical, enhanced automated optical guidance system, based on optimization research of a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. The LED light target was optimized across several qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for computing and judging deflection. After multiple indoor experiments, the guidance system was applied in a hot-water pipeline installation project, with accuracy controlled within 2 mm over a 48-m distance, providing accurate line and grade control and verifying the feasibility and reliability of the guidance system.
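The deflection-detection step can be illustrated with a simple intensity-weighted centroid of the LED target. The calibration constant and the test frame are invented for this sketch and are not the paper's algorithm parameters:

```python
import numpy as np

MM_PER_PIXEL = 0.05   # assumed camera calibration constant (mm per pixel)

def deflection_mm(img):
    """Locate the LED target as the intensity-weighted centroid of the
    frame and return its (dx, dy) offset from the optical axis in mm."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    total = img.sum()
    cy = (yy * img).sum() / total
    cx = (xx * img).sum() / total
    return ((cx - (w - 1) / 2) * MM_PER_PIXEL,
            (cy - (h - 1) / 2) * MM_PER_PIXEL)

# A single bright spot 4 px right of centre reads as a 0.2 mm deflection.
frame = np.zeros((21, 21))
frame[10, 14] = 1.0
dx, dy = deflection_mm(frame)
```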

  15. A fully automated cell segmentation and morphometric parameter system for quantifying corneal endothelial cell morphology.

    Science.gov (United States)

    Al-Fahdawi, Shumoos; Qahwaji, Rami; Al-Waisy, Alaa S; Ipson, Stanley; Ferdousi, Maryam; Malik, Rayaz A; Brahma, Arun

    2018-07-01

    Corneal endothelial cell abnormalities may be associated with a number of corneal and systemic diseases. Damage to the endothelial cells can significantly affect corneal transparency by altering hydration of the corneal stroma, which can lead to irreversible endothelial cell pathology requiring corneal transplantation. To date, quantitative analysis of endothelial cell abnormalities has been performed manually by ophthalmologists using time-consuming and highly subjective semi-automatic tools, which require operator interaction. We developed and applied a fully-automated and real-time system, termed the Corneal Endothelium Analysis System (CEAS), for the segmentation and computation of endothelial cells in images of the human cornea obtained by in vivo corneal confocal microscopy. First, a Fast Fourier Transform (FFT) band-pass filter is applied to reduce noise and enhance the image quality to make the cells more visible. Secondly, endothelial cell boundaries are detected using watershed transformations and Voronoi tessellations to accurately quantify the morphological parameters of the human corneal endothelial cells. The performance of the automated segmentation system was tested against manually traced ground-truth images based on a database consisting of 40 corneal confocal endothelial cell images, in terms of segmentation accuracy and obtained clinical features. In addition, the robustness and efficiency of the proposed CEAS system were compared with manually obtained cell densities using a separate database of 40 images from controls (n = 11), obese subjects (n = 16) and patients with diabetes (n = 13). The Pearson correlation coefficient between automated and manual endothelial cell densities is 0.9, demonstrating the reliability of the CEAS system and the possibility of utilizing it in a real-world clinical setting to enable rapid diagnosis and patient follow-up, with an execution time of only 6 seconds per image. Copyright © 2018 Elsevier B.V. All rights reserved.
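The FFT band-pass preprocessing step can be sketched with NumPy alone; the cutoff radii below are arbitrary choices, and the real CEAS pipeline follows this filter with watershed/Voronoi segmentation:

```python
import numpy as np

def fft_bandpass(img, low, high):
    """Keep spatial frequencies whose radius (in cycles per image)
    lies in [low, high]; suppress everything else, including the DC term."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    radius = np.hypot(yy - h // 2, xx - w // 2)   # distance from DC bin
    mask = (radius >= low) & (radius <= high)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```

A constant image is wiped out (its only content is the DC term), while a texture whose frequency falls inside the band passes through unchanged.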

  16. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    International Nuclear Information System (INIS)

    Yoshida, Nozomu

    2008-01-01

    The engineer's common-sense assumption that the incident wave is common over a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, indicating that the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface as a whole (continuous analysis) differs from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and that the higher-order predominant periods do not match between the upper and lower ground in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which means that the conventional engineering seismic base layer does not behave as the term "base" implies. All these results indicate that waves that travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.
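The period mismatch between continuous and separate analyses can be illustrated with the quarter-wavelength approximation, T ≈ 4·Σ(Hᵢ/Vsᵢ): truncating the soil column at an assumed engineering base layer yields a different fundamental period than the full column. The layer thicknesses and shear-wave velocities below are invented for the sketch:

```python
# Quarter-wavelength sketch of the fundamental site period,
#   T ≈ 4 * sum(H_i / Vs_i),
# with purely illustrative layer properties.
def fundamental_period(layers):
    """layers: list of (thickness_m, vs_m_per_s) tuples, top down."""
    return 4.0 * sum(h / vs for h, vs in layers)

# Full column down to seismic bedrock (continuous analysis) ...
layers_full = [(20, 200), (30, 400), (50, 700)]
# ... versus a column truncated at an assumed engineering base layer
# (separate analysis of the upper ground only).
layers_upper = layers_full[:1]

T_full = fundamental_period(layers_full)
T_upper = fundamental_period(layers_upper)
```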

  17. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties based on several of our recent studies will be presented. The analyses include global cloud fractions and cloud macro-/micro-physical properties based on satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements over the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds over the Azores site; and the characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation/maintenance of mixed-phase clouds over the NSA site. Though the presentation covers a wide range of topics, we will focus on the representativeness of the ground-based measurements over different climate regions; evaluation of satellite-retrieved cloud properties using these ground-based measurements; and understanding the uncertainties of both satellite and ground-based retrievals and measurements.

  18. Shape based automated detection of pulmonary nodules with surface feature based false positive reduction

    International Nuclear Information System (INIS)

    Nomura, Y.; Itoh, H.; Masutani, Y.; Ohtomo, K.; Maeda, E.; Yoshikawa, T.; Hayashi, N.

    2007-01-01

    We proposed a shape based automated detection of pulmonary nodules with surface feature based false positive (FP) reduction. In the proposed system, the FP existing in internal of vessel bifurcation is removed using extracted surface of vessels and nodules. From the validation with 16 chest CT scans, we find that the proposed CAD system achieves 18.7 FPs/scan at 90% sensitivity, and 7.8 FPs/scan at 80% sensitivity. (orig.)

  19. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques: the proposed automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness starting at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references

  20. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.

  1. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  2. A new fully automated FTIR system for total column measurements of greenhouse gases

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany show that the instrument works well and can capture parts of the diurnal as well as the seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for a side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  3. New Trends in Agent-Based Complex Automated Negotiations

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Fatima, Shaheen; Matsuo, Tokuro

    2012-01-01

    Complex Automated Negotiations represent an important, emerging area in the field of Autonomous Agents and Multi-Agent Systems. Automated negotiations can be complex, since there are many factors that characterize such negotiations. These factors include the number of issues, dependencies between these issues, representation of utilities, the negotiation protocol, the number of parties in the negotiation (bilateral or multi-party), time constraints, etc. Software agents can support automation or simulation of such complex negotiations on behalf of their owners, and can provide them with efficient bargaining strategies. To realize such complex automated negotiations, we have to incorporate advanced Artificial Intelligence technologies, including search, CSP, graphical utility models, Bayes nets, auctions, utility graphs, and prediction and learning methods. Applications could include e-commerce tools, decision-making support tools, negotiation support tools, collaboration tools, etc. This book aims to pro...

  4. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    International Nuclear Information System (INIS)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Introduction: To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods: Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients. (author)
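A concrete automated edge definition consistent with the paper's proposal is to locate the edge where the penumbra profile crosses halfway between its bright and dark plateau levels. A sketch, with a synthetic linear penumbra as the assumed input:

```python
import numpy as np

def edge_position(x, intensity):
    """Locate the light-field edge of a monotonically falling penumbra
    profile as the half-intensity crossing, via linear interpolation."""
    half = (intensity.min() + intensity.max()) / 2.0
    i = int(np.argmax(intensity < half))             # first sample below half
    x0, x1 = x[i - 1], x[i]
    y0, y1 = intensity[i - 1], intensity[i]
    return x0 + (half - y0) * (x1 - x0) / (y1 - y0)  # interpolate the crossing

# Synthetic profile: bright plateau, linear penumbra from x=4 to x=6, dark field.
x = np.linspace(0.0, 10.0, 101)
profile = np.clip(1.0 - (x - 4.0) / 2.0, 0.0, 1.0)
edge = edge_position(x, profile)   # half-intensity crossing near x = 5
```

An automated sensor array measuring such a profile removes the inter-observer variability the study reports.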

  5. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Directory of Open Access Journals (Sweden)

    Márcio Bottaro

    Full Text Available Abstract Introduction To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer’s edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients.

  6. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    Energy Technology Data Exchange (ETDEWEB)

    Bottaro, Marcio; Nagy, Balazs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral, E-mail: marcio@iee.usp.br [Universidade de Sao Paulo (USP), SP (Brazil); Optics and Engineering Informatics, Budapest University of Technology and Economics, Budapest (Hungary)

    2017-04-15

    Introduction: To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods: Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naive human observers were asked to mark the light field edge according to their own determination. Different interpretations of the contrast were then calculated and compared. Results: In contrast to automated measurements of edge definition and detection, measurements by human observers showed large inter-observer variation independent of their training with X-ray equipment. Different contrast calculations considering the different edge definitions gave very different contrast values. Conclusion: As the main conclusion, we propose a more exact edge definition of the X-ray light field, corresponding well to the average human observer's edge determination. The new edge definition method with automated systems would reduce human variability in edge determination. Such errors could potentially affect the approval of X-ray equipment, and also increase the radiation dose. The automated measurement based on human observers’ edge definition and the corresponding contrast calculation may lead to a more precise light field calibration, which enables reduced irradiation doses on radiology patients. (author)

  7. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  8. Hanford ground-water data base management guide and user's manual

    International Nuclear Information System (INIS)

    Mitchell, P.J.; Argo, R.S.; Bradymire, S.L.; Newbill, C.A.

    1985-05-01

    This management guide and user's manual is a working document for the computerized Hanford Ground-water Data Base maintained by the Geosciences Research and Engineering Department at Pacific Northwest Laboratory for the Hanford Ground-Water Surveillance Program. The program is managed by the Occupational and Environmental Protection Department for the US Department of Energy. The data base is maintained to provide rapid access to data that are routinely collected from ground-water monitoring wells at the Hanford site. The data include water levels, sample analyses, geologic descriptions and well construction information for over 3000 existing or destroyed wells. These data are used to monitor water quality and for the evaluation of ground-water flow and pollutant transport problems. The management guide gives instructions for maintenance of the data base on the Digital Equipment Corporation PDP 11/70 computer using the CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) data base management software developed at Pacific Northwest Laboratory. Maintenance activities include inserting, modifying and deleting data, making back-up copies of the data base, and generating tables for annual monitoring reports. The user's guide includes instructions for running programs to retrieve the data in the form of listings or graphical plots. 3 refs

  9. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features (approximate entropy and cross-approximate entropy), which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform, for surgical skills assessment. We report average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
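The approximate entropy named in this record is a standard statistic (the Pincus formulation); the abstract does not give the study's parameter choices, so the embedding dimension m and tolerance r below are common defaults, not the paper's. A minimal sketch:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D series: lower values mean
    more regular, predictable fluctuations. Standard Pincus formulation;
    m and r defaults are conventional choices, not the paper's."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common default tolerance
    def phi(m):
        n = len(x) - m + 1
        # All length-m template vectors.
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 200))
noisy = rng.standard_normal(200)
# A periodic signal should score lower (more regular) than white noise.
print(approximate_entropy(regular) < approximate_entropy(noisy))  # True
```

This regularity-vs-unpredictability contrast is what lets the feature separate smooth expert motion from erratic novice motion in the study's time series.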

  10. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  11. Principle and Design of a Single-phase Inverter-Based Grounding System for Neutral-to-ground Voltage Compensation in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, Lingjie; Zeng, Xiangjun

    2017-01-01

    Neutral-to-ground overvoltage may occur in non-effectively grounded power systems because of distributed-parameter asymmetry and resonance between the Petersen coil and distributed capacitances. Thus, the constraint of neutral-to-ground voltage is critical for the safety of distribution networks. … In this paper, an active grounding system based on a single-phase inverter and its control parameter design method is proposed to achieve this objective. The relationship between its output current and the neutral-to-ground voltage is derived to explain the principle of neutral-to-ground voltage compensation. Then…

  12. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has aroused the urge to apply the FMEA methodology also to software-based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of the FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  13. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S [University of Nebraska Medical Center, Omaha, NE (United States)

    2016-06-15

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm{sup 2}. Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the necessity of redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from data logging, and the discrepancy between the automatic and manual measurement is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% when compared with the conventional manual approach. This work laid the groundwork for further improvement in the automation of Linac commissioning.
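The record describes a Matlab analysis of the logged spreadsheet but not its details. The core step, splitting a continuously logged trace into beam-on segments and normalising each field's integrated signal to a reference field, can be sketched as follows; the trace layout, background threshold, and per-segment logic here are assumptions for illustration, written in Python rather than Matlab:

```python
import numpy as np

def output_factors(readings, background=0.01, ref_index=0):
    """Split a continuously logged electrometer trace into beam-on
    segments and normalise each segment's integrated signal to a
    reference field. `readings` is the sequence of 0.5 s samples;
    the threshold and segmentation rule are illustrative assumptions."""
    readings = np.asarray(readings, dtype=float)
    on = readings > background            # beam-on samples
    # Starts of contiguous beam-on runs (each run is one field).
    edges = np.flatnonzero(np.diff(on.astype(int)) == 1) + 1
    starts = ([0] if on[0] else []) + list(edges)
    sums = []
    for s in starts:
        e = s
        while e < len(on) and on[e]:
            e += 1
        sums.append(readings[s:e].sum())  # integrated charge per field
    sums = np.array(sums)
    return sums / sums[ref_index]         # output factors vs reference

# Three simulated fields separated by beam-off gaps.
trace = [0, 0, 1.0, 1.0, 0, 0, 1.2, 1.2, 0, 0.8, 0.8, 0]
print(output_factors(trace))  # → [1.  1.2 0.8]
```

Because the fields are delivered back-to-back from the R&V queue, this segmentation is what replaces the per-field manual readout in the conventional approach.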

  14. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    International Nuclear Information System (INIS)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S

    2016-01-01

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the Excel data to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations, including open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm². Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes when using the manual approach. The automation avoided the necessity of redundant Linac status checks between fields as in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from data logging, and the discrepancy between the automatic and manual measurement is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save time by 40% when compared with the conventional manual approach. This work laid the groundwork for further improvement in the automation of Linac commissioning.

  15. Automation of the Work intensively based on Knowledge, a Challenge for the New Technologies

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2011-06-01

    Full Text Available Knowledge Management or knowledge-based management (noted and used throughout this paper as KM is defined as a collaborative practice, by which organizations deliberately and intelligibly create, organize, distribute and analyze their own knowledge, in terms of resources, documents and people’s skills. It is widely regarded as an internal tool for increasing the operational efficiency of any organization, and has the potential to revolutionize the intelligent interaction between humans and (intelligent) agents, based on more and more advanced technology. Semantic Technologies (STs are distributed software technologies that make that meaning more explicit, principally so that it can be understood by computers. STs will dramatically impact enterprise architecture and the engineering of new system and infrastructure capabilities. They are tools that represent meanings, associations, theories, and know-how about the uses of things separately from data and knowledge, using reasoning algorithms. Time restrictions are not excessive in usual STs as distributed applications. Critical time reasoning problems may occur in case of faulty operations and overloading. At present, the reasoning depth developed for such systems is still poor. This work represents research results for incorporating and considering appropriate semantic foundations in future technologies that can automate knowledge-based work.

  16. Nanoparticle-based assays in automated flow systems: A review

    Energy Technology Data Exchange (ETDEWEB)

    Passos, Marieta L.C. [LAQV, REQUIMTE, Departamento de Ciências Químicas, Faculdade de Farmácia, Universidade do Porto, Rua Jorge Viterbo Ferreira, n° 228, 4050-313 Porto (Portugal); Pinto, Paula C.A.G., E-mail: ppinto@ff.up.pt [LAQV, REQUIMTE, Departamento de Ciências Químicas, Faculdade de Farmácia, Universidade do Porto, Rua Jorge Viterbo Ferreira, n° 228, 4050-313 Porto (Portugal); Santos, João L.M., E-mail: joaolms@ff.up.pt [LAQV, REQUIMTE, Departamento de Ciências Químicas, Faculdade de Farmácia, Universidade do Porto, Rua Jorge Viterbo Ferreira, n° 228, 4050-313 Porto (Portugal); Saraiva, M. Lúcia M.F.S., E-mail: lsaraiva@ff.up.pt [LAQV, REQUIMTE, Departamento de Ciências Químicas, Faculdade de Farmácia, Universidade do Porto, Rua Jorge Viterbo Ferreira, n° 228, 4050-313 Porto (Portugal); Araujo, André R.T.S. [LAQV, REQUIMTE, Departamento de Ciências Químicas, Faculdade de Farmácia, Universidade do Porto, Rua Jorge Viterbo Ferreira, n° 228, 4050-313 Porto (Portugal); Unidade de Investigação para o Desenvolvimento do Interior, Instituto Politécnico da Guarda, Av. Dr. Francisco de Sá Carneiro, n° 50, 6300-559 Guarda (Portugal)

    2015-08-19

    Nanoparticles (NPs) exhibit a number of distinctive and entrancing properties that explain their ever increasing application in analytical chemistry, mainly as chemosensors, signaling tags, catalysts, analytical signal enhancers, reactive species generators, analyte recognition and scavenging/separation entities. The prospect of associating NPs with automated flow-based analytical systems is undoubtedly a challenging perspective as it would permit confined, cost-effective and reliable analysis, within a shorter timeframe, while exploiting the features of NPs. This article aims at examining the state of the art on continuous flow analysis and microfluidic approaches involving NPs such as noble metals (gold and silver), magnetic materials, carbon, silica or quantum dots. Emphasis is devoted to NP format, main practical achievements and fields of application. In this context, the functionalization of NPs with distinct chemical species and ligands is debated in what concerns the motivations and strengths of developed approaches. The utilization of NPs to improve detector performance in electrochemical applications is out of the scope of this review. The works discussed in this review were published between 2000 and 2013. - Highlights: • The state of the art of flowing stream systems comprising NPs was reviewed. • The use of different types of nanoparticles in each flow technique is discussed. • The most expressive and profitable applications are summarized. • The main conclusions and future perspectives were compiled in the final section.

  17. Nanoparticle-based assays in automated flow systems: A review

    International Nuclear Information System (INIS)

    Passos, Marieta L.C.; Pinto, Paula C.A.G.; Santos, João L.M.; Saraiva, M. Lúcia M.F.S.; Araujo, André R.T.S.

    2015-01-01

    Nanoparticles (NPs) exhibit a number of distinctive and entrancing properties that explain their ever increasing application in analytical chemistry, mainly as chemosensors, signaling tags, catalysts, analytical signal enhancers, reactive species generators, analyte recognition and scavenging/separation entities. The prospect of associating NPs with automated flow-based analytical systems is undoubtedly a challenging perspective as it would permit confined, cost-effective and reliable analysis, within a shorter timeframe, while exploiting the features of NPs. This article aims at examining the state of the art on continuous flow analysis and microfluidic approaches involving NPs such as noble metals (gold and silver), magnetic materials, carbon, silica or quantum dots. Emphasis is devoted to NP format, main practical achievements and fields of application. In this context, the functionalization of NPs with distinct chemical species and ligands is debated in what concerns the motivations and strengths of developed approaches. The utilization of NPs to improve detector performance in electrochemical applications is out of the scope of this review. The works discussed in this review were published between 2000 and 2013. - Highlights: • The state of the art of flowing stream systems comprising NPs was reviewed. • The use of different types of nanoparticles in each flow technique is discussed. • The most expressive and profitable applications are summarized. • The main conclusions and future perspectives were compiled in the final section

  18. Comparison of manual versus automated data collection method for an evidence-based nursing practice study.

    Science.gov (United States)

    Byrne, M D; Jordan, T R; Welle, T

    2013-01-01

    The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 "false negative" patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Automated data collection for analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare.

  19. A report on SHARP (Spacecraft Health Automated Reasoning Prototype) and the Voyager Neptune encounter

    Science.gov (United States)

    Martin, R. G. (Editor); Atkinson, D. J.; James, M. L.; Lawson, D. L.; Porta, H. J.

    1990-01-01

    The development and application of the Spacecraft Health Automated Reasoning Prototype (SHARP) for the operations of the telecommunications systems and link analysis functions in Voyager mission operations are presented. An overview is provided of the design and functional description of the SHARP system as it was applied to Voyager. Some of the current problems and motivations for automation in real-time mission operations are discussed, as are the specific solutions that SHARP provides. The application of SHARP to Voyager telecommunications had the goal of being a proof-of-capability demonstration of artificial intelligence as applied to the problem of real-time monitoring functions in planetary mission operations. As part of achieving this central goal, the SHARP application effort was also required to address the issue of the design of an appropriate software system architecture for a ground-based, highly automated spacecraft monitoring system for mission operations, including methods for: (1) embedding a knowledge-based expert system for fault detection, isolation, and recovery within this architecture; (2) acquiring, managing, and fusing the multiple sources of information used by operations personnel; and (3) providing information-rich displays to human operators who need to exercise the capabilities of the automated system. In this regard, SHARP has provided an excellent example of how advanced artificial intelligence techniques can be smoothly integrated with a variety of conventionally programmed software modules, as well as guidance and solutions for many questions about automation in mission operations.

  20. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  1. Quantitative grading of store separation trajectories

    CSIR Research Space (South Africa)

    Jamison, Kevin A

    2017-09-01

    Full Text Available. This paper describes the development of an automated analysis process and software that can run a multitude of separation scenarios. A key enabler for this software is the development of a quantitative grading algorithm that scores the outcome of each release...

  2. Automated Functional Testing based on the Navigation of Web Applications

    Directory of Open Access Journals (Sweden)

    Boni García

    2011-08-01

    Full Text Available Web applications are becoming more and more complex. Testing such applications is an intricate, hard and time-consuming activity. Therefore, testing is often poorly performed or skipped by practitioners. Test automation can help to avoid this situation. Hence, this paper presents a novel approach to perform automated software testing for web applications based on their navigation. On the one hand, web navigation is the process of traversing a web application using a browser. On the other hand, functional requirements are actions that an application must do. Therefore, the evaluation of the correct navigation of web applications results in the assessment of the specified functional requirements. The proposed method to perform the automation is done in four levels: test case generation, test data derivation, test case execution, and test case reporting. This method is driven by three kinds of inputs: (i) UML models; (ii) Selenium scripts; (iii) XML files. We have implemented our approach in an open-source testing framework named Automatic Testing Platform. The validation of this work has been carried out by means of a case study, in which the target is a real invoice management system developed using a model-driven approach.

  3. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    Science.gov (United States)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  4. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.

  5. Automated detection of a prostate Ni-Ti stent in electronic portal images.

    Science.gov (United States)

    Carl, Jesper; Nielsen, Henning; Nielsen, Jane; Lund, Bente; Larsen, Erik Hoejkjaer

    2006-12-01

    Planning target volumes (PTV) in fractionated radiotherapy still have to be outlined with wide margins to the clinical target volume due to uncertainties arising from daily shift of the prostate position. A recently proposed new method of visualization of the prostate is based on insertion of a thermo-expandable Ni-Ti stent. The current study proposes a new detection algorithm for automated detection of the Ni-Ti stent in electronic portal images. The algorithm is based on the Ni-Ti stent having a cylindrical shape with a fixed diameter, which was used as the basis for an automated detection algorithm. The automated method uses enhancement of lines combined with a grayscale morphology operation that looks for enhanced pixels separated with a distance similar to the diameter of the stent. The images in this study are all from prostate cancer patients treated with radiotherapy in a previous study. Images of a stent inserted in a humanoid phantom demonstrated a localization accuracy of 0.4-0.7 mm which equals the pixel size in the image. The automated detection of the stent was compared to manual detection in 71 pairs of orthogonal images taken in nine patients. The algorithm was successful in 67 of 71 pairs of images. The method is fast, has a high success rate, good accuracy, and has a potential for unsupervised localization of the prostate before radiotherapy, which would enable automated repositioning before treatment and allow for the use of very tight PTV margins.
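The record's detection step, looking for enhanced pixels separated by a distance equal to the stent diameter, is a grey-scale erosion with a two-point structuring element. The following 1-D toy, written as a hedged illustration rather than the paper's full 2-D pipeline (which also includes line enhancement), shows why only a matched pair of edges at the right spacing produces a response:

```python
import numpy as np

def paired_edge_score(profile, diameter):
    """Score each pixel by the weaker of two ridge responses `diameter`
    pixels apart: equivalent to grey-scale erosion with a two-point
    structuring element. A cylindrical stent of fixed diameter projects
    as two bright parallel edges at that separation, so only positions
    midway between a matched pair score highly."""
    profile = np.asarray(profile, dtype=float)
    score = np.zeros_like(profile)
    half = diameter // 2
    for i in range(half, len(profile) - half):
        # min() keeps the score low unless BOTH walls are present.
        score[i] = min(profile[i - half], profile[i + half])
    return score

# Toy line-enhanced profile: stent walls at indices 10 and 14 (4 apart),
# plus one isolated ridge that should not trigger a response.
profile = np.zeros(20)
profile[10] = profile[14] = 5.0
profile[3] = 5.0
score = paired_edge_score(profile, diameter=4)
print(int(np.argmax(score)))  # → 12, midway between the stent walls
```

The isolated ridge at index 3 scores zero because it has no partner at the stent spacing, which is how the method suppresses bones and other single edges in the portal image.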

  6. Automated detection of a prostate Ni-Ti stent in electronic portal images

    International Nuclear Information System (INIS)

    Carl, Jesper; Nielsen, Henning; Nielsen, Jane; Lund, Bente; Larsen, Erik Hoejkjaer

    2006-01-01

    Planning target volumes (PTV) in fractionated radiotherapy still have to be outlined with wide margins to the clinical target volume due to uncertainties arising from daily shift of the prostate position. A recently proposed new method of visualization of the prostate is based on insertion of a thermo-expandable Ni-Ti stent. The current study proposes a new detection algorithm for automated detection of the Ni-Ti stent in electronic portal images. The algorithm is based on the Ni-Ti stent having a cylindrical shape with a fixed diameter, which was used as the basis for an automated detection algorithm. The automated method uses enhancement of lines combined with a grayscale morphology operation that looks for enhanced pixels separated with a distance similar to the diameter of the stent. The images in this study are all from prostate cancer patients treated with radiotherapy in a previous study. Images of a stent inserted in a humanoid phantom demonstrated a localization accuracy of 0.4-0.7 mm which equals the pixel size in the image. The automated detection of the stent was compared to manual detection in 71 pairs of orthogonal images taken in nine patients. The algorithm was successful in 67 of 71 pairs of images. The method is fast, has a high success rate, good accuracy, and has a potential for unsupervised localization of the prostate before radiotherapy, which would enable automated repositioning before treatment and allow for the use of very tight PTV margins

  7. The NASA automation and robotics technology program

    Science.gov (United States)

    Holcomb, Lee B.; Montemerlo, Melvin D.

    1986-01-01

    The development and objectives of the NASA automation and robotics technology program are reviewed. The objectives of the program are to utilize AI and robotics to increase the probability of mission success; decrease the cost of ground control; and increase the capability and flexibility of space operations. There is a need for real-time computational capability; an effective man-machine interface; and techniques to validate automated systems. Current programs in the areas of sensing and perception, task planning and reasoning, control execution, operator interface, and system architecture and integration are described. Programs aimed at demonstrating the capabilities of telerobotics and system autonomy are discussed.

  8. Separation by electrostatic equipments; Separacion por medios electrostaticos

    Energy Technology Data Exchange (ETDEWEB)

    Miguel, R.; Larrauri, E.; Arnaiz, S.; Cacho, S.; Robertson, C.; Smallwood, J.; Coilt, J.; Ufer, R.; Kohnlecher, R.

    2000-07-01

    Development of automated separation technologies is essential for increasing recovery rates, particularly from highly mixed sources such as municipal solid waste and waste from electric and electronic equipment, and for reducing recycling costs. This context led the GAIKER Technological Centre to look for new technologies for recovering materials such as metals, plastics and paper from those waste sources. Electrostatic separation technology has been successfully applied to separate these materials, helping to meet the targets specified by legislation. (Author)

  9. Towards cooperative guidance and control of highly automated vehicles: H-Mode and Conduct-by-Wire.

    Science.gov (United States)

    Flemisch, Frank Ole; Bengler, Klaus; Bubb, Heiner; Winner, Hermann; Bruder, Ralph

    2014-01-01

    This article provides a general ergonomic framework of cooperative guidance and control for vehicles with an emphasis on the cooperation between a human and a highly automated vehicle. In the twenty-first century, mobility and automation technologies are increasingly fused. In the sky, highly automated aircraft are flying with a high safety record. On the ground, a variety of driver assistance systems are being developed, and highly automated vehicles with increasingly autonomous capabilities are becoming possible. Human-centred automation has paved the way for better cooperation between automation and humans. How can these highly automated systems be structured so that they can be easily understood, and how will they cooperate with the human? The presented research was conducted using the methods of iterative build-up and refinement of the framework by triangulation, i.e. by instantiating and testing the framework with at least two derived concepts and prototypes. This article sketches a general, conceptual ergonomic framework of cooperative guidance and control of highly automated vehicles, two concepts derived from the framework, prototypes and pilot data. Cooperation is exemplified in a list of aspects and related to levels of the driving task. With the concept 'Conduct-by-Wire', cooperation happens mainly on the guidance level, where the driver can delegate manoeuvres to the automation with a specialised manoeuvre interface. With H-Mode, a haptic-multimodal interaction with highly automated vehicles based on the H(orse)-Metaphor, cooperation takes place mainly on the guidance and control levels through a haptically active interface. Cooperativeness should be a key aspect for future human-automation systems. Especially for highly automated vehicles, cooperative guidance and control is a research direction with already promising concepts and prototypes that should be further explored.
The application of the presented approach is every human-machine system that moves and includes high

  10. Simulation of Ground-Water Flow and Effects of Ground-Water Irrigation on Base Flow in the Elkhorn and Loup River Basins, Nebraska

    Science.gov (United States)

    Peterson, Steven M.; Stanton, Jennifer S.; Saunders, Amanda T.; Bradley, Jesse R.

    2008-01-01

    Irrigated agriculture is vital to the livelihood of communities in the Elkhorn and Loup River Basins in Nebraska, and ground water is used to irrigate most of the cropland. Concerns about the sustainability of ground-water and surface-water resources have prompted State and regional agencies to evaluate the cumulative effects of ground-water irrigation in this area. To facilitate understanding of the effects of ground-water irrigation, a numerical computer model was developed to simulate ground-water flow and assess the effects of ground-water irrigation (including ground-water withdrawals, hereinafter referred to as pumpage, and enhanced recharge) on stream base flow. The study area covers approximately 30,800 square miles, and includes the Elkhorn River Basin upstream from Norfolk, Nebraska, and the Loup River Basin upstream from Columbus, Nebraska. The water-table aquifer consists of Quaternary-age sands and gravels and Tertiary-age silts, sands, and gravels. The simulation was constructed using one layer with 2-mile by 2-mile cell size. Simulations were constructed to represent the ground-water system before 1940 and from 1940 through 2005, and to simulate hypothetical conditions from 2006 through 2045 or 2055. The first simulation represents steady-state conditions of the system before anthropogenic effects, and then simulates the effects of early surface-water development activities and recharge of water leaking from canals during 1895 to 1940. The first simulation ends at 1940 because before that time, very little pumpage for irrigation occurred, but after that time it became increasingly commonplace. The pre-1940 simulation was calibrated against measured water levels and estimated long-term base flow, and the 1940 through 2005 simulation was calibrated against measured water-level changes and estimated long-term base flow. The calibrated 1940 through 2005 simulation was used as the basis for analyzing hypothetical scenarios to evaluate the effects of

  11. Test-retest reliability of automated whole body and compartmental muscle volume measurements on a wide bore 3T MR system.

    Science.gov (United States)

    Thomas, Marianna S; Newman, David; Leinhard, Olof Dahlqvist; Kasmai, Bahman; Greenwood, Richard; Malcolm, Paul N; Karlsson, Anette; Rosander, Johannes; Borga, Magnus; Toms, Andoni P

    2014-09-01

    To measure the test-retest reproducibility of an automated system for quantifying whole body and compartmental muscle volumes using wide bore 3 T MRI. Thirty volunteers stratified by body mass index underwent whole body 3 T MRI, two-point Dixon sequences, on two separate occasions. Water-fat separation was performed, with automated segmentation of whole body, torso, upper and lower leg volumes, and manually segmented lower leg muscle volumes. Mean automated total body muscle volume was 19.32 L (SD 9.1) and 19.28 L (SD 9.12) for the first and second acquisitions (intraclass correlation coefficient (ICC) = 1.0, 95% limits of agreement -0.32-0.2 L). ICCs for all automated test-retest muscle volumes were almost perfect (0.99-1.0), with 95% limits of agreement 1.8-6.6% of mean volume. Automated muscle volume measurements correlate closely with manual quantification (right lower leg: manual 1.68 L (2SD 0.6) compared to automated 1.64 L (2SD 0.6); left lower leg: manual 1.69 L (2SD 0.64) compared to automated 1.63 L (SD 0.61); correlation coefficients for automated and manual segmentation were 0.94-0.96). Fully automated whole body and compartmental muscle volume quantification can be achieved rapidly on a 3 T wide bore system with very low margins of error, excellent test-retest reliability and excellent correlation to manual segmentation in the lower leg. Sarcopaenia is an important reversible complication of a number of diseases. Manual quantification of muscle volume is time-consuming and expensive. Muscles can be imaged using in and out of phase MRI. Automated atlas-based segmentation can identify muscle groups. Automated muscle volume segmentation is reproducible and can replace manual measurements.
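The reliability statistics quoted in this record (ICC and 95% limits of agreement) are standard and straightforward to reproduce. The sketch below computes Bland-Altman 95% limits of agreement and a single-measure, absolute-agreement ICC(A,1) from a two-way ANOVA decomposition; note that the abstract does not state which ICC variant the study used, and the example volumes are invented for illustration.

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman 95% limits of agreement for paired test-retest values."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean() - 1.96 * d.std(ddof=1), d.mean() + 1.96 * d.std(ddof=1)

def icc_a1(a, b):
    """Single-measure, absolute-agreement ICC(A,1): one common ICC variant,
    computed from the two-way ANOVA mean squares."""
    x = np.column_stack([a, b]).astype(float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)              # per-subject means
    col_means = x.mean(axis=0)              # per-session means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)
    mse = ((x - row_means[:, None] - col_means[None, :] + grand) ** 2).sum() \
          / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented muscle volumes (litres) from two scanning sessions.
scan1 = [19.1, 24.6, 10.2, 31.7, 15.0]
scan2 = [19.0, 24.8, 10.1, 31.9, 14.8]
icc = icc_a1(scan1, scan2)
lo, hi = limits_of_agreement(scan1, scan2)
```

With near-identical repeat measurements, the ICC approaches 1 and the limits of agreement bracket zero, mirroring the pattern reported in the abstract.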

  12. NASA space station automation: AI-based technology review. Executive summary

    Science.gov (United States)

    Firschein, O.; Georgeff, M. P.; Park, W.; Cheeseman, P. C.; Goldberg, J.; Neumann, P.; Kautz, W. H.; Levitt, K. N.; Rom, R. J.; Poggio, A. A.

    1985-01-01

    Research and Development projects in automation technology for the Space Station are described. Artificial Intelligence (AI) based technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics.

  13. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    International Nuclear Information System (INIS)

    Casey, Leslie A.

    2014-01-01

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  14. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  15. The COROT ground-based archive and access system

    Science.gov (United States)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted System) Archive.

  16. A detrimental soil disturbance prediction model for ground-based timber harvesting

    Science.gov (United States)

    Derrick A. Reeves; Matthew C. Reeves; Ann M. Abbott; Deborah S. Page-Dumroese; Mark D. Coleman

    2012-01-01

    Soil properties and forest productivity can be affected during ground-based harvest operations and site preparation. The degree of impact varies widely depending on topographic features and soil properties. Forest managers who understand site-specific limits to ground-based harvesting can alter harvest method or season to limit soil disturbance. To determine the...

  17. Design and Evaluation of Nextgen Aircraft Separation Assurance Concepts

    Science.gov (United States)

    Johnson, Walter; Ho, Nhut; Arutyunov, Vladimir; Laue, John-Luke; Wilmoth, Ian

    2012-01-01

    To support the development and evaluation of future function allocation concepts for separation assurance systems for the Next Generation Air Transportation System, this paper presents the design and human-in-the-loop evaluation of three feasible function allocation concepts that allocate primary aircraft separation assurance responsibilities and workload to: 1) pilots; 2) air traffic controllers (ATC); and 3) automation. The design of these concepts also included rules of the road, separation assurance burdens for aircraft of different equipage levels, and utilization of advanced weather displays paired with advanced conflict detection and resolution automation. Results of the human-in-the-loop simulation show that: a) all the concepts are robust with respect to weather perturbation; b) concept 1 (pilots) had highest throughput, closest to assigned spacing, and fewest violations of speed and altitude restrictions; c) the energy of the aircraft during the descent phase was better managed in concepts 1 and 2 (pilots and ATC) than in concept 3 (automation), in which the situation awareness of pilots and controllers was lowest, and workload of pilots was highest. The paper also discusses further development of these concepts and their augmentation and integration with future air traffic management tools and systems that are being considered for NextGen.

  18. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. A baseline in the spectrum signal can induce uneven amplitude shifts across different wavenumbers and lead to erroneous results; these amplitude shifts should therefore be compensated before further analysis. Many algorithms are used to remove the baseline; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
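The two stages named in the title, wavelet feature points and segment interpolation, can be sketched end to end. This is a hypothetical reimplementation of the idea, not the authors' AWFPSI code: the Ricker wavelet, the 10% threshold, the mask widening and the segment count are all illustrative choices.

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet sampled at `points` positions.
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def baseline_awfpsi(y, scale=8, n_segments=10):
    """Sketch: a continuous wavelet transform flags peak feature points, and
    the baseline is rebuilt by piecewise-linear interpolation through anchors
    taken from the peak-free part of each segment."""
    cwt = np.convolve(y, ricker(10 * scale, scale), mode="same")
    # Feature points: samples whose CWT response exceeds 10% of the maximum.
    near_peak = np.abs(cwt) > 0.1 * np.abs(cwt).max()
    # Widen the mask so peak flanks are also excluded from the anchors.
    near_peak = np.convolve(near_peak, np.ones(2 * scale), mode="same") > 0
    anchors_x, anchors_y = [], []
    for seg in np.array_split(np.arange(len(y)), n_segments):
        keep = seg[~near_peak[seg]]
        if keep.size:  # skip segments swallowed whole by a peak
            anchors_x.append(int(np.median(keep)))
            anchors_y.append(float(np.median(y[keep])))
    return np.interp(np.arange(len(y)), anchors_x, anchors_y)

x = np.linspace(0.0, 1.0, 500)
true_baseline = 2.0 + 0.5 * x                  # slowly varying baseline
peak = np.exp(-0.5 * ((x - 0.5) / 0.01) ** 2)  # one sharp Raman-like band
estimate = baseline_awfpsi(true_baseline + peak)
corrected = true_baseline + peak - estimate
err = float(np.max(np.abs(estimate - true_baseline)))
```

Because the wavelet has zero mean (and zero first moment), the CWT response is near zero on the slowly varying baseline and large only at the sharp band, so the anchors land on the baseline and the band survives the subtraction.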

  19. Shockwave-Based Automated Vehicle Longitudinal Control Algorithm for Nonrecurrent Congestion Mitigation

    Directory of Open Access Journals (Sweden)

    Liuhui Zhao

    2017-01-01

    A shockwave-based speed harmonization algorithm for the longitudinal movement of automated vehicles is presented in this paper. With the advent of the Connected/Automated Vehicle (C/AV) environment, the proposed algorithm can be applied to capture instantaneous shockwaves constructed from vehicular speed profiles shared by individual equipped vehicles. With a continuous wavelet transform (CWT) method, the algorithm detects abnormal speed drops in real time and optimizes speed to prevent the shockwave from propagating to the upstream traffic. A traffic simulation model is calibrated to evaluate the applicability and efficiency of the proposed algorithm. Based on 100% C/AV market penetration, the simulation results show that the CWT-based algorithm accurately detects abnormal speed drops. With the improved accuracy of abnormal speed drop detection, the simulation results also demonstrate that congestion can be mitigated by reducing travel time and delay by up to approximately 9% and 18%, respectively. It is also found that the shockwave caused by nonrecurrent congestion is quickly dissipated even with low market penetration.
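The shockwave the algorithm tries to suppress is the kinematic-wave (LWR) discontinuity between two traffic states, whose propagation speed follows directly from flow conservation. A minimal illustration of that textbook relation (this underlies the abstract's approach but is not the paper's detection code):

```python
def shockwave_speed(q_up, k_up, q_down, k_down):
    """Kinematic-wave (LWR) shockwave speed between an upstream and a
    downstream traffic state: w = (q2 - q1) / (k2 - k1), with flow q in
    veh/h and density k in veh/km; the result is in km/h."""
    return (q_down - q_up) / (k_down - k_up)

# Free flow (1800 veh/h at 20 veh/km) meets a queue (1000 veh/h at 100 veh/km):
w = shockwave_speed(1800, 20, 1000, 100)  # negative: the queue front grows upstream
```

A negative speed means the congestion boundary moves against the direction of travel, which is exactly the propagation that speed harmonization tries to damp by smoothing upstream speed profiles.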

  20. ARCA II - a new apparatus for fast, repetitive HPLC separations

    International Nuclear Information System (INIS)

    Schaedel, M.; Bruechle, W.; Jaeger, E.; Schimpf, E.; Kratz, J.V.; Scherer, U.W.; Zimmermann, H.P.

    1989-04-01

    The microcomputer controlled Automated Rapid Chemistry Apparatus, ARCA, is described in its newly designed version for the study of chemical properties of element 105 in aqueous solutions. This improved version, ARCA II, is adapted to the needs of fast and repetitive separations to be carried out in a chemically inert automated micro high performance liquid chromatography system. As an example, the separation of several group IIIB, IVB, and VB elements in the system triisooctylamine/hydrochloric acid within 30 s is demonstrated. Furthermore, a new method for the fast preparation of samples for α-particle spectroscopy by evaporation of the aqueous effluent with an intense light source is presented. (orig.)

  1. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  2. Automation and Robotics for Space-Based Systems, 1991

    Science.gov (United States)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  3. FluReF, an automated flu virus reassortment finder based on phylogenetic trees.

    Science.gov (United States)

    Yurovsky, Alisa; Moret, Bernard M E

    2011-01-01

    Reassortments are events in the evolution of the genome of influenza (flu), whereby segments of the genome are exchanged between different strains. As reassortments have been implicated in major human pandemics of the last century, their identification has become a health priority. While such identification can be done "by hand" on a small dataset, researchers and health authorities are building up enormous databases of genomic sequences for every flu strain, so that it is imperative to develop automated identification methods. However, current methods are limited to pairwise segment comparisons. We present FluReF, a fully automated flu virus reassortment finder. FluReF is inspired by the visual approach to reassortment identification and uses the reconstructed phylogenetic trees of the individual segments and of the full genome. We also present a simple flu evolution simulator, based on the current, source-sink, hypothesis for flu cycles. On synthetic datasets produced by our simulator, FluReF, tuned for a 0% false positive rate, yielded false negative rates of less than 10%. FluReF corroborated two new reassortments identified by visual analysis of 75 Human H3N2 New York flu strains from 2005-2008 and gave partial verification of reassortments found using another bioinformatics method. FluReF finds reassortments by a bottom-up search of the full-genome and segment-based phylogenetic trees for candidate clades--groups of one or more sampled viruses that are separated from the other variants from the same season. Candidate clades in each tree are tested to guarantee confidence values, using the lengths of key edges as well as other tree parameters; clades with reassortments must have validated incongruencies among segment trees. FluReF demonstrates robustness of prediction for geographically and temporally expanded datasets, and is not limited to finding reassortments with previously collected sequences. 
The complete source code is available from http://lcbb.epfl.ch/software.html.
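The incongruence test at the heart of this approach can be illustrated with a toy example: a clade supported by one segment's tree that overlaps, without nesting, some clade of another segment's tree signals a possible reassortment. The Python sketch below is a simplified stand-in for FluReF's test (the real tool additionally validates edge lengths, confidence values and seasonal separation), with trees given as nested tuples of strain names:

```python
def clades(tree):
    """All non-trivial clades (leaf sets) of a tree given as nested tuples."""
    found = set()
    def leaves(node):
        if isinstance(node, str):
            return frozenset([node])
        s = frozenset().union(*(leaves(child) for child in node))
        if len(s) > 1:
            found.add(s)
        return s
    leaves(tree)
    return found

def incongruent_clades(tree_a, tree_b):
    """Clades of tree_a that conflict with some clade of tree_b, i.e. the
    two leaf sets overlap but neither contains the other."""
    conflicts = set()
    for ca in clades(tree_a):
        for cb in clades(tree_b):
            if ca & cb and not (ca <= cb or cb <= ca):
                conflicts.add(ca)
    return conflicts

# Segment 1 groups (x, y); segment 2 groups (y, z): y's placement disagrees,
# as it would if y carried segment 2 acquired by reassortment.
seg1 = (("x", "y"), ("z", "w"))
seg2 = (("y", "z"), ("x", "w"))
bad = incongruent_clades(seg1, seg2)
```

The root clade, shared by both trees, is never flagged; only the groupings on which the two segment trees genuinely disagree are returned.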

  4. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when unresolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self-learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts-of-speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user-assisted semantic-based ontology generation. 
This paper will describe the OGEP technology in the context of the architectural

  5. Modern business process automation : YAWL and its support environment

    NARCIS (Netherlands)

    Hofstede, ter A.H.M.; Aalst, van der W.M.P.; Adams, M.; Russell, N.C.

    2010-01-01

    This book provides a comprehensive treatment of the field of Business Process Management (BPM) with a focus on Business Process Automation. It achieves this by covering a wide range of topics, both introductory and advanced, illustrated through and grounded in the YAWL (Yet Another Workflow

  6. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  7. The Earth Observing System (EOS) Ground System: Leveraging an Existing Operational Ground System Infrastructure to Support New Missions

    Science.gov (United States)

    Hardison, David; Medina, Johnny; Dell, Greg

    2016-01-01

    The Earth Observing System (EOS) was officially established in 1990 and went operational in December 1999 with the launch of its flagship spacecraft Terra. Aqua followed in 2002 and Aura in 2004. All three spacecraft are still operational and producing valuable scientific data. While all are beyond their original design lifetimes, they are expected to remain viable well into the 2020s. The EOS Ground System is a multi-mission system based at NASA Goddard Space Flight Center that supports science and spacecraft operations for these three missions. Over its operational lifetime to date, the EOS Ground System has evolved as needed to accommodate mission requirements. With an eye towards the future, several updates are currently being deployed. Subsystem interconnects are being upgraded to reduce data latency and improve system performance. End-of-life hardware and operating systems are being replaced to mitigate security concerns and eliminate vendor support gaps. Subsystem hardware is being consolidated through the migration to Virtual Machine based platforms. While mission operations autonomy was not a design goal of the original system concept, there is an active effort to apply state-of-the-art products from the Goddard Mission Services Evolution Center (GMSEC) to facilitate automation where possible within the existing heritage architecture. This presentation will provide background information on the EOS ground system architecture and evolution, discuss the latest improvements, and conclude with the results of a recent effort that investigated how the current system could accommodate a proposed new earth science mission.

  8. An Extended Case Study Methodology for Investigating Influence of Cultural, Organizational, and Automation Factors on Human-Automation Trust

    Science.gov (United States)

    Koltai, Kolina Sun; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Johnson, Walter; Cacanindin, Artemio

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability factors upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among the organizations involved in the system development.

  9. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
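Single-variable telescope cost models of the kind mentioned here are conventionally fit as power laws of aperture diameter, i.e. by linear regression in log-log space. A minimal sketch (the diameters and costs below are synthetic and chosen only to show that the fit recovers a known exponent; they are not the paper's coefficients):

```python
import numpy as np

def fit_power_law(diameters_m, costs):
    """Fit cost = a * D**b by least squares in log-log space, the usual form
    of single-variable telescope cost models."""
    b, log_a = np.polyfit(np.log(diameters_m), np.log(costs), 1)
    return np.exp(log_a), b

# Synthetic data generated from cost = 0.5 * D**2.7; the fit recovers a and b.
D = np.array([2.0, 4.0, 8.0, 10.0])
cost = 0.5 * D ** 2.7
a, b = fit_power_law(D, cost)
```

Fitting in log space weights relative (percentage) errors equally across small and large telescopes, which is why it is the standard choice for cost-estimating relationships.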

  10. Intelligent, Semi-Automated Procedure Aid (ISAPA) for ISS Flight Control, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop the Intelligent, Semi-Automated Procedure Aid (ISAPA) intended for use by International Space Station (ISS) ground controllers to increase the...

  11. Ground robotic measurement of aeolian processes

    Science.gov (United States)

    Qian, Feifei; Jerolmack, Douglas; Lancaster, Nicholas; Nikolich, George; Reverdy, Paul; Roberts, Sonia; Shipley, Thomas; Van Pelt, R. Scott; Zobeck, Ted M.; Koditschek, Daniel E.

    2017-08-01

    Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These devices are often cumbersome and logistically difficult to set up and maintain, especially near steep or vegetated dune surfaces. Significant advances in instrumentation are needed to provide the datasets that are required to validate and improve mechanistic models of aeolian sediment transport. Recent advances in robotics show great promise for assisting and amplifying scientists' efforts to increase the spatial and temporal resolution of many environmental measurements governing sediment transport. The emergence of cheap, agile, human-scale robotic platforms endowed with increasingly sophisticated sensor and motor suites opens up the prospect of deploying programmable, reactive sensor payloads across complex terrain in the service of aeolian science. This paper surveys the need and assesses the opportunities and challenges for amassing novel, highly resolved spatiotemporal datasets for aeolian research using partially-automated ground mobility. We review the limitations of existing measurement approaches for aeolian processes, and discuss how they may be transformed by ground-based robotic platforms, using examples from our initial field experiments. We then review how the need to traverse challenging aeolian terrains and simultaneously make high-resolution measurements of critical variables requires enhanced robotic capability. Finally, we conclude with a look to the future, in which robotic platforms may operate with increasing autonomy in harsh conditions. Besides expanding the completeness of terrestrial datasets, bringing ground-based robots to the aeolian research community may lead to unexpected discoveries that generate new hypotheses to expand the science

  12. KSC ADVANCED GROUND BASED FIELD MILL V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Ground Based Field Mill (AGBFM) network consists of 34 (31 operational) field mills located at Kennedy Space Center (KSC), Florida. The field mills...

  13. Ionic-Liquid Based Separation of Azeotropic Mixtures

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2014-01-01

A methodology for the screening of ionic liquids (ILs) as entrainers for IL-based separation processes in binary aqueous azeotropic systems (e.g., water + ethanol and water + isopropanol) is presented. Ionic liquids as entrainers were first screened based on a combination of criteria such as stabi… [C1MIM][DMP]. For the final evaluation, the best candidates for aqueous systems were used as entrainers, and then the vapor-liquid equilibrium (VLE) of the ternary systems containing ILs was predicted by the Non-Random Two-Liquid (NRTL) model to confirm the breaking of the azeotrope. Based on the minimum concentration of the ILs required to break the given azeotrope, the best ILs as entrainers for the water + ethanol and water + isopropanol azeotropic mixtures were [C1MIM][DMP] and [C2MIM][N(CN)2], respectively.
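As a rough illustration of the VLE step, a binary NRTL activity-coefficient calculation can be sketched in a few lines; the interaction parameters below (tau12, tau21, alpha) are hypothetical placeholders, not fitted values for any of the systems in the record:

```python
import math

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Activity coefficients (gamma1, gamma2) for a binary mixture from the
    NRTL model; tau12/tau21 are assumed, illustrative interaction parameters."""
    x2 = 1.0 - x1
    G12 = math.exp(-alpha * tau12)
    G21 = math.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return math.exp(ln_g1), math.exp(ln_g2)

# An entrainer shifts the effective tau values and hence the relative
# volatility; here we only scan composition for one fixed parameter set.
for x1 in (0.1, 0.5, 0.9):
    g1, g2 = nrtl_binary(x1, tau12=1.5, tau21=0.8)
    print(f"x1={x1:.1f}  gamma1={g1:.3f}  gamma2={g2:.3f}")
```

Sanity checks built into the model: each activity coefficient tends to 1 as its component approaches purity, and exceeds 1 at infinite dilution for positive tau values.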

  14. Life Sciences Research Facility automation requirements and concepts for the Space Station

    Science.gov (United States)

    Rasmussen, Daryl N.

    1986-01-01

    An evaluation is made of the methods and preliminary results of a study on prospects for the automation of the NASA Space Station's Life Sciences Research Facility. In order to remain within current Space Station resource allocations, approximately 85 percent of planned life science experiment tasks must be automated; these tasks encompass specimen care and feeding, cage and instrument cleaning, data acquisition and control, sample analysis, waste management, instrument calibration, materials inventory and management, and janitorial work. Task automation will free crews for specimen manipulation, tissue sampling, data interpretation and communication with ground controllers, and experiment management.

  15. Metal–organic frameworks based membranes for liquid separation

    KAUST Repository

    Li, Xin; Liu, Yuxin; Wang, Jing; Gascon, Jorge; Li, Jiansheng; Van der Bruggen, Bart

    2017-01-01

The field of MOF-based membranes for liquid separation is highlighted in this review. The criteria for judicious selection of MOFs in fabricating MOF-based membranes are given. Special attention is paid to rational design strategies for MOF-based membranes.

  16. Proceedings of the international conference on advancements in automation, robotics and sensing: souvenir

    International Nuclear Information System (INIS)

    Vinod, B.; Sundaram, M.; Sujatha, K.S.; Brislin, J. Joe; Prabhakarab, S.

    2016-01-01

Robotics and automation is a thriving domain in the field of engineering, comprising major areas such as electrical, electronics, mechanical, automation, computer and robotics engineering. This conference addresses issues related to technical advances in all these fields. Papers relevant to INIS are indexed separately

  17. Study of the radiolabeling of a urea-based PSMA inhibitor with 68-Gallium: Comparative evaluation of automated and non-automated methods

    International Nuclear Information System (INIS)

    Alcarde, Lais Fernanda

    2016-01-01

The methods for clinical diagnosis of prostate cancer include rectal examination and measurement of prostate-specific antigen (PSA). However, the PSA level is elevated in about 20 to 30% of cases related to benign pathologies, resulting in false positives and leading patients to unnecessary biopsies. Prostate-specific membrane antigen (PSMA), in contrast, is overexpressed in prostate cancer and found at low levels in healthy organs. This stimulated the development of small-molecule PSMA inhibitors, which carry imaging agents to the tumor and are not affected by its microvasculature. Recent studies suggest that the HBED-CC chelator intrinsically contributes to the binding of the urea-based PSMA inhibitor peptide (Glu-urea-Lys) to the pharmacophore group. This work describes the optimization of the radiolabeling conditions of PSMA-HBED-CC with 68Ga, using an automated system (synthesis module) and a non-automated method, seeking to establish an appropriate condition for preparing this new radiopharmaceutical, with emphasis on the labeling yield and radiochemical purity of the product. It also aimed to evaluate the stability of the radiolabeled peptide under transport conditions and to study the biological distribution of the radiopharmaceutical in healthy mice. The study of the radiolabeling parameters enabled the definition of a non-automated method which resulted in high radiochemical purity (> 95%) without the need for purification of the labeled peptide. The automated method was adapted, using a synthesis module and software already available at IPEN, and also resulted in high synthetic yield (≥ 90%), especially when compared with those described in the literature, with the associated benefit of greater control of the production process in compliance with Good Manufacturing Practices. The study of the radiolabeling parameters afforded PSMA-HBED-CC-68Ga with higher specific activity than observed in published clinical studies (≥ 140.0 GBq

  18. Rapid automated nuclear chemistry

    International Nuclear Information System (INIS)

    Meyer, R.A.

    1979-01-01

    Rapid Automated Nuclear Chemistry (RANC) can be thought of as the Z-separation of Neutron-rich Isotopes by Automated Methods. The range of RANC studies of fission and its products is large. In a sense, the studies can be categorized into various energy ranges from the highest where the fission process and particle emission are considered, to low energies where nuclear dynamics are being explored. This paper presents a table which gives examples of current research using RANC on fission and fission products. The remainder of this text is divided into three parts. The first contains a discussion of the chemical methods available for the fission product elements, the second describes the major techniques, and in the last section, examples of recent results are discussed as illustrations of the use of RANC

  19. A pattern-based method to automate mask inspection files

    Science.gov (United States)

    Kamal Baharin, Ezni Aznida Binti; Muhsain, Mohamad Fahmi Bin; Ahmad Ibrahim, Muhamad Asraf Bin; Ahmad Noorhani, Ahmad Nurul Ihsan Bin; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe

    2017-03-01

Mask inspection is a critical step in the mask manufacturing process to ensure that all printed dimensions are within the needed tolerances. This becomes even more challenging as device nodes shrink and tapeout complexity increases. Thus, the number of measurement points and their critical dimension (CD) types are increasing to ensure the quality of the mask. In addition to mask quality, a significant amount of manpower is needed when the preparation and debugging of this process are not automated. By utilizing a novel pattern search technology with the ability to measure and report match-region scan-line (edge) measurements, we can create a flow to find, measure and mark all metrology locations of interest and provide this automated report to the mask shop for inspection. A digital library is created based on the technology product and node, containing the test patterns to be measured. This paper will discuss how these digital libraries are generated and then utilized. As a time-critical part of the manufacturing process, this can also reduce the data preparation cycle time, minimize the amount of manual/human error in naming and measuring the various locations, reduce the risk of wrong/missing CD locations, and reduce the amount of manpower needed overall. We will also review an example pattern and how the reporting structure to the mask shop can be processed. This entire process can now be fully automated.

  20. Sb(III) and Sb(V) separation and analytical speciation by a continuous tandem on-line separation device in connection with inductively coupled plasma atomic emission spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Menendez Garcia, A. [Oviedo Univ. (Spain). Dept. of Phys. and Anal. Chem.; Perez Rodriguez, M.C. [Oviedo Univ. (Spain). Dept. of Phys. and Anal. Chem.; Sanchez Uria, J.F. [Oviedo Univ. (Spain). Dept. of Phys. and Anal. Chem.; Sanz-Medel, A. [Oviedo Univ. (Spain). Dept. of Phys. and Anal. Chem.

    1995-09-01

A sensitive, precise and automated non-chromatographic method for Sb(III) and Sb(V) analytical speciation, based on a continuous tandem on-line separation device coupled with inductively coupled plasma atomic emission spectrometry (ICP-AES) detection, is proposed. Two successive on-line separation steps are included in this method: a continuous liquid-liquid extraction of Sb(III) with ammonium pyrrolidine dithiocarbamate (APDC) into methyl isobutyl ketone (MIBK), followed by direct stibine generation from the organic phase. Both separation steps are carried out in a continuous mode and on-line with the ICP-AES detector. Optimization of the experimental conditions for the tandem separation and ICP-AES detection is investigated in detail. Detection limits were 3 ng mL⁻¹ for Sb(III) and 8 ng mL⁻¹ for Sb(V). Observed precision is within ±5%. The proposed methodology has been applied to Sb(III) and Sb(V) speciation in sea-water samples. (orig.)

  1. Automation and Robotics for space operation and planetary exploration

    Science.gov (United States)

    Montemerlo, Melvin D.

    1990-01-01

    This paper presents a perspective of Automation and Robotics (A&R) research and developments at NASA in terms of its history, its current status, and its future. It covers artificial intelligence, telerobotics and planetary rovers, and it encompasses ground operations, operations in earth orbit, and planetary exploration.

  2. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

The goal of this study is to assess the chemical and optical properties of aerosols present in lofted layers and at ground level. Biomass burning aerosols in low-level layers were evaluated from multi-wavelength lidar measurements, while chemical composition at the ground was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Classification of aerosol type and specific organic markers were used to explore the potential to sense particles of the same origin at ground level and along vertical profiles.

  3. Control Method of Single-phase Inverter Based Grounding System in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, L.; Zeng, X.

    2016-01-01

The asymmetry of the inherent distributed capacitances causes the rise of neutral-to-ground voltage in ungrounded systems or high-resistance grounded systems. Overvoltage may occur in resonant grounded systems if the Petersen coil is resonant with the distributed capacitances. Thus, the restraint of neutral-to-ground voltage is critical for the safety of distribution networks. An active grounding system based on a single-phase inverter is proposed to achieve this objective. The relationship between the output current of the system and the neutral-to-ground voltage is derived to explain the principle of neutral…

  4. Feasibility of geometric-intensity-based semi-automated delineation of the tentorium cerebelli from MRI scans.

    Science.gov (United States)

    Penumetcha, Neeraja; Kabadi, Suraj; Jedynak, Bruno; Walcutt, Charles; Gado, Mokhtar H; Wang, Lei; Ratnanather, J Tilak

    2011-04-01

This paper describes a feasibility study of a method for delineating the tentorium cerebelli in magnetic resonance imaging (MRI) brain scans. The tentorium cerebelli is a thin sheet of dura mater covering the cerebellum and separating it from the posterior part of the temporal lobe and the occipital lobe of the cerebral hemispheres. Cortical structures such as the parahippocampal gyrus can be indistinguishable from the tentorium in magnetization-prepared rapid gradient echo (MPRAGE) and T1-weighted MRI scans. Similar intensities in these neighboring regions make it difficult to perform accurate cortical analysis in neuroimaging studies of schizophrenia and Alzheimer's disease. A semi-automated, geometric, intensity-based procedure for delineating the tentorium from a whole-brain scan is described. Initial and final curves are traced within the tentorium. A cost function, based on intensity and Euclidean distance, is computed between the two curves using the Fast Marching method. The initial curve is then evolved to the final curve based on the gradient of the computed costs, generating a series of intermediate curves. These curves are then used to generate a triangulated surface of the tentorium. For 3 scans, surfaces were found to be within 2 voxels of hand segmentations. Copyright © 2009 by the American Society of Neuroimaging.
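The cost propagation between the two curves can be approximated on a discrete grid; the sketch below uses Dijkstra's algorithm as a stand-in for the Fast Marching method, with hypothetical intensity and distance weights:

```python
import heapq
import numpy as np

def min_cost_path(img, start, goal, w_int=1.0, w_dist=0.1):
    """Minimum-cost path on a pixel grid, where stepping onto a pixel costs a
    blend of image intensity and Euclidean step length (weights are assumed)."""
    h, w = img.shape
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w:
                    step = w_int * img[nr, nc] + w_dist * (dr * dr + dc * dc) ** 0.5
                    nd = d + step
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(heap, (nd, (nr, nc)))
    # backtrack from goal to start
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# A dark (low-intensity) channel through a bright image: the path follows it,
# much as the evolving curve follows the low-cost tentorium sheet.
img = np.ones((5, 5))
img[2, :] = 0.0
path = min_cost_path(img, (2, 0), (2, 4))
print(path)
```

In the actual method the cost would be accumulated with Fast Marching over a 3D volume; the grid version above only illustrates how an intensity-plus-distance cost steers the evolving curve.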

  5. Ground and satellite-based remote sensing of mineral dust using AERI spectra and MODIS thermal infrared window brightness temperatures

    Science.gov (United States)

    Hansell, Richard Allen, Jr.

The radiative effects of dust aerosol on our climate system have yet to be fully understood and remain a topic of contemporary research. To investigate these effects, detection/retrieval methods for dust events over major dust outbreak and transport areas have been developed using satellite and ground-based approaches. To this end, both the shortwave and longwave surface radiative forcing of dust aerosol were investigated. The ground-based remote sensing approach uses Atmospheric Emitted Radiance Interferometer (AERI) brightness temperature spectra to detect mineral dust events and to retrieve their properties. Taking advantage of the high spectral resolution of the AERI instrument, absorptive differences in prescribed thermal IR window sub-band channels were exploited to differentiate dust from cirrus clouds. AERI data collected during the UAE2 campaign at Al-Ain, UAE were employed for dust retrieval. Assuming a specified dust composition model a priori and using the light scattering programs of the T-matrix and finite difference time domain methods for oblate spheroids and hexagonal plates, respectively, dust optical depths were retrieved and compared to those inferred from a collocated and coincident AERONET sun-photometer dataset. The retrieved optical depths were then used to determine the dust longwave surface forcing during the UAE2. Likewise, dust shortwave surface forcing was investigated employing a differential technique from previous field studies. The satellite-based approach uses MODIS thermal infrared brightness temperature window data for the simultaneous detection/separation of mineral dust and cirrus clouds. Based on the spectral variability of dust emissivity at the 3.75, 8.6, 11 and 12 μm wavelengths, the D*-parameter, BTD-slope and BTD3-11 tests are combined to identify dust and cirrus. MODIS data for three dust-laden scenes have been analyzed to demonstrate the effectiveness of this detection/separation method. Detected daytime dust and cloud

  6. Assessment of parameters of gas centrifuge and separation cascade basing on integral characteristics of separation plant

    Energy Technology Data Exchange (ETDEWEB)

    Borisevich, Valentin, E-mail: VDBorisevich@mephi.ru [National Research Nuclear University MEPhI, Kashirskoe Shosse 31, Moscow 115409 (Russian Federation); National Research Center “Kurchatov Institute”, Kurchatov Square 1, Moscow 123182 (Russian Federation); Borshchevskiy, Michael, E-mail: Michael_mephi@mail.ru [National Research Nuclear University MEPhI, Kashirskoe Shosse 31, Moscow 115409 (Russian Federation); Andronov, Igor, E-mail: andronov@imp.kiae.ru [National Research Center “Kurchatov Institute”, Kurchatov Square 1, Moscow 123182 (Russian Federation); Senchenkov, Sergey, E-mail: senchenkov@imp.kiae.ru [National Research Center “Kurchatov Institute”, Kurchatov Square 1, Moscow 123182 (Russian Federation)

    2013-12-15

Highlights: • We developed a calculation method to assess the feed flow rate into a gas centrifuge. • It is based on knowledge of the integral characteristics of a separation plant. • Our method is verified by comparison with the results of an independent one. • The method also allows other features of the separation cascade's operation to be specified. - Abstract: A calculation technique is developed to assess the feed flow rate into a single GC and the total number of centrifuges in a separation cascade, and to determine its likely configurations based on the known integral characteristics of a centrifugal plant. Evaluation of the characteristics of the industrial gas centrifuge TC-12 and the separation cascades of the NEF plant performed by two independent calculation techniques demonstrates their satisfactory agreement. This methodology would, to some extent, help nuclear inspectors in evaluating and assessing the capability of an enrichment facility and discovering any use for undeclared purposes.
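The separative-work bookkeeping underlying such assessments can be sketched with the standard value function; the flows, assays and per-machine separative power below are assumed round numbers, not the TC-12 or NEF figures:

```python
import math

def value(x):
    """Value function V(x) = (2x - 1) ln(x / (1 - x)) of separative-work theory."""
    return (2.0 * x - 1.0) * math.log(x / (1.0 - x))

def separative_power(feed, x_f, x_p, x_w):
    """Separative work rate of a cascade, in the same units as the flows.
    Product and tails flows follow from the overall and U-235 mass balances."""
    product = feed * (x_f - x_w) / (x_p - x_w)
    waste = feed - product
    return product * value(x_p) + waste * value(x_w) - feed * value(x_f)

# Hypothetical plant: 1000 kg/yr of natural feed (0.711%) enriched to 3.5%
# with 0.3% tails.
dU_plant = separative_power(1000.0, 0.00711, 0.035, 0.003)

# With an assumed per-machine separative power of ~5 kg SWU/yr, the total
# machine count can be back-estimated from the plant-level figure:
n_machines = dU_plant / 5.0
print(round(dU_plant, 1), "SWU/yr ->", round(n_machines), "machines")
```

This is the inverse-style estimate the abstract describes: plant-level integral characteristics (feed, product, assays) constrain the per-machine flow and the number of centrifuges.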

  7. Comparison of Automated Atlas Based Segmentation Software for postoperative prostate cancer radiotherapy

    Directory of Open Access Journals (Sweden)

    Grégory Delpon

    2016-08-01

Automated atlas-based segmentation algorithms present the potential to reduce variability in volume delineation. Several vendors offer software packages that are mainly used for cranial, head-and-neck and prostate cases. The present study compares the contours produced by a radiation oncologist to the contours computed by different automated atlas-based segmentation algorithms for prostate bed cases, including femoral heads, bladder and rectum. Contour comparison was evaluated by different metrics such as volume ratio, Dice coefficient and Hausdorff distance. Results depended on the volume of interest and showed some discrepancies between the different software packages. Automatic contours could be a good starting point for the delineation of organs since efficient editing tools are provided by the different vendors. They should become an important aid in the next few years for organ-at-risk delineation.
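Two of the metrics named in the abstract, the Dice coefficient and the Hausdorff distance, can be computed directly from binary masks; the toy contours below are illustrative:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the foreground pixels of two
    masks (brute force; fine for small contours)."""
    pa = np.argwhere(a)
    pb = np.argwhere(b)
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

auto = np.zeros((10, 10), bool)
manual = np.zeros((10, 10), bool)
auto[2:6, 2:6] = True      # automated contour
manual[3:7, 2:6] = True    # manual contour, shifted by one row
print(f"Dice = {dice(auto, manual):.2f}, Hausdorff = {hausdorff(auto, manual):.1f}")
```

In practice these would be computed in 3D with voxel spacing applied; the 2D version shows the definitions.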

  8. A new fully automated FTIR system for total column measurements of greenhouse gases

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-10-01

This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratios of CO2, CH4 and several other greenhouse gases in the tropics.

Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components, such as a sturdy and reliable solar tracker dome, are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  9. Membrane-based ethylene/ethane separation: The upper bound and beyond

    KAUST Repository

    Rungta, Meha

    2013-08-02

Ethylene/ethane separation via cryogenic distillation is extremely energy-intensive, and membrane separation may provide an attractive alternative. In this paper, ethylene/ethane separation performance using polymeric membranes is summarized, and an experimental ethylene/ethane polymeric upper bound based on literature data is presented. A theoretical prediction of the ethylene/ethane upper bound is also presented and shows good agreement with the experimental upper bound. Further, two ways to overcome the ethylene/ethane upper bound, based on increasing either the sorption or the diffusion selectivity, are discussed, and a review of advanced membrane types, such as facilitated transport membranes, zeolite and metal-organic framework based membranes, and carbon molecular sieve membranes, is presented. Of these, carbon membranes have shown the potential to surpass the polymeric ethylene/ethane upper bound. Furthermore, a convenient, potentially scalable method for tailoring the performance of carbon membranes for ethylene/ethane separation by tuning the pyrolysis conditions has also been demonstrated. © 2013 American Institute of Chemical Engineers.
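The upper-bound test itself is a one-line comparison against a log-log trade-off line of the form alpha = K * P**(-N); the constants K and N below are illustrative placeholders, not the published ethylene/ethane fit:

```python
# Hypothetical upper-bound parameters for the selectivity/permeability
# trade-off line alpha = K * P**(-N); illustrative values only.
K, N = 20.0, 0.35

def on_or_above_bound(permeability, selectivity):
    """True if a membrane's (permeability, selectivity) point lies on or
    above the assumed upper-bound line."""
    return selectivity >= K * permeability ** (-N)

# A typical polymer sits below the bound; a carbon molecular sieve membrane
# may exceed it (the numbers here are invented for the sketch).
for name, P, a in [("polymer A", 10.0, 4.0), ("CMS membrane", 10.0, 12.0)]:
    print(name, "beats bound:", on_or_above_bound(P, a))
```

Plotting literature data points against such a line in log-log coordinates is how the experimental upper bound in the paper is constructed.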

  10. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    Science.gov (United States)

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator to the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions of the tissue structures. The regions in the biopsy representing the interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground truth image dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists have demonstrated a good correlation in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.
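Once the tissue structures are segmented, the final quantification step reduces to a ratio of pixel counts; the label layout below is made up for illustration:

```python
import numpy as np

# Toy label image: 0 = background, 1 = non-fibrosis tissue structures,
# 2 = interstitial fibrosis. Labels and layout are invented for the sketch.
labels = np.zeros((8, 8), int)
labels[1:7, 1:7] = 1          # biopsy tissue area
labels[2:5, 2:4] = 2          # fibrotic region inside it

biopsy_area = (labels > 0).sum()        # all tissue pixels
fibrosis_area = (labels == 2).sum()     # fibrosis pixels only
pct = 100.0 * fibrosis_area / biopsy_area
print(f"interstitial fibrosis: {pct:.1f}% of biopsy area")
```

In the actual system the label image would come from the colour-based segmentation stage; the percentage computed this way is the quantity compared against the pathologists' visual estimates.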

  11. Data Mining for Understanding and Improving Decision-Making Affecting Ground Delay Programs

    Science.gov (United States)

    Kulkarni, Deepak; Wang, Yao Xun; Sridhar, Banavar

    2013-01-01

The continuous growth in the demand for air transportation results in an imbalance between airspace capacity and traffic demand. The airspace capacity of a region depends on the ability of the system to maintain safe separation between aircraft in the region. In addition to growing demand, airspace capacity is severely limited by convective weather. During such conditions, traffic managers at the FAA's Air Traffic Control System Command Center (ATCSCC) and dispatchers at various Airline Operations Centers (AOC) collaborate to mitigate the demand-capacity imbalance caused by weather. The end result is the implementation of a set of Traffic Flow Management (TFM) initiatives such as ground delay programs, reroute advisories, flow metering, and ground stops. Data mining is the automated process of analyzing large sets of data and extracting patterns from them. Data mining tools are capable of predicting behaviors and future trends, allowing an organization to benefit from past experience in making knowledge-driven decisions. The work reported in this paper focuses on ground delay programs. Data mining algorithms have the potential to develop associations between weather patterns and the corresponding ground delay program responses. If successful, they can be used to improve and standardize TFM decisions, resulting in better predictability of traffic flows on days with reliable weather forecasts. The approach here seeks to develop a set of data mining and machine learning models and apply them to historical archives of weather observations and forecasts and TFM initiatives to determine the extent to which the theory can predict and explain the observed traffic flow behaviors.
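A minimal version of the idea, learning a threshold rule that associates a weather feature with ground delay program issuance, can be sketched on synthetic data; the features, the hidden pattern, and the single-split stump learner are all assumptions for illustration, not the paper's models:

```python
import random

random.seed(0)

# Synthetic training data: (forecast convective coverage %, arrival demand)
# -> whether a ground delay program (GDP) was issued. Purely illustrative.
data = []
for _ in range(200):
    cov = random.uniform(0, 100)
    demand = random.uniform(30, 90)
    gdp = cov > 40 and demand > 50   # hidden "true" pattern
    data.append((cov, demand, gdp))

def best_stump(records, feature):
    """One-feature decision stump: pick the threshold minimizing training errors."""
    best_t, best_errs = None, len(records)
    for t in range(0, 101, 5):
        errs = sum((x[feature] > t) != y for *x, y in records)
        if errs < best_errs:
            best_t, best_errs = t, errs
    return best_t, best_errs

t_cov, errs = best_stump(data, 0)
print(f"learned rule: issue GDP when coverage > {t_cov}% ({errs} training errors)")
```

A real analysis would use richer learners (decision trees, clustering) over historical weather forecasts and TFM logs, but the stump shows the pattern-extraction step in miniature.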

  12. Simulation-Based Optimization for Storage Allocation Problem of Outbound Containers in Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    Ning Zhao

    2015-01-01

Storage allocation of outbound containers is a key factor in the performance of the container handling system in automated container terminals. Improper storage plans of outbound containers make quay crane (QC) waiting inevitable; hence, the vessel handling time will be lengthened. A simulation-based optimization method is proposed in this paper for the storage allocation problem of outbound containers in automated container terminals (SAPOBA). A simulation model is built with a Timed Colored Petri Net (TCPN) and used to evaluate the QC waiting time of storage plans. Two optimization approaches, based on Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA), are proposed to form the complete simulation-based optimization method. The effectiveness of this method is verified by experiments comparing the two optimization approaches.
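A minimal PSO, with the TCPN-simulated QC waiting time replaced by a simple stand-in objective, might look as follows; the coefficients are common textbook defaults, not the paper's settings:

```python
import random

random.seed(1)

def pso(objective, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer minimizing `objective` over a box."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                       # personal best positions
    pbest = [objective(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]                # global best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia / cognitive / social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d] + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            f = objective(X[i])
            if f < pbest[i]:
                P[i], pbest[i] = X[i][:], f
                if f < gbest:
                    G, gbest = X[i][:], f
    return G, gbest

# Stand-in objective: in the paper this would be the simulated QC waiting
# time of a (suitably encoded) storage plan; here a simple sphere function.
best, val = pso(lambda x: sum(v * v for v in x), dim=3)
print(round(val, 6))
```

The simulation-based part of the method amounts to plugging the TCPN model's evaluated waiting time in as `objective`, which is why the optimizer and simulator can be developed independently.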

  13. Silicon carbide optics for space and ground based astronomical telescopes

    Science.gov (United States)

    Robichaud, Joseph; Sampath, Deepak; Wainer, Chris; Schwartz, Jay; Peton, Craig; Mix, Steve; Heller, Court

    2012-09-01

Silicon carbide (SiC) optical materials are being applied widely for both space-based and ground-based optical telescopes. The material provides a superior stiffness-to-weight ratio, an important metric for the design and fabrication of lightweight space telescopes. The material also has superior thermal properties, with a low coefficient of thermal expansion and a high thermal conductivity. These thermal advantages are important for both space-based and ground-based systems, which typically need to operate under stressing thermal conditions. The paper reviews L-3 Integrated Optical Systems - SSG's (L-3 SSG) work in developing SiC optics and SiC optical systems for astronomical observing systems. L-3 SSG has been fielding SiC optical components and systems for over 25 years. Space systems described will emphasize the recently launched Long Range Reconnaissance Imager (LORRI) developed for JHU-APL and NASA-GSFC. The review of ground-based applications of SiC includes L-3 IOS-Brashear's current contract to provide the 0.65 meter diameter, aspheric SiC secondary mirror for the Advanced Technology Solar Telescope (ATST).

  14. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
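The pooled-covariance Mahalanobis classification at the heart of the method can be sketched on synthetic two-feature data; the feature values and class means below are illustrative, not measured contrast dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-pixel feature vectors (e.g. contrast arrival time, enhancement
# slope) sampled for the artery and vein training ROIs; values are invented.
artery = rng.normal([2.0, 8.0], 0.5, size=(50, 2))
vein = rng.normal([6.0, 5.0], 0.5, size=(50, 2))

def pooled_cov(a, b):
    """Pooled sample covariance matrix across two classes."""
    na, nb = len(a), len(b)
    return ((na - 1) * np.cov(a.T) + (nb - 1) * np.cov(b.T)) / (na + nb - 2)

Sinv = np.linalg.inv(pooled_cov(artery, vein))
mu_a, mu_v = artery.mean(axis=0), vein.mean(axis=0)

def mahalanobis(x, mu):
    d = x - mu
    return float(np.sqrt(d @ Sinv @ d))

# Classify a pixel by its nearer class centre in Mahalanobis distance.
pixel = np.array([2.3, 7.6])
label = "artery" if mahalanobis(pixel, mu_a) < mahalanobis(pixel, mu_v) else "vein"
print(label)
```

Using the pooled covariance (rather than per-class covariances) is what makes the two distances directly comparable, which is the basis for the automated artery/vein split described in the abstract.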

  15. Automated facial acne assessment from smartphone images

    Science.gov (United States)

    Amini, Mohammad; Vasefi, Fartash; Valdebran, Manuel; Huang, Kevin; Zhang, Haomiao; Kemp, William; MacKinnon, Nicholas

    2018-02-01

A smartphone mobile medical application is presented that analyzes the health of facial skin using a smartphone image and cloud-based image processing techniques. The mobile application uses the camera to capture a front face image of a subject, after which the captured image is spatially calibrated based on fiducial points such as the position of the iris of the eye. A facial recognition algorithm is used to identify features of the human face image, to normalize the image, and to define facial regions of interest (ROI) for acne assessment. We identify acne lesions and classify them into two categories: papules and pustules. Automated facial acne assessment was validated by performing tests on images of 60 digital human models and 10 real human face images. The application was able to identify 92% of acne lesions within five facial ROIs. The classification accuracy for separating papules from pustules was 98%. Combined with in-app documentation of treatment, lifestyle factors, and automated facial acne assessment, the app can be used in both cosmetic and clinical dermatology. It allows users to quantitatively self-measure acne severity and treatment efficacy on an ongoing basis to help them manage their chronic facial acne.

  16. SU-E-J-132: Automated Segmentation with Post-Registration Atlas Selection Based On Mutual Information

    International Nuclear Information System (INIS)

    Ren, X; Gao, H; Sharp, G

    2015-01-01

Purpose: The delineation of targets and organs-at-risk is a critical step during image-guided radiation therapy, for which manual contouring is the gold standard. However, it is often time-consuming and may suffer from intra- and inter-rater variability. The purpose of this work is to investigate automated segmentation. Methods: The automatic segmentation here is based on mutual information (MI), with atlases from the Public Domain Database for Computational Anatomy (PDDCA) with manually drawn contours. Using the Dice coefficient (DC) as the quantitative measure of segmentation accuracy, we perform leave-one-out cross-validations for all PDDCA images sequentially, during which other images are registered to each chosen image and the DC is computed between the registered contour and the ground truth. Meanwhile, six strategies, including MI, are selected to measure image similarity, with MI found to be the best. Then, given a target image to be segmented and an atlas, automatic segmentation consists of: (a) an affine registration step for image positioning; (b) the active demons registration method to register the atlas to the target image; (c) the computation of MI values between the deformed atlas and the target image; (d) the weighted image fusion of the three deformed atlas images with the highest MI values to form the segmented contour. Results: MI was found to be the best among the six studied strategies in the sense that it had the highest positive correlation between the similarity measure (e.g., MI values) and the DC. For automated segmentation, the weighted image fusion of the three deformed atlas images with the highest MI values provided the highest DC among the four proposed strategies. Conclusion: MI has the highest correlation with DC, and is therefore an appropriate choice for post-registration atlas selection in atlas-based segmentation. Xuhua Ren and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
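The MI ranking in step (c) can be reproduced with a simple joint-histogram estimator; the images below are synthetic stand-ins for the registered atlases:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Histogram estimate of the mutual information between two images,
    the similarity measure used for post-registration atlas ranking."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()                      # joint probability
    px = p.sum(axis=1, keepdims=True)    # marginals
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
target = rng.random((64, 64))
atlas_good = target + 0.05 * rng.random((64, 64))   # well-registered atlas
atlas_poor = rng.random((64, 64))                   # unrelated atlas

mi_good = mutual_information(target, atlas_good)
mi_poor = mutual_information(target, atlas_poor)
print(round(mi_good, 2), round(mi_poor, 2))
```

The better-aligned atlas scores higher MI and would be among the three kept for the weighted fusion in step (d).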

  17. SU-E-J-132: Automated Segmentation with Post-Registration Atlas Selection Based On Mutual Information

    Energy Technology Data Exchange (ETDEWEB)

    Ren, X; Gao, H [Shanghai Jiao Tong University, Shanghai, Shanghai (China); Sharp, G [Massachusetts General Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: The delineation of targets and organs-at-risk is a critical step during image-guided radiation therapy, for which manual contouring is the gold standard. However, it is often time-consuming and may suffer from intra- and inter-rater variability. The purpose of this work is to investigate automated segmentation. Methods: The automatic segmentation here is based on mutual information (MI), with the atlas from the Public Domain Database for Computational Anatomy (PDDCA) with manually drawn contours. Using the Dice coefficient (DC) as the quantitative measure of segmentation accuracy, we performed leave-one-out cross-validations for all PDDCA images sequentially, during which the other images were registered to each chosen image and the DC was computed between the registered contour and the ground truth. Six strategies, including MI, were evaluated as image similarity measures, with MI found to be the best. Then, given a target image to be segmented and an atlas, automatic segmentation consists of: (a) an affine registration step for image positioning; (b) the active demons registration method to register the atlas to the target image; (c) the computation of MI values between the deformed atlas and the target image; (d) the weighted image fusion of the three deformed atlas images with the highest MI values to form the segmented contour. Results: MI was found to be the best among the six studied strategies in the sense that it had the highest positive correlation between the similarity measure (e.g., MI values) and DC. For automated segmentation, the weighted image fusion of the three deformed atlas images with the highest MI values provided the highest DC among four proposed strategies. Conclusion: MI has the highest correlation with DC and is therefore an appropriate choice for post-registration atlas selection in atlas-based segmentation. Xuhua Ren and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
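The two quantities this abstract relies on, the Dice coefficient and a histogram-based mutual information, together with the top-k atlas ranking step, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the bin count and the dictionary-based atlas interface are assumptions.

```python
import numpy as np

def dice_coefficient(a, b):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def mutual_information(x, y, bins=32):
    """Histogram-based mutual information between two images (in nats)."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def select_atlases(target, deformed_atlases, k=3):
    """Rank deformed atlases by MI with the target; keep the top k names."""
    scored = sorted(deformed_atlases.items(),
                    key=lambda kv: mutual_information(target, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]
```

In this sketch, `select_atlases` performs the post-registration atlas selection of step (d); the weighted fusion of the selected contours is omitted.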

  18. Sensor-based automated docking of large waste canisters

    International Nuclear Information System (INIS)

    Drotning, W.D.

    1990-01-01

    Sensor-based programmable robots have the potential to speed up remote manipulation operations while protecting operators from exposure to radiation. Conventional master/slave manipulators have proven to be very slow in performing precision remote operations. In addition, inadvertent collisions of remotely manipulated objects with their environment increase the hazards associated with remote handling. This paper describes the development of a robotic system for the sensor-based automated remote manipulation and precision docking of large payloads. Computer vision and proximity sensing are used to control the precision docking of a large object with a passive target cavity. Specifically, a container of nuclear spent fuel on a transport vehicle is mated with an emplacement door on a vertical storage borehole at a waste repository
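The vision-plus-proximity control described above can be caricatured as a loop that drives a lateral offset (from vision) and a standoff distance (from proximity sensing) toward zero. Gains, tolerances, and the sensor interface below are all invented for illustration; they are not from the paper.

```python
def docking_step(offset_m, standoff_m, k_lateral=0.5, k_approach=0.3):
    """Proportional commands: (lateral correction, approach velocity)."""
    return -k_lateral * offset_m, k_approach * standoff_m

def dock(read_sensors, send_command, tol=0.001):
    """Iterate until the payload is aligned and seated within tolerance.

    read_sensors() -> (lateral offset in m, standoff distance in m)
    send_command((lateral, velocity)) applies one correction step.
    """
    while True:
        offset, standoff = read_sensors()
        if abs(offset) < tol and standoff < tol:
            return True
        send_command(docking_step(offset, standoff))
```

With proportional gains below 1, both errors shrink geometrically each cycle, which is why a collision-free slow approach is possible even with noisy sensing.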

  19. Planning and Resource Management in an Intelligent Automated Power Management System

    Science.gov (United States)

    Morris, Robert A.

    1991-01-01

    Power system management is a process of guiding a power system towards the objective of continuous supply of electrical power to a set of loads. Spacecraft power system management requires planning and scheduling, since electrical power is a scarce resource in space. The automation of power system management for future spacecraft has been recognized as an important R&D goal. Several automation technologies have emerged, including the use of expert systems to automate human problem-solving capabilities, such as rule-based expert systems for fault diagnosis and load scheduling. It is questionable whether current-generation expert system technology is applicable to power system management in space. The objective of ADEPTS (ADvanced Electrical Power management Techniques for Space systems) is to study new techniques for power management automation. These techniques involve integrating current expert system technology with parallel and distributed computing, as well as a distributed, object-oriented approach to software design. The focus of the current study is the integration of new procedures for automatically planning and scheduling loads with procedures for performing fault diagnosis and control. The objective is the concurrent execution of both sets of tasks on separate transputer processors, thus adding parallelism to the overall management process.
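Load scheduling under a scarce power budget, as described above, can be sketched as a priority-ordered greedy allocation. The load names, priorities, and budget below are hypothetical; the actual ADEPTS scheduler is more elaborate.

```python
def schedule_loads(loads, budget_w):
    """Greedy priority scheduler: switch on the highest-priority loads
    that fit within the power budget, shedding the rest.

    loads: list of (name, power_w, priority) tuples; higher priority wins.
    Returns the set of load names switched on.
    """
    on, used = set(), 0.0
    for name, power, _prio in sorted(loads, key=lambda l: -l[2]):
        if used + power <= budget_w:
            on.add(name)
            used += power
    return on
```

A greedy pass like this runs in O(n log n) and is a common baseline before more sophisticated planning is layered on top.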

  20. Policy-based secure communication with automatic key management for industrial control and automation systems

    Science.gov (United States)

    Chernoguzov, Alexander; Markham, Thomas R.; Haridas, Harshal S.

    2016-11-22

    A method includes generating at least one access vector associated with a specified device in an industrial process control and automation system. The specified device has one of multiple device roles. The at least one access vector is generated based on one or more communication policies defining communications between one or more pairs of device roles in the industrial process control and automation system, where each pair of device roles includes the device role of the specified device. The method also includes providing the at least one access vector to at least one of the specified device and one or more other devices in the industrial process control and automation system in order to control communications to or from the specified device.
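The policy-to-access-vector derivation can be sketched as follows: collect every role pair involving the device's role and accumulate the permitted communications with each peer role. The role names, policy schema, and "protocols" field are invented for illustration and are not from the patent.

```python
# Hypothetical role-pair policy model: each policy permits certain
# communications between a pair of device roles.
POLICIES = [
    {"roles": ("controller", "sensor"), "protocols": ["read"]},
    {"roles": ("controller", "operator_station"), "protocols": ["read", "write"]},
    {"roles": ("historian", "controller"), "protocols": ["read"]},
]

def access_vector(device_role, policies):
    """Build an access vector: peer role -> set of permitted communications,
    derived from every policy whose role pair includes device_role."""
    vector = {}
    for p in policies:
        if device_role in p["roles"]:
            peer = p["roles"][1] if p["roles"][0] == device_role else p["roles"][0]
            vector.setdefault(peer, set()).update(p["protocols"])
    return vector
```

Distributing this vector to the device (and its peers) then lets each endpoint enforce the policy locally without consulting a central server on every message.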

  1. Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study

    Directory of Open Access Journals (Sweden)

    Cees Buisman

    2013-07-01

    Full Text Available Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare the energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement of four different sanitation concepts: (1) centralized; (2) centralized with source-separation of urine; (3) source-separation of black water, kitchen refuse and grey water; and (4) source-separation of urine, feces, kitchen refuse and grey water. The highest primary energy consumption of 914 MJ/capita (cap)/year was attained within the centralized sanitation concept, and the lowest primary energy consumption of 437 MJ/cap/year was attained within source-separation of urine, feces, kitchen refuse and grey water. Grey water bio-flocculation and subsequent grey water sludge co-digestion decreased the primary energy consumption, but coupling it with grey water effluent reuse was not energetically favorable. Source-separation of urine improved the energy balance, nutrient recovery and effluent quality, but required a larger land area and higher chemical use than in the centralized concept.

  2. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  3. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the most critical components from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  4. Automated spoof-detection for fingerprints using optical coherence tomography

    CSIR Research Space (South Africa)

    Darlow, LN

    2016-05-01

    Full Text Available that they are highly separable, resulting in 100% accuracy regarding spoof-detection, with no false rejections of real fingers. This is the first attempt at fully automated spoof-detection using OCT....

  5. Intelligent Case Based Decision Support System for Online Diagnosis of Automated Production System

    International Nuclear Information System (INIS)

    Ben Rabah, N; Saddem, R; Carre-Menetrier, V; Ben Hmida, F; Tagina, M

    2017-01-01

    Diagnosis of an Automated Production System (APS) is a decision-making process designed to detect, locate and identify a particular failure caused by the control law. In the literature, there are three major types of reasoning for industrial diagnosis: model-based, rule-based and case-based. The common and major limitation of the first two is that they lack automated learning ability. This paper presents an interactive and effective Case Based Decision Support System for online Diagnosis (CB-DSSD) of an APS. It offers a synergy between Case Based Reasoning (CBR) and the Decision Support System (DSS) in order to support and assist the Human Operator of Supervision (HOS) in his/her decision process. Indeed, the experimental evaluation performed on an Interactive Training System for PLC (ITS PLC), which allows the control of a Programmable Logic Controller (PLC), simulating sensor and/or actuator failures and validating the control algorithm through a real-time interactive experience, showed the efficiency of our approach. (paper)

  6. Lithography-based automation in the design of program defect masks

    Science.gov (United States)

    Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh

    2004-05-01

    In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect Masks (PDMs). Leading edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.

  7. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
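The distinction the abstract draws between a task-count-based rate and a cognitive-load-based rate can be illustrated with a toy calculation. The task names and per-task loads in bits below are hypothetical stand-ins for the paper's Conant-style information-theoretic load model; only the ratio structure is faithful to the idea.

```python
import math

def shannon_entropy(probs):
    """H = -sum p*log2(p): a simple information measure of task demand."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative cognitive task loads (bits) carried by the operator when
# everything is manual, and the subset taken over by cognitive automation.
manual_loads = {"detect": 2.0, "diagnose": 3.5, "decide": 1.5, "execute": 1.0}
automated_tasks = {"detect", "diagnose"}

# System automation rate: fraction of tasks removed from the operator.
system_rate = len(automated_tasks) / len(manual_loads)          # 2/4 = 0.5

# Cognitive automation rate: fraction of cognitive load removed.
total_bits = sum(manual_loads.values())                          # 8.0
removed_bits = sum(v for k, v in manual_loads.items() if k in automated_tasks)
cognitive_rate = removed_bits / total_bits                       # 5.5/8.0
```

Note how the two rates diverge (0.5 vs. 0.6875) even on the same task set, which is the paper's motivation for measuring the cognitive rate separately.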

  8. Regimes of data output from an automated scanning system into a computer

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Shaislamov, P.T.

    1984-01-01

    A method is described for accomplishment of rather a complex algorithm of various coordinate and service data transmission from different automated scanning system devices into a monitoring computer in the automated system for processing images from bubble chambers. The accepted data output algorithm and the developed appropriate equipment enable data transmission both in separate words and word arrays

  9. Automated and observer based light field indicator edge evaluation in diagnostic X-ray equipment

    OpenAIRE

    Bottaro, Márcio; Nagy, Balázs Vince; Soares, Fernanda Cristina Salvador; Rosendo, Danilo Cabral

    2017-01-01

    Abstract Introduction To analyze edge detection and optical contrast calculation of light field-indicators used in X-ray via automated- and observer-based methods, and comparison with current standard approaches, which do not give exact definition for light field edge determination. Methods Automated light sensor array was used to measure the penumbra zone of the edge in the standard X-ray equipment, while trained and naïve human observers were asked to mark the light field edge according t...

  10. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  11. Composite separators and redox flow batteries based on porous separators

    Science.gov (United States)

    Li, Bin; Wei, Xiaoliang; Luo, Qingtao; Nie, Zimin; Wang, Wei; Sprenkle, Vincent L.

    2016-01-12

    Composite separators having a porous structure and including acid-stable, hydrophilic, inorganic particles enmeshed in a substantially fully fluorinated polyolefin matrix can be utilized in a number of applications. The inorganic particles can provide hydrophilic characteristics. The pores of the separator result in good selectivity and electrical conductivity. The fluorinated polymeric backbone can result in high chemical stability. Accordingly, one application of the composite separators is in redox flow batteries as low cost membranes. In such applications, the composite separator can also enable additional property-enhancing features compared to ion-exchange membranes. For example, simple capacity control can be achieved through hydraulic pressure by balancing the volumes of electrolyte on each side of the separator. While a porous separator can also allow for volume and pressure regulation, in RFBs that utilize corrosive and/or oxidizing compounds, the composite separators described herein are preferable for their robustness in the presence of such compounds.

  12. Automated path length and M56 measurements at Jefferson Lab

    International Nuclear Information System (INIS)

    Hardy, D.; Tang, J.; Legg, R.

    1997-01-01

    Accurate measurement of path length and path length changes versus momentum (M56) are critical for maintaining minimum beam energy spread in the CEBAF (Continuous Electron Beam Accelerator Facility) accelerator at the Thomas Jefferson National Accelerator Facility (Jefferson Lab). The relative path length for each circuit of the beam (1256 m) must be equal within 1.5 degrees of 1497 MHz RF phase. A relative path length measurement is made by measuring the relative phases of RF signals from a cavity that is separately excited for each pass of a 4.2 μs pulsed beam. This method distinguishes the path length to less than 0.5 path length error. The development of a VME-based automated measurement system for path length and M56 has contributed to faster machine setup time and has the potential for use as a feedback parameter for automated control
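The phase tolerance quoted above translates directly into a length tolerance: at 1497 MHz one RF wavelength is about 20 cm, so 1.5 degrees of phase corresponds to under a millimeter of path length over the 1256 m circuit. A quick sketch of the conversion (illustrative arithmetic, not part of the measurement system):

```python
C = 299_792_458.0   # speed of light, m/s
F_RF = 1.497e9      # CEBAF RF frequency, Hz

def phase_to_length(phase_deg, f_hz=F_RF):
    """Convert an RF phase difference (degrees) into the equivalent
    path-length difference (meters): (phase/360) * wavelength."""
    wavelength = C / f_hz
    return (phase_deg / 360.0) * wavelength

tolerance_m = phase_to_length(1.5)   # roughly 0.8 mm
```

The tightness of this tolerance relative to the 1256 m circuit (parts in 10^6) is what makes an automated, phase-based measurement attractive over mechanical survey.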

  13. Model-Based Control for Postal Automation and Baggage Handling

    NARCIS (Netherlands)

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic wrapped mail items that have to be processed by post sorting centers has increased

  14. Human-Automation Allocations for Current Robotic Space Operations

    Science.gov (United States)

    Marquez, Jessica J.; Chang, Mai L.; Beard, Bettina L.; Kim, Yun Kyung; Karasinski, John A.

    2018-01-01

    Within the Human Research Program, one risk delineates the uncertainty surrounding crew working with automation and robotics in spaceflight. The Risk of Inadequate Design of Human and Automation/Robotic Integration (HARI) is concerned with the detrimental effects on crew performance due to ineffective user interfaces, system designs and/or functional task allocation, potentially compromising mission success and safety. Risk arises because we have limited experience with complex automation and robotics. One key gap within HARI is the gap related to functional allocation. The gap states: We need to evaluate, develop, and validate methods and guidelines for identifying human-automation/robot task information needs, function allocation, and team composition for future long duration, long distance space missions. Allocations determine human-system performance, as they identify the functions and performance levels required by the automation/robotic system and, in turn, what work the crew is expected to perform and the necessary human performance requirements. Allocations must take into account each of the human, automation, and robotic systems' capabilities and limitations. Some functions may be intuitively assigned to the human versus the robot, but to optimize efficiency and effectiveness, purposeful role assignments will be required. The role of automation and robotics will significantly change in future exploration missions, particularly as crew becomes more autonomous from ground controllers. Thus, we must understand the suitability of existing function allocation methods within NASA as well as the existing allocations established by the few robotic systems that are operational in spaceflight. In order to evaluate future methods of robotic allocations, we must first benchmark the allocations and allocation methods that have been used.
We will present 1) documentation of human-automation-robotic allocations in existing, operational spaceflight systems; and 2) To

  15. High energy astrophysics with ground-based gamma ray detectors

    International Nuclear Information System (INIS)

    Aharonian, F; Buckley, J; Kifune, T; Sinnis, G

    2008-01-01

    Recent advances in ground-based gamma ray astronomy have led to the discovery of more than 70 sources of very high energy (Eγ ≥ 100 GeV) gamma rays, falling into a number of source populations including pulsar wind nebulae, shell-type supernova remnants, Wolf-Rayet stars, giant molecular clouds, binary systems, the Galactic Center, active galactic nuclei and 'dark' (yet unidentified) galactic objects. We summarize the history of TeV gamma ray astronomy up to the current status of the field including a description of experimental techniques and highlight recent astrophysical results. We also discuss the potential of ground-based gamma ray astronomy for future discoveries and describe possible directions for future instrumental developments

  16. Influence of Cultural, Organizational, and Automation Capability on Human Automation Trust: A Case Study of Auto-GCAS Experimental Test Pilots

    Science.gov (United States)

    Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, suggesting the potential for appropriate trust development in operational pilots. Contributing factors include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  17. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.

  18. Improving patient safety via automated laboratory-based adverse event grading.

    Science.gov (United States)

    Niland, Joyce C; Stiller, Tracey; Neat, Jennifer; Londrc, Adina; Johnson, Dina; Pannoni, Susan

    2012-01-01

    The identification and grading of adverse events (AEs) during the conduct of clinical trials is a labor-intensive and error-prone process. This paper describes and evaluates a software tool developed by City of Hope to automate complex algorithms to assess laboratory results and identify and grade AEs. We compared AEs identified by the automated system with those previously assessed manually, to evaluate missed/misgraded AEs. We also conducted a prospective paired time assessment of automated versus manual AE assessment. We found a substantial improvement in accuracy/completeness with the automated grading tool, which identified an additional 17% of severe grade 3-4 AEs that had been missed/misgraded manually. The automated system also provided an average time saving of 5.5 min per treatment course. With 400 ongoing treatment trials at City of Hope and an average of 1800 laboratory results requiring assessment per study, the implications of these findings for patient safety are enormous.
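Algorithmically, laboratory-based AE grading amounts to checking each lab value against ordered severity thresholds. The sketch below uses a CTCAE-style table with a single illustrative test; the threshold values and test name are hypothetical and in practice must come from the protocol's actual grading criteria, not from this example.

```python
# Hypothetical grading table: for each lab test, (lower_bound, grade) pairs
# ordered from most to least severe. A value below a bound earns that grade.
GRADING = {
    "neutrophils_10e9_per_L": [(0.5, 4), (1.0, 3), (1.5, 2), (2.0, 1)],
}

def grade_lab_result(test, value, table=GRADING):
    """Return the AE grade (1-4) for a lab value, or 0 if within limits
    or if the test has no grading rule."""
    for lower, grade in table.get(test, []):
        if value < lower:
            return grade
    return 0
```

Encoding the thresholds once and applying them uniformly is what eliminates the missed and misgraded events the manual review produced; the study's 17% figure reflects exactly the severe grades such a rule table catches mechanically.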

  19. Live demonstration: Screen printed, microwave based level sensor for automated drug delivery

    KAUST Repository

    Karimi, Muhammad Akram

    2018-01-02

    Level sensors find numerous applications in many industries to automate processes involving chemicals. Recently, some commercial ultrasound-based level sensors have also been used to automate the drug delivery process [1]. Some of the most desirable features of level sensors for medical use are non-intrusiveness, low cost and consistent performance. In this demo, we will present a completely new method of sensing liquid level using microwaves. It is a common stereotype to consider microwave sensing mechanisms expensive. Unlike the usual expensive, intrusive and bulky microwave methods of level sensing using guided radars, we will present an extremely low-cost, printed, non-intrusive microwave sensor to reliably sense the liquid level.

  20. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  1. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    Science.gov (United States)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services along with a robust customizable web-based portal that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.
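The plug-and-play architecture GMSEC builds on is publish/subscribe middleware: components exchange subject-tagged messages over a bus rather than calling each other directly. The sketch below shows the generic pattern only; it is not the real GMSEC API, and the subject string is illustrative.

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-process publish/subscribe bus illustrating the
    middleware pattern that decouples producers from consumers."""

    def __init__(self):
        self._subs = defaultdict(list)   # subject -> list of callbacks

    def subscribe(self, subject, callback):
        """Register a callback for messages published on `subject`."""
        self._subs[subject].append(callback)

    def publish(self, subject, message):
        """Deliver `message` to every subscriber of `subject`."""
        for cb in self._subs[subject]:
            cb(message)
```

Because publishers never reference subscribers by name, a GUI display, an archiver, or a web portal can be attached or removed without touching the producing component, which is the "separation and consolidation" benefit the abstract describes.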

  2. Tunable separations based on a molecular size effect for biomolecules by poly(ethylene glycol) gel-based capillary electrophoresis.

    Science.gov (United States)

    Kubo, Takuya; Nishimura, Naoki; Furuta, Hayato; Kubota, Kei; Naito, Toyohiro; Otsuka, Koji

    2017-11-10

    We report novel capillary gel electrophoresis (CGE) with poly(ethylene glycol) (PEG)-based hydrogels for the effective separation of biomolecules, including sugars and DNAs, based on a molecular size effect. The gel capillaries were prepared in a fused silica capillary modified with 3-(trimethoxysilyl)propyl methacrylate using a variety of PEG-based hydrogels. After fundamental evaluations in CGE of the separation based on the molecular size effect as a function of crosslinking density, the optimized capillary provided efficient separation of a glucose ladder (G1 to G20). In addition, another capillary showed successful separation of a DNA ladder in the range of 10-1100 base pairs, which is superior to a conventional acrylamide-based gel capillary. For both glucose and DNA ladders, the separation range with respect to molecular size was simply controllable by altering the concentration and/or the number of ethylene oxide units in the PEG-based crosslinker. Finally, we demonstrated separations of real samples, including sugars cleaved from monoclonal antibodies (mAbs), and achieved efficient separations based on the molecular size effect. Copyright © 2017 Elsevier B.V. All rights reserved.
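    In ladder-based electrophoresis like the above, fragment size is commonly estimated by calibrating migration time against the known ladder sizes. A minimal sketch of that calibration idea, with entirely hypothetical migration times (the paper reports no such values), assuming log(size) is roughly linear in migration time over the sieving range:

```python
import numpy as np

# Hypothetical ladder calibration: base-pair sizes vs. migration times (min).
# These times are illustrative only, not measured values from the study.
ladder_bp = np.array([10, 50, 100, 300, 500, 1100])
ladder_t = np.array([4.0, 5.4, 6.0, 7.0, 7.4, 8.1])

# In gel sieving, log(size) is approximately linear in migration time over a
# limited range, so fit log10(bp) against time.
coeff = np.polyfit(ladder_t, np.log10(ladder_bp), deg=1)

def estimate_bp(t_min: float) -> float:
    """Estimate fragment size (bp) from migration time via the calibration."""
    return 10 ** np.polyval(coeff, t_min)

print(round(estimate_bp(6.0)))  # should land near the 100 bp calibrant
```

Changing the crosslinker concentration, as in the paper, would shift this calibration curve rather than the analysis procedure.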

  3. Ares I-X Ground Diagnostic Prototype

    Science.gov (United States)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed in the period before launch, including its accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.
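    IMS-style anomaly detection learns an envelope of nominal behavior from training data and scores new data by how far it falls outside that envelope. A minimal sketch of that idea (a simplified per-dimension envelope standing in for IMS's learned clusters, not the actual IMS code):

```python
import numpy as np

# Nominal training vectors (e.g., two-sensor snapshots); illustrative data.
rng = np.random.default_rng(0)
nominal = rng.normal(loc=[50.0, 1.0], scale=[2.0, 0.1], size=(500, 2))

# "Train": summarize nominal behavior as per-dimension min/max bounds,
# a crude stand-in for the clusters IMS induces from archived data.
lo, hi = nominal.min(axis=0), nominal.max(axis=0)

def anomaly_score(x: np.ndarray) -> float:
    """Distance outside the nominal envelope; 0.0 means within nominal bounds."""
    below = np.clip(lo - x, 0.0, None)
    above = np.clip(x - hi, 0.0, None)
    return float(np.linalg.norm(below + above))

print(anomaly_score(np.array([50.0, 1.0])))      # typical point -> 0.0
print(anomaly_score(np.array([80.0, 1.0])) > 0)  # off-nominal -> True
```

In the deployed prototype, such scores would feed the rule-based (SHINE) and model-based (TEAMS) layers rather than being used alone.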

  4. SCIENTIFIC EFFICIENCY OF GROUND-BASED TELESCOPES

    International Nuclear Information System (INIS)

    Abt, Helmut A.

    2012-01-01

    I scanned the six major astronomical journals of 2008 for all 1589 papers that are based on new data obtained from ground-based optical/IR telescopes worldwide. I then collected data on the numbers of papers, citations to them within 3+ years, the most-cited papers, and annual operating costs. These data are assigned to four groups by telescope aperture. For instance, papers from telescopes with apertures >7 m average 1.29 times as many citations as those from telescopes with apertures of 2 to 7 m. I wonder why the large telescopes do so relatively poorly and suggest possible reasons. I also found that papers based on archival data, such as the Sloan Digital Sky Survey, produce 10.6% as many papers and 20.6% as many citations as those based on new data. Also, the 577.2 papers based on radio data produced 36.3% as many papers and 33.6% as many citations as the 1589 papers based on optical/IR telescopes.
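    The comparison at the heart of this kind of bibliometric study reduces to citations-per-paper ratios across aperture groups. A toy sketch with hypothetical counts (not the paper's actual data), just to make the arithmetic concrete:

```python
# Hypothetical paper/citation counts per telescope aperture group,
# illustrative only; the study's real counts are not reproduced here.
groups = {
    ">7 m":  {"papers": 400, "citations": 6400},
    "2-7 m": {"papers": 700, "citations": 8400},
    "<2 m":  {"papers": 489, "citations": 4100},
}

# Citations per paper for each group.
rates = {g: d["citations"] / d["papers"] for g, d in groups.items()}
for g, r in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{g}: {r:.2f} citations/paper")

# Ratio of large- to mid-aperture citation rates (cf. the 1.29x in the text).
print(round(rates[">7 m"] / rates["2-7 m"], 2))  # 16.0 / 12.0 -> 1.33
```

Normalizing such rates by annual operating cost, as the paper does, is what makes the large telescopes look comparatively poor.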

  5. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the expected spectral information to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage of the asteroid that we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and demonstrate them here. The NIRS will cover a wavelength range from 0.85 to 2.1 micrometers, with a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine-to-pyroxene abundance ratio is critically dependent on the attributes of the 1.0 and 2.0 micrometer features. With a cut-off near 2.1 micrometers, the longer edge of the 2.0 micrometer feature will not be obtained by NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer wavelength spectral attributes. Similarly, the shorter wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should also be considered.

  6. Surveillance and Datalink Communication Performance Analysis for Distributed Separation Assurance System Architectures

    Science.gov (United States)

    Chung, William W.; Linse, Dennis J.; Alaverdi, Omeed; Ifarraguerri, Carlos; Seifert, Scott C.; Salvano, Dan; Calender, Dale

    2012-01-01

    This study investigates the effects of two technical enablers of the Federal Aviation Administration's Next Generation Air Transportation System (NextGen), Automatic Dependent Surveillance-Broadcast (ADS-B) and digital datalink communication, on overall separation assurance performance under two separation assurance (SA) system architectures: ground-based SA and airborne SA. Datalink performance, such as the successful reception probability of both surveillance and communication messages, and surveillance accuracy are examined under various operational conditions. Required SA performance is evaluated as a function of subsystem performance, using availability, continuity, and integrity metrics to establish overall required separation assurance performance under normal and off-nominal conditions.

  7. Automated Generation of OCL Constraints: NL based Approach vs Pattern Based Approach

    Directory of Open Access Journals (Sweden)

    IMRAN SARWAR BAJWA

    2017-04-01

    Full Text Available This paper presents an approach for the automated generation of software constraints. In this model, an SBVR (Semantics of Business Vocabulary and Rules)-based semi-formal representation is obtained from the syntactic and semantic analysis of an NL (Natural Language) sentence, such as an English sentence. An SBVR representation is easy to translate to other formal languages because SBVR is based on higher-order logic, like other formal languages such as OCL (Object Constraint Language). The proposed model provides a systematic and powerful means of incorporating NL knowledge into formal languages. A prototype was constructed in Java (an Eclipse plug-in) as a proof of concept. Performance was tested on a few sample texts taken from existing research theses and books.
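    The NL-to-constraint idea can be illustrated at its simplest with one hand-written pattern mapped onto an OCL-like invariant. This is a toy sketch, not the authors' SBVR pipeline (which uses full syntactic/semantic analysis), and the rule pattern and output syntax are assumptions for illustration:

```python
import re

# One simple English business-rule pattern -> OCL-like invariant (toy example).
RULE = re.compile(
    r"The (?P<attr>\w+) of an? (?P<ctx>\w+) must be (?P<op>at least|at most) (?P<val>\d+)"
)
OPS = {"at least": ">=", "at most": "<="}

def nl_to_ocl(sentence: str) -> str:
    """Translate one supported sentence pattern into an OCL-style invariant."""
    m = RULE.match(sentence)
    if not m:
        raise ValueError("unsupported sentence pattern")
    return (f"context {m['ctx'].capitalize()} "
            f"inv: self.{m['attr']} {OPS[m['op']]} {m['val']}")

print(nl_to_ocl("The age of a customer must be at least 18"))
# context Customer inv: self.age >= 18
```

The value of going through SBVR, as the paper argues, is precisely to avoid hand-writing such brittle patterns: the logic-based intermediate representation generalizes where regular expressions do not.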

  8. Performance Based Criteria for Ship Collision and Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2009-01-01

    The paper outlines a probabilistic procedure whereby the maritime industry can develop performance based rules to reduce the risk associated with human, environmental and economic costs of collision and grounding events and identify the most economic risk control options associated with prevention...

  9. Robotium automated testing for Android

    CERN Document Server

    Zadgaonkar, Hrushikesh

    2013-01-01

    This is a step-by-step, example-oriented tutorial aimed at illustrating the various test scenarios and automation capabilities of Robotium. If you are an Android developer learning how to create test cases for your applications and are looking for a good grounding in the different features of Robotium, this book is ideal for you. It is assumed that you have some experience in Android development and are familiar with the Android test framework, as Robotium is a wrapper around the Android test framework.

  10. The separation of vibrational coherence from ground- and excited-electronic states in P3HT film

    KAUST Repository

    Song, Yin

    2015-06-07

    © 2015 AIP Publishing LLC. Concurrence of the vibrational coherence and ultrafast electron transfer has been observed in polymer/fullerene blends. However, it is difficult to experimentally investigate the role that the excited-state vibrational coherence plays during the electron transfer process since vibrational coherence from the ground- and excited-electronic states is usually temporally and spectrally overlapped. Here, we performed 2-dimensional electronic spectroscopy (2D ES) measurements on poly(3-hexylthiophene) (P3HT) films. By Fourier transforming the whole 2D ES datasets S(λ1, T~2, λ3) along the population time (T~2) axis, we develop and propose a protocol capable of separating vibrational coherence from the ground- and excited-electronic states in 3D rephasing and nonrephasing beating maps S(λ1, ν~2, λ3). We found that the vibrational coherence from pure excited electronic states appears at positive frequency (+ν~2) in the rephasing beating map and at negative frequency (−ν~2) in the nonrephasing beating map. Furthermore, we also found that vibrational coherence from the excited electronic state had a long dephasing time of 244 fs. The long-lived excited-state vibrational coherence indicates that coherence may be involved in the electron transfer process. Our findings not only shed light on the mechanism of ultrafast electron transfer in organic photovoltaics but also are beneficial for the study of coherence effects on photoexcited dynamics in other systems.

  11. The separation of vibrational coherence from ground- and excited-electronic states in P3HT film

    International Nuclear Information System (INIS)

    Song, Yin; Hellmann, Christoph; Stingelin, Natalie; Scholes, Gregory D.

    2015-01-01

    Concurrence of the vibrational coherence and ultrafast electron transfer has been observed in polymer/fullerene blends. However, it is difficult to experimentally investigate the role that the excited-state vibrational coherence plays during the electron transfer process since vibrational coherence from the ground- and excited-electronic states is usually temporally and spectrally overlapped. Here, we performed 2-dimensional electronic spectroscopy (2D ES) measurements on poly(3-hexylthiophene) (P3HT) films. By Fourier transforming the whole 2D ES datasets S(λ1, T~2, λ3) along the population time (T~2) axis, we develop and propose a protocol capable of separating vibrational coherence from the ground- and excited-electronic states in 3D rephasing and nonrephasing beating maps S(λ1, ν~2, λ3). We found that the vibrational coherence from pure excited electronic states appears at positive frequency (+ν~2) in the rephasing beating map and at negative frequency (−ν~2) in the nonrephasing beating map. Furthermore, we also found that vibrational coherence from the excited electronic state had a long dephasing time of 244 fs. The long-lived excited-state vibrational coherence indicates that coherence may be involved in the electron transfer process. Our findings not only shed light on the mechanism of ultrafast electron transfer in organic photovoltaics but also are beneficial for the study of coherence effects on photoexcited dynamics in other systems.
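    The computational core of the protocol is a Fourier transform of the S(λ1, T2, λ3) stack along the population-time axis, after which coherences show up at signed beat frequencies ±ν2. A minimal numpy sketch on a synthetic dataset (array shapes, step size, and beat frequency are illustrative assumptions, not the experimental values):

```python
import numpy as np

# Synthetic 2D ES stack S(λ1, T2, λ3): one complex coherence beating at +ν.
n1, nT, n3 = 4, 256, 4
dt = 5e-15                # 5 fs population-time step (illustrative)
nu = 6.25e12              # beat frequency, chosen to sit exactly on an FFT bin
t2 = np.arange(nT) * dt
signal = np.exp(1j * 2 * np.pi * nu * t2)          # beating along T2
S = np.ones((n1, 1, n3)) * signal[None, :, None]   # same beat at every (λ1, λ3)

# Fourier transform along the T2 axis to obtain beating maps S(λ1, ν2, λ3).
beat = np.fft.fft(S, axis=1)
freqs = np.fft.fftfreq(nT, d=dt)

# For this phase convention the beat power peaks at +ν, not -ν; in the paper's
# protocol the sign of the peak frequency is what separates excited-state from
# ground-state coherence in the rephasing vs. nonrephasing maps.
peak_idx = int(np.argmax(np.abs(beat[0, :, 0])))
print(abs(freqs[peak_idx] - nu) < 1e9)  # True: peak at +ν
```

Real data would require windowing and subtraction of the slowly varying population kinetics before the transform; this sketch shows only the axis-wise FFT step.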

  12. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained…

  13. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues, and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Post-processing of the generator eluate and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on technological development is also considered.

  14. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    Science.gov (United States)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent work in automated radiotherapy treatment planning has used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach that predicts the dose for novel patients using a set of automatically selected most-similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose per voxel and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution into a complete treatment plan, with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria and 2.4% lower dose at the organ-at-risk criteria levels evaluated compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested; automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose at one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment planning.
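    Dose mimicking can be viewed as fitting machine parameters so that the delivered dose matches the predicted per-voxel dose. A toy least-squares sketch of that step, with a hypothetical beamlet dose-influence matrix and projected gradient descent (the actual system uses a clinical optimizer and a collapsed cone dose engine, not this):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dose-influence matrix: dose[voxel] = D @ w for beamlet weights w.
n_voxels, n_beamlets = 200, 30
D = rng.uniform(0.0, 1.0, size=(n_voxels, n_beamlets))

# A "predicted" per-voxel dose; generated here from known weights for testing.
w_true = rng.uniform(0.5, 2.0, size=n_beamlets)
d_pred = D @ w_true

# Dose mimicking as non-negative least squares via projected gradient descent:
# minimize ||D w - d_pred||^2 subject to w >= 0.
w = np.zeros(n_beamlets)
step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1/L for the quadratic objective
for _ in range(20000):
    w = np.clip(w - step * (D.T @ (D @ w - d_pred)), 0.0, None)

dev = np.max(np.abs(D @ w - d_pred))
print(dev < 1e-2)  # mimicked dose tracks the predicted dose voxel-by-voxel
```

In the real pipeline the target dose d_pred comes from the atlas-based prediction, and the optimization variables are deliverable IMRT/VMAT parameters rather than raw beamlet weights.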

  15. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    Science.gov (United States)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAV) have seen widespread civil application, including use for survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS data) for improved ground truth validation. For example, to obtain an accurate 3D and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on the Global Positioning System (GPS). However, GPS surveys are very time consuming, and especially for longer time series of monitoring data, repeated GPS surveys are necessary. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technologies, built from single nodes interlinked in a Wireless Sensor Network (WSN) star topology, for ground-based information collection and later integration with surveying data obtained by UAV. Nodes are designed to be visible from the air and to resist extreme weather conditions with low power consumption. In addition, nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of applications for environmental monitoring. For our purpose, the WSN transmits the environmental data over 3G/GPRS to a database at regular intervals. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.
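    The periodic node report described above amounts to bundling a GPS fix with the sensor readings and shipping it to the database. A hypothetical sketch of one such payload; the field names, values, and JSON schema are assumptions for illustration (the actual Waspmote firmware and backend format are not specified in the abstract):

```python
import json
import time

def build_report(node_id: str) -> str:
    """Assemble one GPS + sensor reading as a JSON payload that a node
    could transmit over 3G/GPRS to the database (illustrative schema)."""
    reading = {
        "node": node_id,
        "timestamp": int(time.time()),
        "gps": {"lat": 46.68, "lon": 10.62, "alt_m": 1500.0},  # illustrative fix
        "imu": {"roll": 0.5, "pitch": -0.2, "yaw": 181.0},
        "temperature_c": 12.4,
        "soil_moisture_pct": 23.0,
    }
    return json.dumps(reading)

payload = build_report("GCP-07")
print(json.loads(payload)["node"])  # GCP-07
```

A scheduler on the node would call this at the configured reporting interval and sleep in between to keep power consumption low.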

  16. Rapid filtration separation-based sample preparation method for Bacillus spores in powdery and environmental matrices.

    Science.gov (United States)

    Isabel, Sandra; Boissinot, Maurice; Charlebois, Isabelle; Fauvel, Chantal M; Shi, Lu-E; Lévesque, Julie-Christine; Paquin, Amélie T; Bastien, Martine; Stewart, Gale; Leblanc, Eric; Sato, Sachiko; Bergeron, Michel G

    2012-03-01

    Authorities frequently need to analyze suspicious powders and other samples for biothreat agents in order to assess environmental safety. Numerous nucleic acid detection technologies have been developed to detect and identify biowarfare agents in a timely fashion. The extraction of microbial nucleic acids from a wide variety of powdery and environmental samples at a quality level adequate for these technologies still remains a technical challenge. We aimed to develop a rapid and versatile method of separating bacteria from such samples and then extracting their microbial DNA. Bacillus atrophaeus subsp. globigii was used as a simulant of Bacillus anthracis. We studied the effects of a broad variety of powdery and environmental samples on PCR detection and the steps required to alleviate their interference. With a benchmark DNA extraction procedure, 17 of the 23 samples investigated interfered with bacterial lysis and/or PCR-based detection. Therefore, we developed the dual-filter method for applied recovery of microbial particles from environmental and powdery samples (DARE). The DARE procedure allows the separation of bacteria from contaminating matrices that interfere with PCR detection. This procedure required only 2 min, while the DNA extraction process lasted 7 min, for a total of 9 min. The sample preparation procedure allowed the recovery of cleaned bacterial spores and relieved detection interference caused by a wide variety of samples. Our procedure was easily completed in a laboratory facility and is amenable to field application and automation.

  17. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However...... (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges....

  18. An Automated Approach to Syntax-based Analysis of Classical Latin

    Directory of Open Access Journals (Sweden)

    Anjalie Field

    2016-12-01

    Full Text Available The goal of this study is to present an automated method for analyzing the style of Latin authors. Many of the common automated methods in stylistic analysis are based on lexical measures, which do not work well with Latin because of the language’s high degree of inflection and free word order. In contrast, this study focuses on analysis at a syntax level by examining two constructions, the ablative absolute and the cum clause. These constructions are often interchangeable, which suggests an author’s choice of construction is typically more stylistic than functional. We first identified these constructions in hand-annotated texts. Next we developed a method for identifying the constructions in unannotated texts, using probabilistic morphological tagging. Our methods identified constructions with enough accuracy to distinguish among different genres and different authors. In particular, we were able to determine which book of Caesar’s Commentarii de Bello Gallico was not written by Caesar. Furthermore, the usage of ablative absolutes and cum clauses observed in this study is consistent with the usage scholars have observed when analyzing these texts by hand. The proposed methods for an automatic syntax-based analysis are shown to be valuable for the study of classical literature.
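    Once tokens carry morphological tags, a construction like the ablative absolute can be flagged with simple pattern rules over the tag sequence. A toy sketch of that idea on a hand-tagged fragment; the tag set and single rule are deliberately crude stand-ins for the study's probabilistic morphological tagging, not its actual method:

```python
# Toy morphologically tagged sentence: (surface form, case, part of speech).
# "His rebus gestis, Caesar ..." - a classic ablative absolute opening.
tagged = [
    ("His",    "abl", "DET"),
    ("rebus",  "abl", "NOUN"),
    ("gestis", "abl", "PART"),   # perfect participle in the ablative
    ("Caesar", "nom", "NOUN"),
]

def find_ablative_absolutes(tokens):
    """Flag an ablative noun directly followed by an ablative participle -
    a crude pattern rule for illustration only."""
    hits = []
    for i in range(len(tokens) - 1):
        _w1, c1, p1 = tokens[i]
        _w2, c2, p2 = tokens[i + 1]
        if c1 == "abl" and p1 == "NOUN" and c2 == "abl" and p2 == "PART":
            hits.append((tokens[i][0], tokens[i + 1][0]))
    return hits

print(find_ablative_absolutes(tagged))  # [('rebus', 'gestis')]
```

Counting such hits per author or per book, and comparing them with cum-clause counts, is the kind of frequency evidence the study uses to distinguish genres and authors.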

  19. Highly Automated Arrival Management and Control System Suitable for Early NextGen

    Science.gov (United States)

    Swenson, Harry N.; Jung, Jaewoo

    2013-01-01

    This is a presentation of previously published work conducted in the development of the Terminal Area Precision Scheduling and Spacing (TAPSS) system. Included are concept and technical descriptions of the TAPSS system and results from human-in-the-loop simulations conducted at Ames Research Center. The Terminal Area Precision Scheduling and Spacing system has been demonstrated, through research and extensive high-fidelity simulation studies, to provide benefits in airport arrival throughput, supporting efficient arrival descents, and enabling mixed aircraft navigation capability operations during periods of high congestion. NASA is currently porting the TAPSS system into the FAA TBFM and STARS system prototypes to ensure its ability to operate in the FAA automation infrastructure. The NASA ATM Demonstration Project is using the TAPSS technologies to provide the ground-based automation tools to enable airborne Interval Management (IM) capabilities. NASA and the FAA have initiated a Research Transition Team to enable potential TAPSS and IM technology transfer.

  20. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
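    The geometric core of combining imaging with a wind profile is that, for a cloud near zenith, the feature's transverse speed equals its angular velocity times its height, so h ≈ v / ω once the image drift is matched to the wind speed at the cloud's level. A toy sketch with hypothetical numbers (camera scale, drift, and wind speed are illustrative, not the paper's values):

```python
import math

# Hypothetical measurements for a cloud drifting overhead (illustrative only).
wind_speed = 10.0     # m/s at the matched sounding level
pixel_scale = 0.05    # degrees of sky per pixel for the thermal camera
drift_pixels = 30.0   # tracked feature displacement between frames
frame_dt = 10.0       # seconds between frames

# Angular velocity of the tracked cloud feature (rad/s).
omega = math.radians(drift_pixels * pixel_scale) / frame_dt

# Near zenith, transverse speed = omega * height, so:
height = wind_speed / omega
print(f"{height:.0f} m")  # ~3820 m cloud base
```

In practice the method must search the sounded profile for the level whose wind direction and speed best match the observed drift, which is what ties the retrieval to a specific altitude.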

  1. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging.

    Science.gov (United States)

    Patel, Tapan P; Man, Karen; Firestein, Bonnie L; Meaney, David F

    2015-03-30

    Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s to 1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high speed fluorescence imaging data is lacking. Here we introduce FluoroSNNAP, the Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software package developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule-associated protein tau and wild-type tau. We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and a significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. Copyright © 2015. Published by Elsevier B.V.
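    Template-based transient detection of the kind favored above correlates the fluorescence trace with a canonical transient shape and thresholds the correlation. A minimal sketch on synthetic data; the kernel shape, time constants, and threshold are assumptions for illustration, not FluoroSNNAP's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 10.0                                  # imaging rate, Hz (illustrative)
t = np.arange(0, 60, 1 / fs)

# Synthetic ΔF/F trace: baseline noise plus one calcium transient at t = 20 s.
trace = rng.normal(0.0, 0.02, t.size)
onset = int(20 * fs)
kernel_t = np.arange(0, 3, 1 / fs)
transient = 0.5 * (1 - np.exp(-kernel_t / 0.2)) * np.exp(-kernel_t / 1.0)
trace[onset:onset + kernel_t.size] += transient

# Template: the same fast-rise, slow-decay kernel, standardized for correlation.
template = (transient - transient.mean()) / transient.std()

def detect(trace, template, thresh=0.85):
    """Return sample indices where the local trace correlates with the template."""
    n = template.size
    hits = []
    for i in range(trace.size - n):
        r = np.corrcoef(trace[i:i + n], template)[0, 1]
        if r > thresh:
            hits.append(i)
    return hits

hits = detect(trace, template)
print(any(abs(h - onset) < 5 for h in hits))  # True: transient found near onset
```

Because the correlation is amplitude-invariant, this approach tolerates varying transient sizes better than a simple peak threshold, which is the advantage the paper reports over peak-based detection.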

  2. Organic electronic materials: Recent advances in the dft description of the ground and excited states using tuned range-separated hybrid functionals

    KAUST Repository

    Körzdörfer, Thomas

    2014-11-18

    Density functional theory (DFT) and its time-dependent extension (TD-DFT) are powerful tools enabling the theoretical prediction of the ground- and excited-state properties of organic electronic materials with reasonable accuracy at affordable computational costs. Due to their excellent accuracy-to-numerical-cost ratio, semilocal and global hybrid functionals such as B3LYP have become the workhorse for geometry optimizations and the prediction of vibrational spectra in modern theoretical organic chemistry. Despite the overwhelming success of these out-of-the-box functionals for such applications, the computational treatment of electronic and structural properties that are of particular interest in organic electronic materials sometimes reveals severe and qualitative failures of such functionals. Important examples include the overestimation of conjugation, torsional barriers, and electronic coupling as well as the underestimation of bond-length alternations or excited-state energies in low-band-gap polymers. In this Account, we highlight how these failures can be traced back to the delocalization error inherent to semilocal and global hybrid functionals, which leads to the spurious delocalization of electron densities and an overestimation of conjugation. The delocalization error for systems and functionals of interest can be quantified by allowing for fractional occupation of the highest occupied molecular orbital. It can be minimized by using long-range corrected hybrid functionals and a nonempirical tuning procedure for the range-separation parameter. We then review the benefits and drawbacks of using tuned long-range corrected hybrid functionals for the description of the ground and excited states of π-conjugated systems.
In particular, we show that this approach provides for robust and efficient means of characterizing the electronic couplings in organic mixed-valence systems, for the calculation of accurate torsional barriers at the polymer limit, and for the

  3. Web-Altairis: An Internet-Enabled Ground System

    Science.gov (United States)

    Miller, Phil; Coleman, Jason; Gemoets, Darren; Hughes, Kevin

    2000-01-01

    This paper describes Web-Altairis, an Internet-enabled ground system software package funded by the Advanced Automation and Architectures Branch (Code 588) of NASA's Goddard Space Flight Center. Web-Altairis supports the trend towards "lights out" ground systems, where the control center is unattended and problems are resolved by remote operators. This client/server software runs on most popular platforms and provides for remote data visualization using the rich functionality of the VisAGE toolkit. Web-Altairis also supports satellite commanding over the Internet. This paper describes the structure of Web-Altairis and VisAGE, the underlying technologies, the provisions for security, and our experiences in developing and testing the software.

  4. Intelligent systems for KSC ground processing

    Science.gov (United States)

    Heard, Astrid E.

    1992-01-01

    The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost-effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools, and launch team assistants. The deployed AI applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

  5. A LabVIEW®-based software for the control of the AUTORAD platform. A fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis

    International Nuclear Information System (INIS)

    Barbesi, Donato; Vilas, Victor Vicente; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Heras, Laura Aldave de las

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors providing a flexible and fit for purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW®VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste. (author)

  6. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW ® -based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino ® -based device triggering multiple detectors providing a flexible and fit for purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW ® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.

  7. Canvas supports and grounds in paintings by C.W. Eckersberg

    DEFF Research Database (Denmark)

    Filtenborg, Troels; Andersen, Cecil Krarup

    2017-01-01

    The supports and grounds in 43 paintings on canvas by C.W. Eckersberg, dating from throughout his career, were investigated by visual examination, X-radiography, computer-assisted automated thread counting and weave mapping, as well as by cross-section analysis. The analytical data were complemen...

  8. Proceedings. Fourth international symposium on mine mechanisation and automation

    Energy Technology Data Exchange (ETDEWEB)

    Gurgenci, H.; Hood, M. [eds.]

    1997-12-31

    Papers in the first volume are presented under the following session headings: drilling; mining robotics; machine monitoring; mine automation systems; reliability and maintenance; mine automation - communications; mechanical excavation of medium-strength rock; and new mining equipment technologies. The second volume covers: mechanical excavation of hard rock; autonomous vehicles; mechanical excavation industry experience; machine guidance; applications of rock mechanics; mine planning, management and scheduling; orebody delineation; and safety. Selected papers have been abstracted separately for the IEA Coal Research databases available on CD-ROM and the worldwide web.

  9. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions

    Science.gov (United States)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.

    2008-12-01

    Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and supplying model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automated ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected one-hour-interval data continuously for more than one year at some, but not all, sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing, along with digital image processing techniques. Time-lapse images over these periods in west Greenland capture various phenomena. Problems include rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other challenges include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. A further obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images needs to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability.
We experiment with mono- and stereo-photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous
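
The automatic correlation matching described above can be illustrated with a brute-force normalized cross-correlation search for a surface feature between two frames. A minimal sketch on synthetic imagery (the patch locations and the imposed shift are invented for the example):

```python
import numpy as np

def ncc_match(template, search):
    """Return the (dy, dx) offset of `template` inside the larger `search`
    window, found by maximizing the normalized cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    best, best_off = -2.0, (0, 0)
    for y in range(search.shape[0] - th + 1):
        for x in range(search.shape[1] - tw + 1):
            w = search[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = tnorm * np.sqrt((wz * wz).sum())
            if denom == 0:
                continue
            score = (t * wz).sum() / denom
            if score > best:
                best, best_off = score, (y, x)
    return best_off, best

# Synthetic glacier-surface texture shifted by a known displacement.
rng = np.random.default_rng(0)
frame1 = rng.random((60, 60))
frame2 = np.roll(np.roll(frame1, 3, axis=0), 5, axis=1)   # true shift (3, 5)

template = frame1[20:36, 20:36]          # feature patch in frame 1
search = frame2[15:51, 15:51]            # search window in frame 2
(y, x), score = ncc_match(template, search)
dy, dx = (15 + y) - 20, (15 + x) - 20    # displacement in image coordinates
print(dy, dx)
```

Production pipelines would use FFT-based correlation and subpixel peak fitting instead of this O(n^4) loop, but the matching principle is the same.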

  10. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  11. Alert management for home healthcare based on home automation analysis.

    Science.gov (United States)

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be controlled by offering people autonomy at home by means of information technology. In this paper, we present an original, sensorless alert management solution which discriminates between multimedia and home automation services and extracts highly regular home activities to serve as virtual sensors for alert management. Results on simulated data, based on a real context, allow us to evaluate our approach before applying it to real data.

  12. Automated and model-based assembly of an anamorphic telescope

    Science.gov (United States)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    Since the first use of optical glasses there has been an increasing demand for optical systems which are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray-tracing is presented. This process runs autonomously and covers a wide range of functionality. It first identifies the sequence for an optimized assembly and then generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
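
The first and second moments of intensity used as key performance indicators above are straightforward to compute from a camera image; a minimal sketch on a synthetic beam spot (the image size, spot position, and width are arbitrary):

```python
import numpy as np

def intensity_moments(img):
    """First moment (centroid) and second central moment (RMS width)
    of a 2-D intensity distribution, per axis."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    yy, xx = np.indices(img.shape)
    cy, cx = (img * yy).sum() / total, (img * xx).sum() / total
    wy = np.sqrt((img * (yy - cy) ** 2).sum() / total)
    wx = np.sqrt((img * (xx - cx) ** 2).sum() / total)
    return (cy, cx), (wy, wx)

# Synthetic beam spot: a Gaussian centered at (30, 45) with sigma = 4 px.
yy, xx = np.indices((64, 96))
spot = np.exp(-(((yy - 30) ** 2) + ((xx - 45) ** 2)) / (2 * 4.0 ** 2))
(cy, cx), (wy, wx) = intensity_moments(spot)
print(round(cy, 1), round(cx, 1), round(wy, 1), round(wx, 1))
```

Comparing these moments between the measured image and the ray-traced model is one simple way such a digital twin can be kept in sync with reality.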

  13. Improving Agricultural Water Resources Management Using Ground-based Infrared Thermometry

    Science.gov (United States)

    Taghvaeian, S.

    2014-12-01

    Irrigated agriculture is the largest user of freshwater resources in arid/semi-arid parts of the world. Meeting rapidly growing demands in food, feed, fiber, and fuel while minimizing environmental pollution under a changing climate requires significant improvements in agricultural water management and irrigation scheduling. Although recent advances in remote sensing techniques and hydrological modeling have provided valuable information on agricultural water resources and their management, real improvements will only occur if farmers, the decision makers on the ground, are provided with simple, affordable, and practical tools to schedule irrigation events. This presentation reviews efforts in developing methods based on ground-based infrared thermometry and thermography for day-to-day management of irrigation systems. The results of research studies conducted in Colorado and Oklahoma show that ground-based remote sensing methods can be used effectively to quantify water stress and consequently trigger irrigation events. Crop water use estimates based on stress indices have also been shown to be in good agreement with estimates based on other methods (e.g. surface energy balance, root zone soil water balance, etc.). Major challenges to the adoption of this approach by agricultural producers include reduced accuracy under cloudy and humid conditions and its inability to forecast the irrigation date, which is critical knowledge since many irrigators need to decide about irrigation a few days in advance.
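
One widely used way of turning canopy temperature into an irrigation trigger, consistent with the stress-index approach described, is the empirical Crop Water Stress Index (CWSI). A sketch with placeholder baseline coefficients (real coefficients are crop- and climate-specific, and the trigger threshold is hypothetical):

```python
def cwsi(t_canopy, t_air, vpd, lower_intercept=2.0, lower_slope=-2.0, upper=5.0):
    """Empirical Crop Water Stress Index (Idso-style linear baselines).

    The lower (well-watered) baseline gives Tc - Ta as a linear function of
    vapor pressure deficit (kPa); `upper` is the non-transpiring limit.
    All coefficients here are placeholders - real values are crop-specific.
    """
    dt = t_canopy - t_air
    dt_lower = lower_intercept + lower_slope * vpd
    return (dt - dt_lower) / (upper - dt_lower)

# Example: canopy 2 C warmer than the air at 2 kPa VPD.
index = cwsi(t_canopy=30.0, t_air=28.0, vpd=2.0)
print(f"CWSI = {index:.2f}")
if index > 0.4:                          # hypothetical trigger threshold
    print("stress above threshold: schedule irrigation")
```

CWSI of 0 corresponds to a fully transpiring canopy and 1 to a non-transpiring one, which is what makes it usable as a simple on/off scheduling signal.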

  14. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  15. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its applications in synthesis, viz. autosampling and inline monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest possible control strategies to be implemented so that laboratory-scale synthesis strategies can be reliably transferred to pilot scale at optimum conditions. Due to the vast literature on multistep synthesis, we have classified the literature and identified case studies based on a few criteria, viz. type of reaction, heating methods, processes involving in-line separation units, telescoped synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification covers the broad range of the multistep synthesis literature.

  16. Completely automated modal analysis procedure based on the combination of different OMA methods

    Science.gov (United States)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been made more robust, returning only the real natural frequencies, damping ratios, and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building in the laboratories of Politecnico di Milano.
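
The merging of several OMA methods with a statistical filter can be sketched as a simple consistency vote: frequencies identified by multiple independent methods within a tolerance are kept, isolated (likely spurious) poles are discarded. This is a simplification of the paper's procedure, with invented frequencies:

```python
import numpy as np

def merge_modes(estimates, tol=0.02, min_votes=2):
    """Cluster natural-frequency estimates (Hz) from several OMA methods and
    keep clusters supported by at least `min_votes` distinct methods.
    `estimates` is a list of per-method frequency lists; returns the mean
    frequency and spread of each retained cluster."""
    tagged = sorted((f, m) for m, freqs in enumerate(estimates) for f in freqs)
    clusters, cluster = [], [tagged[0]]
    for f, m in tagged[1:]:
        mean = np.mean([c[0] for c in cluster])
        if abs(f - mean) / mean <= tol:       # relative tolerance vs. cluster mean
            cluster.append((f, m))
        else:
            clusters.append(cluster)
            cluster = [(f, m)]
    clusters.append(cluster)
    out = []
    for c in clusters:
        if len({m for _, m in c}) >= min_votes:
            freqs = [f for f, _ in c]
            out.append((float(np.mean(freqs)), float(np.std(freqs))))
    return out

# Three methods (e.g. SSI, FDD, poly-reference LSCF) on a 3-story frame;
# the 18.9 Hz pole appears in only one method and is rejected.
ssi  = [2.51, 7.83, 12.40, 18.9]
fdd  = [2.49, 7.80, 12.50]
lscf = [2.50, 7.86, 12.45]
modes = merge_modes([ssi, fdd, lscf])
print([round(f, 2) for f, _ in modes])
```

A full implementation would vote on (frequency, damping, mode shape) triples and use modal assurance criteria rather than frequency alone, but the voting idea is the same.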

  17. 31 CFR 205.17 - Are funds transfers delayed by automated payment systems restrictions based on the size and...

    Science.gov (United States)

    2010-07-01

    ... automated payment systems restrictions based on the size and timing of the drawdown request subject to this... Treasury-State Agreement § 205.17 Are funds transfers delayed by automated payment systems restrictions... to payment processes that automatically reject drawdown requests that fall outside a pre-determined...

  18. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  19. Space weather effects on ground based technology

    Science.gov (United States)

    Clark, T.

    Space weather can affect a variety of forms of ground-based technology, usually as a result of either the direct effects of the varying geomagnetic field, or as a result of the induced electric field that accompanies such variations. Technologies affected directly by geomagnetic variations include magnetic measurements made during geophysical surveys, and navigation relying on the geomagnetic field as a direction reference, a method that is particularly common in the surveying of well-bores in the oil industry. The most obvious technology affected by induced electric fields during magnetic storms is electric power transmission, where the example of the blackout in Quebec during the March 1989 magnetic storm is widely known. Additionally, space weather effects must be taken into account in the design of active cathodic protection systems on pipelines to protect them against corrosion. Long-distance telecommunication cables may also have to be designed to cope with space weather related effects. This paper reviews the effects of space weather in these different areas of ground-based technology, and provides examples of how mitigation against hazards may be achieved. (The paper does not include the effects of space weather on radio communication or satellite navigation systems).

  20. Intelligent automation of high-performance liquid chromatography method development by means of a real-time knowledge-based approach.

    Science.gov (United States)

    I, Ting-Po; Smith, Randy; Guhan, Sam; Taksen, Ken; Vavra, Mark; Myers, Douglas; Hearn, Milton T W

    2002-09-27

    We describe the development, attributes and capabilities of a novel type of artificial intelligence system, called LabExpert, for automation of HPLC method development. Unlike other computerised method development systems, LabExpert operates in real-time, using an artificial intelligence system and design engine to provide experimental decision outcomes relevant to the optimisation of complex separations as well as the control of the instrumentation, column selection, mobile phase choice and other experimental parameters. LabExpert manages every input parameter to a HPLC data station and evaluates each output parameter of the HPLC data station in real-time as part of its decision process. Based on a combination of inherent and user-defined evaluation criteria, the artificial intelligence system programs use a reasoning process, applying chromatographic principles and acquired experimental observations to iteratively provide a regime for a priori development of an acceptable HPLC separation method. Because remote monitoring and control are also functions of LabExpert, the system allows full-time utilisation of analytical instrumentation and associated laboratory resources. Based on our experience with LabExpert with a wide range of analyte mixtures, this artificial intelligence system consistently identified in a similar or faster time-frame preferred sets of analytical conditions that are equal in resolution, efficiency and throughput to those empirically determined by highly experienced chromatographic scientists. An illustrative example, demonstrating the potential of LabExpert in the process of method development of drug substances, is provided.

  1. Operational experiences with automated acoustic burst classification by neural networks

    International Nuclear Information System (INIS)

    Olma, B.; Ding, Y.; Enders, R.

    1996-01-01

    Monitoring of Loose Parts Monitoring System sensors for signal bursts associated with metallic impacts of loose parts has proven a useful methodology for on-line assessment of the mechanical integrity of components in the primary circuit of nuclear power plants. With the availability of neural networks, powerful new possibilities for classification and diagnosis of burst signals can be realized for acoustic monitoring with the on-line system RAMSES. In order to look for relevant burst signals, automated classification is needed; that is, acoustic signature analysis and assessment have to be performed automatically on-line. A back-propagation neural network based on five pre-calculated signal parameter values has been set up for identification of different signal types. During a three-month monitoring program of medium-operated check valves, burst signals were measured and classified separately according to their cause. The successful results of the three measurement campaigns with automated burst type classification are presented. (author)
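
A back-propagation network over five pre-calculated signal parameters, as described above, can be sketched with a minimal two-layer implementation trained on synthetic burst features (the parameter names, class statistics, and network sizes are invented for this sketch, not taken from RAMSES):

```python
import numpy as np

rng = np.random.default_rng(1)

# Five pre-calculated burst parameters per event (e.g. rise time, duration,
# peak amplitude, energy, dominant frequency - names and statistics invented).
# Class 0 = background noise, class 1 = metallic impact.
n = 200
background = rng.normal([0.2, 0.8, 0.1, 0.1, 0.3], 0.1, size=(n, 5))
impacts    = rng.normal([0.8, 0.2, 0.9, 0.7, 0.6], 0.1, size=(n, 5))
X = np.vstack([background, impacts])
t = np.concatenate([np.zeros(n), np.ones(n)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single hidden layer trained by plain back-propagation (full-batch GD).
W1 = rng.normal(0.0, 0.5, size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 1.0
for _ in range(500):
    h = sigmoid(X @ W1 + b1)                  # hidden activations
    y = sigmoid(h @ W2 + b2).ravel()          # impact probability
    g2 = (y - t)[:, None] / len(t)            # dLoss/dlogit (cross-entropy)
    g1 = (g2 @ W2.T) * h * (1.0 - h)          # back-propagated to layer 1
    W2 -= lr * h.T @ g2;  b2 -= lr * g2.sum()
    W1 -= lr * X.T @ g1;  b1 -= lr * g1.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel() > 0.5
print("training accuracy:", (pred == (t > 0.5)).mean())
```

With well-separated burst parameters, as in this toy data, even a small network of this kind separates the two signal types reliably.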

  2. Projective synchronization based on suitable separation

    International Nuclear Information System (INIS)

    Li Guohui; Xiong Chuan; Sun Xiaonan

    2007-01-01

    A new approach for constructing a projective-synchronized chaotic slave system is proposed in this paper. The method is based on a suitable separation, decomposing the system into a linear part and a nonlinear part. Using matrix measure theory, some simple but efficient criteria are derived for projective synchronization of chaotic systems. Numerical simulations for the Lorenz system show that this control method works very well.
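
The separation idea — decompose the dynamics as x' = Ax + g(x) into a linear part A and a nonlinear part g, then drive the slave with the master's nonlinearity plus a stabilizing feedback — can be sketched for the Lorenz system. Note this sketch uses a simple gain K making the error dynamics e' = (A − K)e contractive, not the paper's matrix-measure criterion:

```python
import numpy as np

# Lorenz system split as x' = A x + g(x): linear part A, nonlinear part g.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
A = np.array([[-sigma, sigma,  0.0],
              [rho,    -1.0,   0.0],
              [0.0,     0.0, -beta]])

def g(x):
    return np.array([0.0, -x[0] * x[2], x[0] * x[1]])

alpha = 2.0                    # projective factor: slave converges to alpha*x
c = 5.0
K = A + c * np.eye(3)          # makes A - K = -c*I, so e' = -c*e decays

# Slave: y' = A y + alpha*g(x) + K (alpha*x - y); with e = y - alpha*x the
# nonlinear terms cancel exactly and e' = (A - K) e.
dt, steps = 0.001, 20000
x = np.array([1.0, 1.0, 1.0])      # master state
y = np.array([-4.0, 3.0, 9.0])     # slave state, arbitrary start
for _ in range(steps):             # simple explicit Euler, small dt
    dx = A @ x + g(x)
    dy = A @ y + alpha * g(x) + K @ (alpha * x - y)
    x = x + dt * dx
    y = y + dt * dy

err = np.max(np.abs(y - alpha * x))
print(f"final projective-sync error: {err:.2e}")
```

Because the slave reuses g evaluated at the master state, the error obeys a purely linear stable equation and the slave trajectory locks onto alpha times the chaotic master orbit.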

  3. Automated Attitude Sensor Calibration: Progress and Plans

    Science.gov (United States)

    Sedlak, Joseph; Hashmall, Joseph

    2004-01-01

    This paper describes ongoing work at NASA/Goddard Space Flight Center to improve the quality of spacecraft attitude sensor calibration and reduce costs by automating parts of the calibration process. The new calibration software can autonomously preview data quality over a given time span, select a subset of the data for processing, perform the requested calibration, and output a report. This level of automation is currently being implemented for two specific applications: inertial reference unit (IRU) calibration and sensor alignment calibration. The IRU calibration utility makes use of a sequential version of the Davenport algorithm. This utility has been successfully tested with simulated and actual flight data. The alignment calibration is still in the early testing stage. Both utilities will be incorporated into the institutional attitude ground support system.
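
The sequential Davenport algorithm referenced above builds on Davenport's q-method for Wahba's problem: stack weighted vector observations into a 4x4 K matrix whose dominant eigenvector is the optimal attitude quaternion. A minimal batch sketch (the sequential bookkeeping used for IRU calibration is omitted):

```python
import numpy as np

def davenport_q(refs, obs, weights=None):
    """Davenport's q-method: find the attitude matrix A minimizing Wahba's
    loss for vector pairs obs_i ~ A @ refs_i. Returns (quaternion, A),
    quaternion as [qx, qy, qz, qw] (scalar last)."""
    refs, obs = np.asarray(refs, float), np.asarray(obs, float)
    w = np.ones(len(refs)) if weights is None else np.asarray(weights, float)
    B = sum(wi * np.outer(b, r) for wi, r, b in zip(w, refs, obs))
    z = sum(wi * np.cross(b, r) for wi, r, b in zip(w, refs, obs))
    K = np.zeros((4, 4))
    K[:3, :3] = B + B.T - np.trace(B) * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = np.trace(B)
    vals, vecs = np.linalg.eigh(K)
    q = vecs[:, np.argmax(vals)]       # eigenvector of the largest eigenvalue
    qv, qs = q[:3], q[3]
    skew = np.array([[0.0, -qv[2], qv[1]],
                     [qv[2], 0.0, -qv[0]],
                     [-qv[1], qv[0], 0.0]])
    Amat = (qs**2 - qv @ qv) * np.eye(3) + 2 * np.outer(qv, qv) - 2 * qs * skew
    return q, Amat

# Recover a known rotation from three noiseless vector observations.
def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

A_true = rot_z(0.7)
refs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.3, 0.4, 0.87]])
obs = refs @ A_true.T                  # b_i = A_true @ r_i
q, A_est = davenport_q(refs, obs)
print(np.round(A_est - A_true, 6))
```

With noisy star-tracker or sun-sensor data the same machinery applies; only the weights and the residual interpretation change.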

  4. Ion chromatographic separation for analysis of radiostrontium in nuclear reprocessing solutions of high ionic strength

    International Nuclear Information System (INIS)

    Lamb, J.D.; Nordmeyer, F.R.; Drake, P.A.; Elder, M.P.; Miles, R.W.

    1989-01-01

    An ion chromatography (IC)-based method was developed for Sr 2+ concentration and separation showing high recoveries of strontium. This procedure permits complete automation. One of the potential weaknesses of the IC approach to sample preconcentration, i.e. sensitivity to solutions of high acid content, common in nuclear reprocessing solution, has been overcome by a novel application of acid suppression technology. (author) 12 refs.; 8 figs.; 3 tabs

  5. Tropospheric and total ozone columns over Paris (France) measured using medium-resolution ground-based solar-absorption Fourier-transform infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    C. Viatte

    2011-10-01

    Ground-based Fourier-transform infrared (FTIR) solar absorption spectroscopy is a powerful remote sensing technique providing information on the vertical distribution of various atmospheric constituents. This work presents the first evaluation of a mid-resolution ground-based FTIR to measure tropospheric ozone, independently of stratospheric ozone. This is demonstrated using a new atmospheric observatory (named OASIS for "Observations of the Atmosphere by Solar absorption Infrared Spectroscopy") installed in Créteil (France). The capacity of the technique to separate stratospheric and tropospheric ozone is demonstrated. Daily mean tropospheric ozone columns derived from the Infrared Atmospheric Sounding Interferometer (IASI) and from OASIS measurements are compared for summer 2009, and good agreement of −5.6 (±16.1)% is observed. Also, a qualitative comparison between in-situ surface ozone measurements and OASIS data reveals OASIS's capacity to monitor seasonal tropospheric ozone variations, as well as ozone pollution episodes around Paris in summer 2009. Two extreme pollution events are identified (on 1 July and 6 August 2009) for which ozone partial columns from OASIS and predictions from a regional air-quality model (CHIMERE) are compared following strict criteria of temporal and spatial coincidence. An average bias of 0.2%, a mean square error deviation of 7.6%, and a correlation coefficient of 0.91 are found between CHIMERE and OASIS, demonstrating the potential of a mid-resolution FTIR instrument in ground-based solar absorption geometry for tropospheric ozone monitoring.
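
The bias / mean-square-deviation / correlation statistics used for the CHIMERE-vs-OASIS comparison can be sketched directly (the column values below are invented placeholders, not the paper's data):

```python
import numpy as np

def compare_series(model, obs):
    """Relative bias (%), relative RMS deviation (%), and Pearson
    correlation between modelled and observed partial columns."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = 100.0 * np.mean(model - obs) / np.mean(obs)
    rmse = 100.0 * np.sqrt(np.mean((model - obs) ** 2)) / np.mean(obs)
    r = np.corrcoef(model, obs)[0, 1]
    return bias, rmse, r

# Hypothetical daily tropospheric ozone columns (Dobson units).
obs   = np.array([28.0, 31.5, 35.2, 40.1, 33.0, 29.4, 37.8])
model = np.array([27.1, 32.0, 36.5, 41.6, 31.9, 30.2, 36.4])
bias, rmse, r = compare_series(model, obs)
print(f"bias {bias:+.1f}%  rmse {rmse:.1f}%  r = {r:.2f}")
```

The strict temporal/spatial coincidence criteria mentioned in the abstract amount to pre-filtering the two series to matched timestamps and grid cells before these statistics are applied.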

  6. 2015 Chinese Intelligent Automation Conference

    CERN Document Server

    Li, Hongbo

    2015-01-01

    Proceedings of the 2015 Chinese Intelligent Automation Conference presents selected research papers from the CIAC’15, held in Fuzhou, China. The topics include adaptive control, fuzzy control, neural network based control, knowledge based control, hybrid intelligent control, learning control, evolutionary mechanism based control, multi-sensor integration, failure diagnosis, reconfigurable control, etc. Engineers and researchers from academia, industry and the government can gain valuable insights into interdisciplinary solutions in the field of intelligent automation.

  7. Automated analysis of PET based in-vivo monitoring in ion beam therapy

    International Nuclear Information System (INIS)

    Kuess, P.

    2014-01-01

    Particle Therapy (PT)-PET is currently the only clinically approved in-vivo method for monitoring PT. Due to fragmentation processes in the patients' tissue and the beam projectiles, a beta plus activity distribution (BAD) can be measured during or shortly after the irradiation. The recorded activity map cannot be directly compared to the planned dose distribution. However, by means of a Monte Carlo (MC) simulation it is possible to predict the measured BAD from a treatment plan (TP). Thus, to verify a patient's treatment fraction, the actual PET measurement can be compared to the respective BAD prediction. This comparison is currently performed by visual inspection, which requires experienced evaluators and is rather time-consuming. In this PhD thesis an evaluation tool is presented to compare BADs in an automated and objective way. The evaluation method was based on Pearson's correlation coefficient (PCC) - an established measure in medical image processing - which was coded into a software tool. The patient data used to develop, test and validate the software tool were acquired at the GSI research facility, where over 400 patient treatments with 12C were monitored by means of an in-beam PET prototype. The number of data sets was increased by artificially altering BADs to simulate different beam ranges. The automated detection tool was tested in head and neck (H&N), prostate, lung, and brain cases. To generate carbon ion TPs, the treatment planning system TRiP98 was used for all cases. From these TPs the respective BAD predictions were derived. Besides the detection of range deviations by means of PT-PET, the automated detection of patient setup uncertainties was also investigated. Although all measured patient data were recorded during the irradiation (in-beam), scenarios performing PET scans shortly after the irradiation (in-room) were also considered. To analyze the achievable precision of PT-PET with the automated evaluation tool based on
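
The Pearson's correlation coefficient comparison of measured and predicted β+ activity distributions can be sketched on toy depth profiles; a range deviation lowers the correlation against the prediction (all shapes and the 4-voxel shift below are invented):

```python
import numpy as np

def pcc(predicted, measured, mask=None):
    """Pearson correlation between a predicted and a measured beta-plus
    activity distribution, optionally restricted to a region of interest."""
    p, m = np.asarray(predicted, float), np.asarray(measured, float)
    if mask is not None:
        p, m = p[mask], m[mask]
    return np.corrcoef(p.ravel(), m.ravel())[0, 1]

# Toy 1-D depth profiles of beta+ activity (arbitrary units): "shifted"
# mimics a 4-voxel beam-range deviation relative to the MC prediction.
depth = np.arange(100)
predicted = np.exp(-0.5 * ((depth - 60) / 8.0) ** 2)
nominal   = np.exp(-0.5 * ((depth - 60) / 8.0) ** 2) + 0.01
shifted   = np.exp(-0.5 * ((depth - 56) / 8.0) ** 2) + 0.01

print(round(pcc(predicted, nominal), 3), round(pcc(predicted, shifted), 3))
```

An automated check then reduces to thresholding the PCC (or tracking its drop across fractions), replacing the visual side-by-side inspection described in the abstract.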

  8. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  9. A real-time automated quality control of rain gauge data based on multiple sensors

    Science.gov (United States)

    qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product and agrees much better statistically with the independent gauges.
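
A radar-gauge consistency check of the kind described can be sketched as a few simple flag rules on hourly totals; the thresholds here are illustrative placeholders, not the operational NMQ/Q2 values:

```python
import numpy as np

def qc_gauges(gauge, radar, min_detect=0.5, ratio_tol=3.0):
    """Flag hourly gauge totals (mm) inconsistent with collocated radar QPE.

    A gauge fails QC if the radar sees significant rain while the gauge is
    dry (stuck/clogged gauge), the gauge reports rain the radar cannot see
    at all, or the gauge/radar ratio falls outside [1/ratio_tol, ratio_tol].
    Thresholds are illustrative only."""
    gauge, radar = np.asarray(gauge, float), np.asarray(radar, float)
    flags = np.zeros(len(gauge), dtype=bool)
    flags |= (radar >= min_detect) & (gauge == 0.0)          # missed rain
    flags |= (gauge >= min_detect) & (radar == 0.0)          # false report
    both = (gauge >= min_detect) & (radar >= min_detect)
    ratio = np.divide(gauge, radar, out=np.ones_like(gauge), where=radar > 0)
    flags |= both & ((ratio > ratio_tol) | (ratio < 1.0 / ratio_tol))
    return flags

gauge = np.array([0.0, 2.0, 10.0, 0.0, 1.2, 0.3])
radar = np.array([0.0, 1.8,  2.0, 3.5, 0.0, 0.2])
flags = qc_gauges(gauge, radar)
print(flags)
```

An operational scheme would additionally condition the thresholds on beam height, precipitation regime, and freezing level, as the abstract notes; the flagging logic itself stays this simple.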

  10. Smashing the Stovepipe: Leveraging the GMSEC Open Architecture and Advanced IT Automation to Rapidly Prototype, Develop and Deploy Next-Generation Multi-Mission Ground Systems

    Science.gov (United States)

    Swenson, Paul

    2017-01-01

    Satellite/payload ground systems are typically highly customized to a specific mission's use cases and utilize hundreds (or thousands) of specialized point-to-point interfaces for data flows and file transfers. Documentation and tracking of these complex interfaces requires extensive time to develop and extremely high staffing costs; implementation and testing of the interfaces are even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT security, information assurance, and operational security have become key ground system architecture drivers. New Federal security-related directives are generated on a daily basis, imposing new requirements on current and existing ground systems; these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources, i.e. operational/sysadmin staff, IT security baselines, architecture decisions, or even networks and hosting infrastructure. Refactoring existing ground systems into multi-mission assets proves extremely challenging due to what is typically very tight coupling between legacy components; as a result, many "multi-mission" operations environments end up simply sharing compute resources and networks because of the difficulty of refactoring into true multi-mission systems. Utilizing continuous integration and rapid system deployment technologies in conjunction with an open-architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and configuration of systems. GMSEC messaging is inherently designed to support multi-mission requirements, and

  11. An Improved CO2 Separation and Purification System Based on Cryogenic Separation and Distillation Theory

    Directory of Open Access Journals (Sweden)

    Gang Xu

    2014-05-01

    Full Text Available In this study, an improved CO2 separation and purification system is proposed based on in-depth analyses of cryogenic separation and distillation theory as well as the phase transition characteristics of gas mixtures containing CO2. Multi-stage compression, refrigeration, and separation are adopted to separate the majority of the CO2 from the gas mixture with relatively low energy penalty and high purity. Subsequently, the separated crude liquid CO2 is distilled under high pressure and near ambient temperature conditions so that low energy penalty purification is achieved. Simulation results indicate that the specific energy consumption for CO2 capture is only 0.425 MJ/kgCO2 with 99.9% CO2 purity for the product. Techno-economic analysis shows that the total plant investment is relatively low. Given its technical maturity and great potential in large-scale production, compared to conventional MEA and Selexol™ absorption methods, the cost of CO2 capture of the proposed system is reduced by 57.2% and 45.9%, respectively. The result of this study can serve as a novel approach to recovering CO2 from high CO2 concentration gas mixtures.
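
    For comparison with other capture routes, the reported specific energy consumption of 0.425 MJ per kg of captured CO2 converts to kWh per tonne as follows:

```python
# Convert 0.425 MJ per kg of captured CO2 into kWh per tonne.
specific_energy_mj_per_kg = 0.425
mj_per_tonne = specific_energy_mj_per_kg * 1000   # 1 tonne = 1000 kg
kwh_per_tonne = mj_per_tonne / 3.6                # 1 kWh = 3.6 MJ
print(round(kwh_per_tonne, 1))                    # 118.1 kWh per tonne of CO2
```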

  12. Automated retinofugal visual pathway reconstruction with multi-shell HARDI and FOD-based analysis.

    Science.gov (United States)

    Kammen, Alexandra; Law, Meng; Tjan, Bosco S; Toga, Arthur W; Shi, Yonggang

    2016-01-15

    Diffusion MRI tractography provides a non-invasive modality to examine the human retinofugal projection, which consists of the optic nerves, optic chiasm, optic tracts, the lateral geniculate nuclei (LGN) and the optic radiations. However, the pathway has several anatomic features that make it particularly challenging to study with tractography, including its location near blood vessels and bone-air interface at the base of the cerebrum, crossing fibers at the chiasm, somewhat-tortuous course around the temporal horn via Meyer's Loop, and multiple closely neighboring fiber bundles. To date, these unique complexities of the visual pathway have impeded the development of a robust and automated reconstruction method using tractography. To overcome these challenges, we develop a novel, fully automated system to reconstruct the retinofugal visual pathway from high-resolution diffusion imaging data. Using multi-shell, high angular resolution diffusion imaging (HARDI) data, we reconstruct precise fiber orientation distributions (FODs) with high order spherical harmonics (SPHARM) to resolve fiber crossings, which allows the tractography algorithm to successfully navigate the complicated anatomy surrounding the retinofugal pathway. We also develop automated algorithms for the identification of ROIs used for fiber bundle reconstruction. In particular, we develop a novel approach to extract the LGN region of interest (ROI) based on intrinsic shape analysis of a fiber bundle computed from a seed region at the optic chiasm to a target at the primary visual cortex. By combining automatically identified ROIs and FOD-based tractography, we obtain a fully automated system to compute the main components of the retinofugal pathway, including the optic tract and the optic radiation. We apply our method to the multi-shell HARDI data of 215 subjects from the Human Connectome Project (HCP). Through comparisons with post-mortem dissection measurements, we demonstrate the retinotopic

  13. Seasonal cultivated and fallow cropland mapping using MODIS-based automated cropland classification algorithm

    Science.gov (United States)

    Wu, Zhuoting; Thenkabail, Prasad S.; Mueller, Rick; Zakzeski, Audra; Melton, Forrest; Johnson, Lee; Rosevelt, Carolyn; Dwyer, John; Jones, Jeanine; Verdin, James P.

    2014-01-01

    Increasing drought occurrences and growing populations demand accurate, routine, and consistent cultivated and fallow cropland products to enable water and food security analysis. The overarching goal of this research was to develop and test an automated cropland classification algorithm (ACCA) that provides accurate, consistent, and repeatable information on seasonal cultivated as well as seasonal fallow cropland extents and areas based on Moderate Resolution Imaging Spectroradiometer remote sensing data. The seasonal ACCA development process involves writing a series of iterative decision-tree codes to separate cultivated and fallow croplands from noncroplands, aiming to accurately mirror reliable reference data sources. A pixel-by-pixel accuracy assessment against the U.S. Department of Agriculture (USDA) cropland data showed, on average, a producer's accuracy of 93% and a user's accuracy of 85% across all months. Further, ACCA-derived cropland maps agreed well with the USDA Farm Service Agency crop acreage-reported data for both cultivated and fallow croplands, with R-square values over 0.7, and with field surveys, with an accuracy of ≥95% for cultivated croplands and ≥76% for fallow croplands. Our results demonstrated the ability of ACCA to generate cropland products, such as cultivated and fallow cropland extents and areas, accurately, automatically, and repeatedly throughout the growing season.
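
    The iterative decision-tree idea behind ACCA can be sketched as a small rule cascade. The features and thresholds below are illustrative placeholders, not the published rules:

```python
def classify_pixel(ndvi_peak, ndvi_mean):
    """Toy rule cascade in the spirit of a cropland classification tree.
    Thresholds are hypothetical, chosen only to show the cascade structure."""
    if ndvi_peak < 0.3:
        return "noncropland"        # never greens up during the season
    if ndvi_mean < 0.4:
        return "fallow cropland"    # field present but little green growth
    return "cultivated cropland"

print(classify_pixel(0.8, 0.6))     # cultivated cropland
```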

  15. Automated Detection of Clinically Significant Prostate Cancer in mp-MRI Images Based on an End-to-End Deep Neural Network.

    Science.gov (United States)

    Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting

    2018-05-01

    Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parameter magnetic resonance images (mp-MRI) are of high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of other steps. As a result, they could either involve unnecessary computational cost or suffer from errors accumulated over steps. In this paper, we present an automated CS PCa detection system, where all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed neural network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed for optimizing all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors of lesions' locations. Compared with most existing systems which require supervised labels, e.g., manual delineation of PCa lesions, it is much more convenient for clinical usage. Comprehensive evaluation based on fivefold cross validation using 360 patient data demonstrates that our system achieves a high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.
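
    Joint end-to-end optimization amounts to minimizing a weighted sum of the three loss terms so gradients flow through both subnets at once. The weights below are illustrative; the paper's actual weighting may differ:

```python
def joint_loss(cls_loss, inconsistency_loss, overlap_loss,
               w_cls=1.0, w_inc=0.5, w_ovl=0.5):
    """Sketch of the joint objective: classification loss from the CNN plus
    inconsistency and overlap losses from the tissue deformation network,
    combined into one scalar that trains all parameters together."""
    return w_cls * cls_loss + w_inc * inconsistency_loss + w_ovl * overlap_loss

print(joint_loss(1.0, 0.2, 0.4))   # 1.3
```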

  16. Fuel lattice design in a boiling water reactor using a knowledge-based automation system

    International Nuclear Information System (INIS)

    Tung, Wu-Hsiung; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2015-01-01

    Highlights: • An automation system was developed for the fuel lattice radial design of BWRs. • An enrichment group peaking equalizing method is applied to optimize the design. • Several heuristic rules and restrictions are incorporated to facilitate the design. • The CPU time for the system to design a 10x10 lattice was less than 1.2 h. • The beginning-of-life LPF was improved from 1.319 to 1.272 for one of the cases. - Abstract: A knowledge-based fuel lattice design automation system for BWRs is developed and applied to the design of 10 × 10 fuel lattices. The knowledge implemented in this fuel lattice design automation system includes the determination of gadolinium fuel pin location, the determination of fuel pin enrichment and enrichment distribution. The optimization process starts by determining the gadolinium distribution based on the pin power distribution of a flat enrichment lattice and some heuristic rules. Next, a pin power distribution flattening and an enrichment grouping process are introduced to determine the enrichment of each fuel pin enrichment type and the initial enrichment distribution of a fuel lattice design. Finally, enrichment group peaking equalizing processes are performed to achieve lower lattice peaking. Several fuel lattice design constraints are also incorporated in the automation system such that the system can accomplish a design which meets the requirements of practical use. Depending on the axial position of the lattice, a different method is applied in the design of the fuel lattice. Two typical fuel lattices with U-235 enrichment of 4.471% and 4.386% were taken as references. Application of the method demonstrates that improved lattice designs can be achieved through the enrichment grouping and the enrichment group peaking equalizing method. It takes about 11 min and 1 h 11 min of CPU time for the automation system to accomplish two design cases on an HP-8000 workstation, including the execution of CASMO-4 lattice
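
    The figure of merit being minimized, the lattice local peaking factor (LPF, e.g. the reported improvement from 1.319 to 1.272), is the maximum pin power relative to the lattice average:

```python
def local_peaking_factor(pin_powers):
    """Lattice local peaking factor: maximum pin power divided by the
    lattice-average pin power (flat list of relative pin powers)."""
    avg = sum(pin_powers) / len(pin_powers)
    return max(pin_powers) / avg

# Hypothetical 4-pin toy lattice with relative pin powers:
print(local_peaking_factor([1.1, 1.0, 0.9, 1.0]))   # 1.1
```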

  17. Fuel lattice design in a boiling water reactor using a knowledge-based automation system

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Hsiung, E-mail: wstong@iner.gov.tw; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2015-11-15

    Highlights: • An automation system was developed for the fuel lattice radial design of BWRs. • An enrichment group peaking equalizing method is applied to optimize the design. • Several heuristic rules and restrictions are incorporated to facilitate the design. • The CPU time for the system to design a 10x10 lattice was less than 1.2 h. • The beginning-of-life LPF was improved from 1.319 to 1.272 for one of the cases. - Abstract: A knowledge-based fuel lattice design automation system for BWRs is developed and applied to the design of 10 × 10 fuel lattices. The knowledge implemented in this fuel lattice design automation system includes the determination of gadolinium fuel pin location, the determination of fuel pin enrichment and enrichment distribution. The optimization process starts by determining the gadolinium distribution based on the pin power distribution of a flat enrichment lattice and some heuristic rules. Next, a pin power distribution flattening and an enrichment grouping process are introduced to determine the enrichment of each fuel pin enrichment type and the initial enrichment distribution of a fuel lattice design. Finally, enrichment group peaking equalizing processes are performed to achieve lower lattice peaking. Several fuel lattice design constraints are also incorporated in the automation system such that the system can accomplish a design which meets the requirements of practical use. Depending on the axial position of the lattice, a different method is applied in the design of the fuel lattice. Two typical fuel lattices with U-235 enrichment of 4.471% and 4.386% were taken as references. Application of the method demonstrates that improved lattice designs can be achieved through the enrichment grouping and the enrichment group peaking equalizing method. It takes about 11 min and 1 h 11 min of CPU time for the automation system to accomplish two design cases on an HP-8000 workstation, including the execution of CASMO-4

  18. Operations planning simulation model extension study. Volume 1: Long duration exposure facility ST-01-A automated payload

    Science.gov (United States)

    Marks, D. A.; Gendiellee, R. E.; Kelly, T. M.; Giovannello, M. A.

    1974-01-01

    Ground processing and operation activities for selected automated and sortie payloads are evaluated. Functional flow activities are expanded to identify payload launch site facility and support requirements. Payload definitions are analyzed from the launch site ground processing viewpoint and then processed through the expanded functional flow activities. The requirements generated from the evaluation are compared with those contained in the data sheets. The following payloads were included in the evaluation: Long Duration Exposure Facility; Life Sciences Shuttle Laboratory; Biomedical Experiments Scientific Satellite; Dedicated Solar Sortie Mission; Magnetic Spectrometer; and Mariner Jupiter Orbiter. The expanded functional flow activities and descriptions for the automated and sortie payloads at the launch site are presented.

  19. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  20. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system that meets the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection process to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, is proposed in order to aid the manufacturer in the process of automation.
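
    A common starting point for vision-based PCB inspection is comparing each board image against a defect-free "golden" reference. The sketch below shows the simplest form of this idea (the threshold is an illustrative placeholder; production systems add image alignment, illumination normalization, and morphological filtering):

```python
import numpy as np

def defect_mask(reference, inspected, threshold=40):
    """Flag pixels where the inspected image deviates from the golden
    reference by more than `threshold` grey levels."""
    diff = np.abs(inspected.astype(int) - reference.astype(int))
    return diff > threshold

ref = np.full((4, 4), 200, dtype=np.uint8)   # toy defect-free board image
insp = ref.copy()
insp[1, 2] = 90                              # simulated missing-trace defect
mask = defect_mask(ref, insp)
print(int(mask.sum()))                       # 1 defective pixel found
```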

  1. Recent developments in membrane-based separations in biotechnology processes: review.

    Science.gov (United States)

    Rathore, A S; Shirke, A

    2011-01-01

    Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.

  2. A longitudinal evaluation of performance of automated BCR-ABL1 quantitation using cartridge-based detection system.

    Science.gov (United States)

    Enjeti, Anoop; Granter, Neil; Ashraf, Asma; Fletcher, Linda; Branford, Susan; Rowlings, Philip; Dooley, Susan

    2015-10-01

    An automated cartridge-based detection system (GeneXpert; Cepheid) is being widely adopted in low-throughput laboratories for monitoring BCR-ABL1 transcript levels in chronic myelogenous leukaemia. This Australian study evaluated the longitudinal performance-specific characteristics of the automated system. The automated cartridge-based system was compared prospectively with the manual qRT-PCR-based reference method at SA Pathology, Adelaide, over a period of 2.5 years. A conversion factor determination was followed by four re-validations. Peripheral blood samples (n = 129) with international scale (IS) values within the detectable range were selected for assessment. The mean bias, the proportion of results within a specified fold difference (2-, 3- and 5-fold), the concordance rate of major molecular remission (MMR), and the concordance across a range of IS values on paired samples were evaluated. The initial conversion factor for the automated system was determined to be 0.43. Except for the second re-validation, where a negative bias of 1.9-fold was detected, all other biases fell within desirable limits. A cartridge-specific conversion factor and efficiency value were introduced, and the conversion factor was confirmed to be stable in subsequent re-validation cycles. Concordance with the reference method/laboratory was 78.2% at >0.1-≤10 IS and 80% at ≤0.001, compared to 86.8% in the >0.01-≤0.1 IS range. The overall and MMR concordance were 85.7% and 94%, respectively, for samples that fell within ±5-fold of the reference laboratory value over the entire period of study. The conversion factor and performance-specific characteristics for the automated system were longitudinally stable in the clinically relevant range, following the manufacturer's introduction of lot-specific efficiency values.
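
    How a conversion factor harmonizes a lab's raw BCR-ABL1 ratio to the International Scale can be sketched as follows. The raw ratio is a hypothetical example; 0.43 is the conversion factor reported in the study, and MMR is conventionally defined as IS ≤ 0.1%:

```python
def to_international_scale(raw_ratio_percent, conversion_factor):
    """Convert a lab-measured BCR-ABL1/control-gene ratio (%) to the
    International Scale (IS) via the lab's validated conversion factor."""
    return raw_ratio_percent * conversion_factor

def is_mmr(is_value):
    """Major molecular remission: BCR-ABL1 <= 0.1% on the IS."""
    return is_value <= 0.1

is_val = to_international_scale(0.20, 0.43)   # hypothetical raw ratio of 0.20%
print(round(is_val, 4), is_mmr(is_val))       # 0.086 True
```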

  3. MIDAS: Automated Approach to Design Microwave Integrated Inductors and Transformers on Silicon

    Directory of Open Access Journals (Sweden)

    L. Aluigi

    2013-09-01

    Full Text Available The design of modern radiofrequency integrated circuits on silicon operating at microwave and millimeter-wave frequencies requires the integration of several spiral inductors and transformers that are not commonly available in the process design kits of the technologies. In this work we present an auxiliary CAD tool for Microwave Inductor (and transformer) Design Automation on Silicon (MIDAS) that exploits commercial simulators and allows the implementation of an automatic design flow, including three-dimensional layout editing and electromagnetic simulations. In detail, MIDAS allows the designer to derive a preliminary sizing of the inductor (transformer) on the basis of the design entries (specifications). It draws the inductor (transformer) layers for the specific process design kit, including vias and underpasses, with or without a patterned ground shield, and launches the electromagnetic simulations, achieving effective design automation with respect to the traditional design flow for RFICs. With the present software suite the complete design time is reduced significantly (typically 1 hour on a PC based on an Intel® Pentium® Dual 1.80 GHz CPU with 2 GB of RAM). Afterwards, both the device equivalent circuit and the layout are ready to be imported into the Cadence environment.
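
    The preliminary-sizing step that precedes EM simulation is typically done with a closed-form estimate. Below is the modified-Wheeler (current-sheet) expression for a square planar spiral with the Mohan et al. coefficients; whether MIDAS uses this particular formula is an assumption, and the geometry values are hypothetical:

```python
import math

def square_spiral_inductance(n, d_out, d_in):
    """Modified-Wheeler estimate for a square planar spiral inductor
    (Mohan et al. coefficients K1 = 2.34, K2 = 2.75); dimensions in metres."""
    mu0 = 4 * math.pi * 1e-7
    d_avg = 0.5 * (d_out + d_in)
    rho = (d_out - d_in) / (d_out + d_in)    # fill ratio
    return 2.34 * mu0 * n**2 * d_avg / (1 + 2.75 * rho)

# Hypothetical 5-turn spiral, 300 um outer / 100 um inner diameter:
L = square_spiral_inductance(5, 300e-6, 100e-6)
print(round(L * 1e9, 1))   # ~6.2 nH
```

    Such estimates get the designer within a few percent of the EM-simulated value, which is then refined by the automated simulation loop.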

  4. Characterization of the effects of borehole configuration and interference with long term ground temperature modelling of ground source heat pumps

    International Nuclear Information System (INIS)

    Law, Ying Lam E.; Dworkin, Seth B.

    2016-01-01

    Highlights: • Long term ground temperature response is explored using finite element methods. • Simulation method is validated against experimental and analytical data. • Temperature changes at a fast rate in the first few years and slows down gradually. • ASHRAE recommended separation distances are not always sufficient. • Thermal accumulation occurs at the centre of borehole field. - Abstract: Ground source heat pumps (GSHPs) are an environmentally friendly alternative to conventional heating and cooling systems because of their high efficiency and low greenhouse gas emissions. The ground acts as a heat sink/source for the excess/required heat inside a building for cooling and heating modes, respectively. However, imbalance in heating and cooling needs can change ground temperature over the operating duration. This increase/decrease in ground temperature lowers system efficiency and causes the ground to foul—failing to accept or provide more heat. In order to ensure that GSHPs can operate to their designed conditions, thermal modelling is required to simulate the ground temperature during system operation. In addition, the borehole field layout can have a major impact on ground temperature. In this study, four buildings were studied—a hospital, fast-food restaurant, residence, and school, each with varying borehole configurations. Boreholes were modelled in a soil volume using finite-element methods and heating and cooling fluxes were applied to the borehole walls to simulate the GSHP operation. 20 years of operation were modelled for each building for 2 × 2, 4 × 4, and 2 × 8 borehole configurations. Results indicate that the borehole separation distance of 6 m, recommended by ASHRAE, is not always sufficient to prevent borehole thermal interactions. Benefits of using a 2 × 8 configuration as opposed to a 4 × 4 configuration, which can be observed because of the larger perimeter it provides for heat to dissipate to surrounding soil were
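
    The finding that a 6 m spacing is not always sufficient can be illustrated with the classic infinite-line-source model for a single borehole's far-field temperature rise. All parameter values below are hypothetical but typical (soil conductivity 2 W/m·K, diffusivity 1e-6 m²/s, a sustained 10 W/m heat-rejection imbalance):

```python
import math

def line_source_dT(q_per_m, k, alpha, r, t):
    """Infinite-line-source ground temperature rise at radius r after time t:
    dT = q / (4*pi*k) * E1(r^2 / (4*alpha*t)). For the small arguments typical
    of multi-year operation, E1(x) ~= -gamma - ln(x) + x."""
    x = r**2 / (4 * alpha * t)
    e1 = -0.5772156649 - math.log(x) + x   # small-x expansion of E1
    return q_per_m / (4 * math.pi * k) * e1

t = 20 * 365.25 * 24 * 3600                      # 20 years of operation
dT = line_source_dT(10, 2.0, 1e-6, 6.0, t)
print(round(dT, 2))   # ~1.5 K at a neighbouring borehole 6 m away
```

    A temperature rise of this size at the ASHRAE-recommended spacing is consistent with the paper's observation of borehole thermal interaction.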

  5. Automated Prediction of Catalytic Mechanism and Rate Law Using Graph-Based Reaction Path Sampling.

    Science.gov (United States)

    Habershon, Scott

    2016-04-12

    In a recent article [J. Chem. Phys. 2015, 143, 094106], we introduced a novel graph-based sampling scheme which can be used to generate chemical reaction paths in many-atom systems in an efficient and highly automated manner. The main goal of this work is to demonstrate how this approach, when combined with direct kinetic modeling, can be used to determine the mechanism and phenomenological rate law of a complex catalytic cycle, namely cobalt-catalyzed hydroformylation of ethene. Our graph-based sampling scheme generates 31 unique chemical products and 32 unique chemical reaction pathways; these sampled structures and reaction paths enable automated construction of a kinetic network model of the catalytic system when combined with density functional theory (DFT) calculations of free energies and resultant transition-state theory rate constants. Direct simulations of this kinetic network across a range of initial reactant concentrations enables determination of both the reaction mechanism and the associated rate law in an automated fashion, without the need for either presupposing a mechanism or making steady-state approximations in kinetic analysis. Most importantly, we find that the reaction mechanism which emerges from these simulations is exactly that originally proposed by Heck and Breslow; furthermore, the simulated rate law is also consistent with previous experimental and computational studies, exhibiting a complex dependence on carbon monoxide pressure. While the inherent errors of using DFT simulations to model chemical reactivity limit the quantitative accuracy of our calculated rates, this work confirms that our automated simulation strategy enables direct analysis of catalytic mechanisms from first principles.
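
    "Direct simulation of a kinetic network without steady-state approximations" means integrating the mass-action rate equations numerically. The toy network below (A → B → C, not the hydroformylation cycle, with illustrative rate constants) shows the idea:

```python
def simulate(k1, k2, a0, dt=1e-3, steps=10000):
    """Explicit-Euler integration of the mass-action kinetics of A -> B -> C.
    In the paper's setting, the network has 32 sampled reaction paths with
    DFT/transition-state-theory rate constants instead of two toy steps."""
    a, b, c = a0, 0.0, 0.0
    for _ in range(steps):
        ra, rb = k1 * a, k2 * b      # instantaneous mass-action rates
        a -= ra * dt
        b += (ra - rb) * dt
        c += rb * dt
    return a, b, c

a, b, c = simulate(1.0, 2.0, 1.0)    # integrate to t = 10 time units
print(round(a + b + c, 6))           # mass conserved: 1.0
```

    Repeating such simulations across initial concentrations, and fitting the resulting rates, yields the phenomenological rate law without presupposing a mechanism.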

  6. FINDING EXTRATERRESTRIAL LIFE USING GROUND-BASED HIGH-DISPERSION SPECTROSCOPY

    International Nuclear Information System (INIS)

    Snellen, I. A. G.; Le Poole, R.; Brogi, M.; Birkby, J.; De Kok, R. J.

    2013-01-01

    Exoplanet observations promise one day to unveil the presence of extraterrestrial life. Atmospheric compounds in strong chemical disequilibrium would point to large-scale biological activity just as oxygen and methane do in the Earth's atmosphere. The cancellation of both the Terrestrial Planet Finder and Darwin missions means that it is unlikely that a dedicated space telescope to search for biomarker gases in exoplanet atmospheres will be launched within the next 25 years. Here we show that ground-based telescopes provide a strong alternative for finding biomarkers in exoplanet atmospheres through transit observations. Recent results on hot Jupiters show the enormous potential of high-dispersion spectroscopy to separate the extraterrestrial and telluric signals, making use of the Doppler shift of the planet. The transmission signal of oxygen from an Earth-twin orbiting a small red dwarf star is only a factor of three smaller than that of carbon monoxide recently detected in the hot Jupiter τ Boötis b, albeit such a star will be orders of magnitude fainter. We show that if Earth-like planets are common, the planned extremely large telescopes can detect oxygen within a few dozen transits. Ultimately, large arrays of dedicated flux-collector telescopes equipped with high-dispersion spectrographs can provide the large collecting area needed to perform a statistical study of life-bearing planets in the solar neighborhood.
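
    The Doppler separation of planetary and telluric lines that the method relies on is easy to quantify. For an O2 line near 760 nm and an illustrative planetary orbital radial velocity of 30 km/s, the shift spans many resolution elements of a high-dispersion (R = 100,000) spectrograph:

```python
# Doppler shift of a planetary absorption line versus the resolution
# element of a high-dispersion spectrograph.
c = 299_792_458.0                    # speed of light, m/s
lam = 760e-9                         # m, near the O2 A-band
v = 30e3                             # m/s, illustrative orbital velocity
shift = lam * v / c                  # ~0.076 nm wavelength shift
resolution_element = lam / 100_000   # ~0.0076 nm at R = 1e5
ratio = shift / resolution_element
print(round(ratio, 1))               # ~10 resolution elements
```

    Because telluric lines are fixed in wavelength while the planet's lines sweep across the detector as the orbital velocity changes, the two signals separate cleanly.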

  7. Mycological evaluation of a ground cocoa-based beverage ...

    African Journals Online (AJOL)

    Cocoa beans (Theobroma cacao) are processed into a cocoa beverage through fermentation, drying, roasting and grinding of the seeds to powder. The mycological quality of 39 samples of different brands of these cocoa-based beverages, referred to as 'eruku oshodi', collected from 3 different markets in south-west Nigeria ...

  8. Qualification of academic facilities for small-scale automated manufacture of autologous cell-based products.

    Science.gov (United States)

    Hourd, Paul; Chandra, Amit; Alvey, David; Ginty, Patrick; McCall, Mark; Ratcliffe, Elizabeth; Rayment, Erin; Williams, David J

    2014-01-01

    Academic centers, hospitals and small companies, as typical development settings for UK regenerative medicine assets, are significant contributors to the development of autologous cell-based therapies. Often lacking the appropriate funding, quality assurance heritage or specialist regulatory expertise, qualifying aseptic cell processing facilities for GMP compliance is a significant challenge. The qualification of a new Cell Therapy Manufacturing Facility with automated processing capability, the first of its kind in a UK academic setting, provides a unique demonstrator for the qualification of small-scale, automated facilities for GMP-compliant manufacture of autologous cell-based products in these settings. This paper shares our experiences in qualifying the Cell Therapy Manufacturing Facility, focusing on our approach to streamlining the qualification effort, the challenges, project delays and inefficiencies we encountered, and the subsequent lessons learned.

  9. Quantitative Estimation of Above Ground Crop Biomass using Ground-based, Airborne and Spaceborne Low Frequency Polarimetric Synthetic Aperture Radar

    Science.gov (United States)

    Koyama, C.; Watanabe, M.; Shimada, M.

    2016-12-01

    Estimation of crop biomass is one of the important challenges in environmental remote sensing, related to agricultural as well as hydrological and meteorological applications. Usually passive optical data (photographs, spectral data) operating in the visible and near-infrared bands are used for such purposes. The value of optical remote sensing for yield estimation, however, is rather limited, as visible light can only provide information about the chemical characteristics of the canopy surface. Low frequency microwave signals with wavelengths longer than 20 cm have the potential to penetrate through the canopy and provide information about the whole vertical structure of vegetation, from the top of the canopy down to the very soil surface. This phenomenon has been well known and exploited to detect targets under vegetation in the military radar application known as FOPEN (foliage penetration). With the availability of polarimetric interferometric SAR data, the use of PolInSAR techniques to retrieve vertical vegetation structure has become an attractive tool. However, PolInSAR is still highly experimental and suitable data are not yet widely available. In this study we focus on the use of operational dual-polarization L-band (1.27 GHz) SAR, which has been available worldwide since the launch of Japan's Advanced Land Observing Satellite (ALOS, 2006-2011). Since 2014, ALOS-2 has continued to deliver this kind of partial polarimetric data for the entire land surface. In addition to these spaceborne data sets we use airborne L-band SAR data acquired by the Japanese Pi-SAR-L2 as well as ultra-wideband (UWB) ground-based SAR data operating in the frequency range from 1-4 GHz. By exploiting the complex dual-polarization [C2] covariance matrix information, the scattering contributions from the canopy can be well separated from the ground reflections, allowing for the establishment of semi-empirical relationships between measured radar reflectivity and the amount of fresh-weight above-ground
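
    The dual-polarization [C2] covariance matrix is formed from co-registered complex HH and HV scattering coefficients averaged over a pixel window. A minimal sketch with synthetic data (window size and channel statistics are illustrative):

```python
import numpy as np

def dual_pol_covariance(s_hh, s_hv):
    """Sample dual-pol covariance matrix C2 = < k k^H >, where
    k = [S_HH, S_HV]^T is the scattering vector per pixel and the average
    runs over all pixels in the window."""
    k = np.stack([s_hh.ravel(), s_hv.ravel()])   # 2 x N scattering vectors
    return (k @ k.conj().T) / k.shape[1]

# Synthetic complex SLC pixels for one averaging window:
rng = np.random.default_rng(0)
s_hh = rng.normal(size=16) + 1j * rng.normal(size=16)
s_hv = 0.3 * (rng.normal(size=16) + 1j * rng.normal(size=16))
c2 = dual_pol_covariance(s_hh, s_hv)
print(c2.shape, np.allclose(c2, c2.conj().T))    # (2, 2) True (Hermitian)
```

    Decompositions of this 2 × 2 Hermitian matrix are what allow canopy (cross-pol dominated) and ground (co-pol dominated) contributions to be separated.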

  10. The use of automated assessments in internet-based CBT: The computer will be with you shortly

    Directory of Open Access Journals (Sweden)

    Elizabeth C. Mason

    2014-10-01

    Full Text Available There is evidence from randomized controlled trials that internet-based cognitive behavioral therapy (iCBT) is efficacious in the treatment of anxiety and depression, and recent research demonstrates the effectiveness of iCBT in routine clinical care. The aims of this study were to implement and evaluate a new pathway by which patients could access online treatment by completing an automated assessment, rather than seeing a specialist health professional. We compared iCBT treatment outcomes in patients who received an automated pre-treatment questionnaire assessment with patients who were assessed by a specialist psychiatrist prior to treatment. Participants were treated as part of routine clinical care and were therefore not randomized. The results showed that symptoms of anxiety and depression decreased significantly with iCBT, and that the mode of assessment did not affect outcome. That is, a pre-treatment assessment by a psychiatrist conferred no additional treatment benefits over an automated assessment. These findings suggest that iCBT is effective in routine care and may be implemented with an automated assessment. By providing wider access to evidence-based interventions and reducing waiting times, the use of iCBT within a stepped-care model is a cost-effective way to reduce the burden of disease caused by these common mental disorders.

  11. Figure-ground segregation can rely on differences in motion direction.

    Science.gov (United States)

    Kandil, Farid I; Fahle, Manfred

    2004-12-01

    If the elements within a figure move synchronously while those in the surround move at a different time, the figure is easily segregated from the surround and thus perceived. Lee and Blake (1999) [Visual form created solely from temporal structure. Science, 284, 1165-1168] demonstrated that this figure-ground separation may be based not only on time differences between motion onsets, but also on the differences between reversals of motion direction. However, Farid and Adelson (2001) [Synchrony does not promote grouping in temporally structured displays. Nature Neuroscience, 4, 875-876] argued that figure-ground segregation in the motion-reversal experiment might have been based on a contrast artefact and concluded that (a)synchrony as such was 'not responsible for the perception of form in these or earlier displays'. Here, we present experiments that avoid contrast artefacts but still produce figure-ground segregation based on purely temporal cues. Our results show that subjects can segregate figure from ground even though they are unable to use motion reversals as such. Subjects detect the figure when either (i) motion stops (leading to contrast artefacts), or (ii) motion directions differ between figure and ground. Segregation requires minimum delays of about 15 ms. We argue that whatever the underlying cues and mechanisms, a second stage beyond motion detection is required to globally compare the outputs of local motion detectors and to segregate figure from ground. Since analogous changes take place in both figure and ground in rapid succession, this second stage has to detect the asynchrony with high temporal precision.

  12. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    Science.gov (United States)

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval is a tedious and time-consuming task. Unsupervised approaches, on the other hand, avoid these limitations but often do not reach results comparable to those of supervised methods. We therefore propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocessing step based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches second position in the ranking. Our variant based on the GHMRF achieves first position in the Test ranking of the unsupervised approaches and seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
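    The non-structured classifiers in the abstract (K-means, Fuzzy K-means, GMM) cluster voxel intensities without labels. As a toy stand-in, here is a minimal 1-D K-means over synthetic intensities; the three intensity populations, the quantile initialization, and the class names are illustrative assumptions, and unlike the structured GHMRF this sketch ignores spatial context entirely:

```python
import numpy as np

def kmeans_1d(x, k=3, iters=50):
    """Minimal 1-D K-means: alternate between assigning each intensity to
    its nearest center and recomputing centers as cluster means."""
    centers = np.quantile(x, np.linspace(0.25, 0.75, k))  # deterministic init
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    centers = np.sort(centers)  # relabel classes in ascending intensity order
    labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
    return centers, labels

# Synthetic "scan": three intensity populations standing in for, e.g.,
# background, healthy tissue and enhancing tumour (illustrative values only).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.1, 0.02, 500),
                    rng.normal(0.5, 0.05, 500),
                    rng.normal(0.9, 0.03, 200)])
centers, labels = kmeans_1d(x, k=3)
```

    A GMM would replace the hard nearest-center assignment with soft responsibilities, and the GHMRF would additionally penalize label disagreement between neighbouring voxels.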

  13. Automation, robotics, and inflight training for manned Mars missions

    Science.gov (United States)

    Holt, Alan C.

    1986-01-01

    The automation, robotics, and inflight training requirements of manned Mars missions will be supported by similar capabilities developed for the space station program. Evolutionary space station onboard training facilities will allow the crewmembers to minimize the amount of training received on the ground by providing extensive onboard access to system and experiment malfunction procedures, maintenance procedures, repair procedures, and associated video sequences. Considerable on-the-job training will also be conducted for space station management, mobile remote manipulator operations, proximity operations with the Orbital Maneuvering Vehicle (and later the Orbit Transfer Vehicle), and telerobotics and mobile robots. A similar approach could be used for manned Mars mission training, with significant additions such as high-fidelity image generation and simulation systems, including holographic projection systems, for Mars landing, ascent, and rendezvous training. In addition, a substantial increase in the use of automation and robotics for hazardous and tedious tasks would be expected for Mars missions. Mobile robots may be used to assist in the assembly, test and checkout of the Mars spacecraft, in the handling of nuclear components and hazardous chemical propellant transfer operations, in major spacecraft repair tasks which might be needed (repair of a micrometeoroid penetration, for example), in the construction of a Mars base, and for routine maintenance of the base when unmanned.

  14. Identifying Requirements for Effective Human-Automation Teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; John O'Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for human operators, including lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed-agent teams to identify the limitations of automation agents in conforming to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  15. Single-column extraction chromatographic separation of U, Pu, Np and Am

    Energy Technology Data Exchange (ETDEWEB)

    Morgenstern, A.; Apostolidis, C.; Carlos-Marquez, R.; Mayer, K.; Molinet, R. [Commission of the European Communities, Karlsruhe (Germany). European Inst. for Transuranium Elements

    2002-07-01

    A rapid, single-column extraction chromatographic method using commercially available UTEVA resin has been developed for the separation of uranium, plutonium, neptunium and americium. The method yields recoveries above 90% and allows direct loading of the separated fractions onto filaments for subsequent analysis by thermal ionization mass spectrometry. The use of reagents compatible with robotized equipment allows automation of the separation process for routine analysis of nuclear materials. The redox reactions between plutonium, neptunium and hydrogen peroxide involved in the separation process were studied by UV/Vis/NIR absorption spectroscopy. (orig.)

  16. Combination of Complex-Based and Magnitude-Based Multiecho Water-Fat Separation for Accurate Quantification of Fat-Fraction

    Science.gov (United States)

    Yu, Huanzhou; Shimakawa, Ann; Hines, Catherine D. G.; McKenzie, Charles A.; Hamilton, Gavin; Sirlin, Claude B.; Brittain, Jean H.; Reeder, Scott B.

    2011-01-01

    Multipoint water–fat separation techniques rely on the different water–fat phase shifts generated at multiple echo times to decompose water and fat. These methods therefore require complex source images, which allow unambiguous separation of the water and fat signals. However, complex-based water–fat separation methods are sensitive to phase errors in the source images, which may lead to clinically important errors. An alternative approach to quantifying fat is through “magnitude-based” methods that acquire multiecho magnitude images. Magnitude-based methods are insensitive to phase errors, but cannot estimate fat-fractions greater than 50%. In this work, we introduce a water–fat separation approach that combines the strengths of both complex and magnitude reconstruction algorithms. A magnitude-based reconstruction is applied after complex-based water–fat separation to remove the effect of phase errors. The results from the two reconstructions are then combined. We demonstrate that with this hybrid method, 0–100% fat-fraction can be estimated with improved accuracy at low fat-fractions. PMID:21695724
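    The 50% ambiguity of magnitude-based methods can be made concrete with a deliberately simplified combination rule: keep the phase-robust magnitude estimate, and use the complex estimate only to choose between the two magnitude-consistent solutions ff and 1 - ff. This is an illustrative reduction of the idea behind the hybrid approach, not the published reconstruction algorithm, and the numeric values are invented:

```python
def hybrid_fat_fraction(ff_complex, ff_magnitude):
    """Combine a complex-based fat-fraction estimate (full 0-1 dynamic
    range, but phase-error prone) with a magnitude-based estimate
    (phase-robust, but ambiguous about the 50% point): the complex
    estimate only picks which magnitude-consistent solution is correct."""
    candidates = (ff_magnitude, 1.0 - ff_magnitude)
    # keep the magnitude solution closest to the complex estimate
    return min(candidates, key=lambda c: abs(c - ff_complex))

# A magnitude method reporting 0.30 could mean 30% or 70% fat; a complex
# estimate of 0.72 (possibly biased by phase errors) resolves this to 70%,
# while the final value retains the magnitude method's phase robustness.
ff = hybrid_fat_fraction(ff_complex=0.72, ff_magnitude=0.30)
```

    If the complex estimate instead read 0.25, the same rule would return 0.30, i.e. the low-fat solution.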

  17. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz, J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  18. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements, with measurement uncertainties provided by a measurement standard, and the corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10-minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with those from wind vanes...
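    The calibration relation described above is, at its core, a fit between paired 10-minute mean wind speeds from the reference instrument and the lidar. The sketch below shows an ordinary least-squares version on synthetic campaign data; the 1% lidar bias, the 4-16 m/s range, and the sample size are invented, and the uncertainty propagation required by the actual procedure is omitted:

```python
import numpy as np

def lidar_calibration(v_ref, v_lidar):
    """Fit the linear calibration relation v_lidar = a * v_ref + b between
    reference and lidar 10-minute mean wind speeds, and report the
    residual standard deviation as a simple scatter measure."""
    a, b = np.polyfit(v_ref, v_lidar, 1)
    residuals = v_lidar - (a * v_ref + b)
    return a, b, residuals.std(ddof=2)

# Synthetic campaign: lidar reads ~1% high with 0.05 m/s random scatter.
rng = np.random.default_rng(2)
v_ref = rng.uniform(4.0, 16.0, 200)                  # m/s
v_lidar = 1.01 * v_ref + rng.normal(0.0, 0.05, 200)  # m/s
a, b, s = lidar_calibration(v_ref, v_lidar)
```

    In a real campaign the slope and offset would be reported per wind-speed bin together with propagated reference and lidar uncertainties, not as a single global fit.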

  19. Automated diabetic retinopathy detection in smartphone-based fundus photography using artificial intelligence.

    Science.gov (United States)

    Rajalakshmi, Ramachandran; Subashini, Radhakrishnan; Anjana, Ranjit Mohan; Mohan, Viswanathan

    2018-06-01

    To assess the role of artificial intelligence (AI)-based automated software for detection of diabetic retinopathy (DR) and sight-threatening DR (STDR) by fundus photography taken using a smartphone-based device and validate it against ophthalmologist's grading. Three hundred and one patients with type 2 diabetes underwent retinal photography with Remidio 'Fundus on phone' (FOP), a smartphone-based device, at a tertiary care diabetes centre in India. Grading of DR was performed by the ophthalmologists using the International Clinical DR (ICDR) classification scale. STDR was defined by the presence of severe non-proliferative DR, proliferative DR or diabetic macular oedema (DME). The retinal photographs were graded using a validated AI DR screening software (EyeArt™) designed to identify DR, referable DR (moderate non-proliferative DR or worse and/or DME) or STDR. The sensitivity and specificity of automated grading were assessed and validated against the ophthalmologists' grading. Retinal images of 296 patients were graded. DR was detected by the ophthalmologists in 191 (64.5%) and by the AI software in 203 (68.6%) patients, while STDR was detected in 112 (37.8%) and 146 (49.3%) patients, respectively. The AI software showed 95.8% (95% CI 92.9-98.7) sensitivity and 80.2% (95% CI 72.6-87.8) specificity for detecting any DR and 99.1% (95% CI 95.1-99.9) sensitivity and 80.4% (95% CI 73.9-85.9) specificity in detecting STDR, with a kappa agreement of k = 0.78 (p < 0.001) and k = 0.75 (p < 0.001), respectively. Automated AI analysis of FOP smartphone retinal imaging has very high sensitivity for detecting DR and STDR and thus can be an initial tool for mass retinal screening in people with diabetes.
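    The reported sensitivity and specificity follow directly from a 2x2 confusion table of AI grades versus the ophthalmologist reference. The counts below are an illustrative reconstruction consistent with the any-DR figures in the abstract (191 reference-positive and 105 reference-negative of 296 graded); the paper itself reports only the rates:

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity of an automated grader against the
    reference (here: ophthalmologist) grading.
    sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts matching ~95.8% sensitivity on 191 positives and
# ~80% specificity on 105 negatives for any-DR detection:
sens, spec = screening_metrics(tp=183, fn=8, tn=84, fp=21)
```

    Note that the 68.6% AI detection rate versus 64.5% reference prevalence reflects the false positives (high sensitivity bought at some cost in specificity), which is the usual trade-off for a screening-first tool.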

  20. Automated coating procedures to produce poly(ethylene glycol) brushes in fused-silica capillaries

    DEFF Research Database (Denmark)

    Poulsen, Nicklas N.; Østergaard, Jesper; Petersen, Nickolaj J.

    2017-01-01

    Flexible and reliable approaches for preventing unwanted protein adsorption in separation science are thus in high demand. We therefore present new coating approaches based on an automated in-capillary surface-initiated atom transfer radical polymerization process (covalent coating), as well as on electrostatic adsorption of a pre-synthesized polymer, leading to functionalized molecular brushes. The electroosmotic flow was measured following each step of the covalent coating procedure, providing a detailed characterization and quality control. Both approaches resulted in good fouling resistance against...
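    The electroosmotic flow monitoring mentioned above is conventionally quantified as an apparent mobility computed from the migration time of a neutral marker. A small sketch of that standard CE relation, mu = (L_d * L_t) / (V * t); the capillary dimensions, voltage and marker time below are invented example values, not data from the study:

```python
def eof_mobility(l_total_cm, l_detector_cm, voltage_v, t_marker_s):
    """Apparent electroosmotic mobility from a neutral-marker migration
    time in capillary electrophoresis:
        mu = (L_d * L_t) / (V * t)
    with L_d = length to detector, L_t = total capillary length.
    Result is in cm^2 / (V * s)."""
    return (l_detector_cm * l_total_cm) / (voltage_v * t_marker_s)

# e.g. a 60 cm capillary, 50 cm to the detector, 20 kV applied,
# neutral marker arriving after 180 s:
mu = eof_mobility(l_total_cm=60, l_detector_cm=50,
                  voltage_v=20_000, t_marker_s=180)
```

    Tracking mu after each coating step, as the abstract describes, shows the EOF suppression expected from a successful brush coating (a bare fused-silica capillary typically has mu on the order of a few 1e-4 cm^2/(V*s)).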