WorldWideScience

Sample records for mapmi automated mapping

  1. System of automated map design

    International Nuclear Information System (INIS)

    Ponomarjov, S.Yu.; Rybalko, S.I.; Proskura, N.I.

    1992-01-01

    The preprint 'System of automated map design' describes a program shell for constructing territory maps by drawing contour lines of an arbitrary two-dimensional field (in particular, a radionuclide concentration field). The work schedule and data structures are presented, along with data on system performance. The preprint may be useful to experts in radioecology and to anyone involved in territory pollution mapping or multi-purpose geochemical mapping. (author)

  2. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation can be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software tool that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  3. Automated mapping of soybean and corn using phenology

    Science.gov (United States)

    Zhong, Liheng; Hu, Lina; Yu, Le; Gong, Peng; Biging, Gregory S.

    2016-09-01

    For two of the most important agricultural commodities, soybean and corn, remote sensing plays a substantial role in delivering timely information on crop area for economic, environmental and policy studies. Traditional long-term mapping of soybean and corn is challenging as a result of the high cost of repeated training data collection, the inconsistency in image processing and interpretation, and the difficulty of handling the inter-annual variability of weather and crop progress. In this study, we developed an automated approach to map soybean and corn in the state of Paraná, Brazil for crop years 2010-2015. The core of the approach is a decision tree classifier with rules manually built on the basis of expert interaction for repeated use. The automated approach is advantageous for its capacity for multi-year mapping without the need to re-train or re-calibrate the classifier. The time series MODerate-resolution Imaging Spectroradiometer (MODIS) reflectance product (MCD43A4) was employed to derive vegetation phenology and identify soybean and corn based on the crop calendar. To deal with the phenological similarity between soybean and corn, the surface reflectance of the shortwave infrared band scaled to a phenological stage was used to fully separate the two crops. Results suggested that the mapped areas of soybean and corn agreed with official statistics at the municipal level. The resultant map for the crop year 2012 was evaluated using an independent reference data set, and the overall accuracy and Kappa coefficient were 87.2% and 0.804 respectively. As a result of the mixed-pixel effect at the 500 m resolution, classification results were biased depending on topography. In flat, broad and highly cropped areas, uncultivated lands were likely to be identified as soybean or corn, causing over-estimation of cropland area. By contrast, scattered crop fields in mountainous regions with dense natural vegetation tended to be overlooked. For future mapping efforts, it has great
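    The abstract describes a rule-based decision tree driven by MODIS phenology, with SWIR reflectance sampled at a fixed phenological stage to split soybean from corn. As a rough illustration of that idea only (the greenness index, the thresholds, the summer-crop window, and the direction of the SWIR split below are all assumptions, not the paper's calibrated rules), a per-pixel classifier might look like:

```python
import numpy as np

def classify_pixel(doy, greenness, swir, green_thresh=0.5):
    """Toy phenology rules in the spirit of a manually built decision tree.
    doy, greenness, swir: 1-D arrays for one pixel's annual time series."""
    if greenness.max() < green_thresh:            # no strong green-up: not a summer crop
        return "other"
    peak = int(np.argmax(greenness))
    peak_doy = int(doy[peak])
    if not (peak_doy >= 300 or peak_doy <= 90):   # illustrative Southern Hemisphere window
        return "other"
    # SWIR reflectance at a fixed phenological stage (here: peak greenness)
    # separates the two crops; threshold and direction are placeholders.
    return "soybean" if swir[peak] < 0.25 else "corn"
```

    Because the rules are fixed rather than trained, the same function can be reapplied to every crop year without re-calibration, which is the property the authors emphasize.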

  4. An automated approach to mapping corn from Landsat imagery

    Science.gov (United States)

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
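    The key idea here is replacing costly ground-reference training with readily available areal statistics. One hedged reading of how such statistics could drive the three likelihood classes (the per-pixel ranking score, the 1.5× buffer, and the exact cutoffs are illustrative assumptions, not the published procedure):

```python
import numpy as np

def corn_likelihood(score, county_corn_area_ha, pixel_area_ha=0.09):
    """Rank pixels by a corn-likeness score and cut the ranking where the
    cumulative mapped area matches the reported corn acreage."""
    order = np.argsort(score, axis=None)[::-1]          # most corn-like first
    n_corn = int(county_corn_area_ha / pixel_area_ha)   # pixels needed to match statistics
    labels = np.full(score.size, "unlikely corn", dtype=object)
    labels[order[:n_corn]] = "highly likely corn"
    labels[order[n_corn:int(1.5 * n_corn)]] = "likely corn"
    return labels.reshape(score.shape)
```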

  5. Automated grain mapping using wide angle convergent beam electron diffraction in transmission electron microscope for nanomaterials.

    Science.gov (United States)

    Kumar, Vineet

    2011-12-01

    Grain size statistics, commonly derived from the grain map of a material sample, are important microstructure characteristics that greatly influence its properties. The grain map for nanomaterials is usually obtained manually by visual inspection of transmission electron microscope (TEM) micrographs, because automated methods do not perform satisfactorily. While the visual inspection method provides reliable results, it is a labor-intensive process and is often prone to human error. In this article, an automated grain mapping method is developed using TEM diffraction patterns. The presented method uses wide angle convergent beam diffraction in the TEM. The automated technique was applied to a platinum thin film sample to obtain the grain map and subsequently derive grain size statistics from it. The grain size statistics obtained with the automated method were found to be in good agreement with those from the visual inspection method.

  6. Quantitative DMS mapping for automated RNA secondary structure inference

    OpenAIRE

    Cordero, Pablo; Kladwang, Wipapat; VanLang, Christopher C.; Das, Rhiju

    2012-01-01

    For decades, dimethyl sulfate (DMS) mapping has informed manual modeling of RNA structure in vitro and in vivo. Here, we incorporate DMS data into automated secondary structure inference using a pseudo-energy framework developed for 2'-OH acylation (SHAPE) mapping. On six non-coding RNAs with crystallographic models, DMS-guided modeling achieves overall false negative and false discovery rates of 9.5% and 11.6%, comparable or better than SHAPE-guided modeling; and non-parametric bootstrappin...

  7. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
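    The per-pixel logic described here reduces to two thresholds: an NDSI test on each late-summer scene and a persistence test across the multi-year stack. A minimal numpy sketch of that logic (the 0.4 NDSI cutoff is a commonly used value and the 0.8 persistence ratio is an assumption; the paper tunes both thresholds itself):

```python
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index from TM/ETM+ green and SWIR bands."""
    return (green - swir) / (green + swir + 1e-9)

def persistent_ice_snow(green_stack, swir_stack, clear_mask,
                        ndsi_thresh=0.4, ratio_thresh=0.8):
    """green_stack, swir_stack, clear_mask: (scenes, rows, cols) arrays;
    clear_mask is True where the automated cloud mask passed the pixel."""
    snow = (ndsi(green_stack, swir_stack) > ndsi_thresh) & clear_mask
    # ratio of snow/ice-covered observations to total clear observations
    ratio = snow.sum(axis=0) / np.maximum(clear_mask.sum(axis=0), 1)
    return ratio >= ratio_thresh
```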

  8. Open-Source Automated Mapping Four-Point Probe

    Directory of Open Access Journals (Sweden)

    Handy Chandra

    2017-01-01

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.

  9. Open-Source Automated Mapping Four-Point Probe.

    Science.gov (United States)

    Chandra, Handy; Allen, Spencer W; Oberloier, Shane W; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M

    2017-01-26

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results of resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.
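    For context, the sheet-resistance arithmetic behind any collinear four-point probe, the OS4PP included, is compact. A sketch assuming the standard textbook case of a film much thinner than, and laterally much larger than, the probe spacing (this is general physics, not code from the project):

```python
import math

def sheet_resistance(voltage_v, current_a):
    """Sheet resistance (ohms/square) for a collinear four-point probe on a
    thin, laterally infinite film: Rs = (pi / ln 2) * V / I ~= 4.532 * V / I."""
    return (math.pi / math.log(2)) * voltage_v / current_a

def resistivity(voltage_v, current_a, thickness_m):
    """Bulk resistivity (ohm*m) once the film thickness is known."""
    return sheet_resistance(voltage_v, current_a) * thickness_m

# Example: a 2.26 mV drop at 1 mA gives Rs ~= 10.24 ohms/square
print(sheet_resistance(2.26e-3, 1e-3))
```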

  10. Automated detection of qualitative spatio-temporal features in electrocardiac activation maps.

    Science.gov (United States)

    Ironi, Liliana; Tentoni, Stefania

    2007-02-01

    This paper describes work aiming at the realization of a tool for the automated interpretation of electrocardiac maps. Such maps can capture a number of electrical conduction pathologies, such as arrhythmia, that can be missed by the analysis of traditional electrocardiograms. However, their introduction into clinical practice is still far away, as their interpretation requires skills that belong to very few experts. An automated interpretation tool would therefore bridge the gap between the established research outcome and clinical practice, with a consequent great impact on health care. Qualitative spatial reasoning can play a crucial role in the identification of spatio-temporal patterns and salient features that characterize the heart's electrical activity. We adopted the spatial aggregation (SA) conceptual framework and an interplay of numerical and qualitative information to extract features from epicardial maps and make them available for reasoning tasks. Our focus is on epicardial activation isochrone maps, as they are a synthetic representation of spatio-temporal aspects of the propagation of the electrical excitation. We provide a computational SA-based methodology to extract, from 3D epicardial data gathered over time, (1) the excitation wavefront structure, and (2) the salient features that characterize wavefront propagation and visually correspond to specific geometric objects. The proposed methodology provides a robust and efficient way to identify salient pieces of information in activation time maps. The hierarchical structure of the abstracted geometric objects, crucial in capturing the prominent information, facilitates the definition of general rules necessary to infer the correlation between pathophysiological patterns and wavefront structure and propagation.

  11. Automated mapping of persistent ice and snow cover across the western U.S. with Landsat

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We implemented an automated approach for mapping persistent ice and snow cover (PISC) across the conterminous western U.S. using all available Landsat TM and ETM+ scenes acquired during the late summer/early fall period between 2010 and 2014. Two separate validation approaches indicate this dataset provides a more accurate representation of glacial ice and perennial snow cover for the region than either the U.S. glacier database derived from US Geological Survey (USGS) Digital Raster Graphics (DRG) maps (based on aerial photography primarily from the 1960s–1980s) or the National Land Cover Database 2011 perennial ice and snow cover class. Our 2010–2014 Landsat-derived dataset indicates 28% less glacier and perennial snow cover than the USGS DRG dataset. There are larger differences between the datasets in some regions, such as the Rocky Mountains of Northwest Wyoming and Southwest Montana, where the Landsat dataset indicates 54% less PISC area. Analysis of Landsat scenes from 1987–1988 and 2008–2010 for three regions using a more conventional, semi-automated approach indicates substantial decreases in glaciers and perennial snow cover that correlate with differences between PISC mapped by the USGS DRG dataset and the automated Landsat-derived dataset. This suggests that most of the differences in PISC between the USGS DRG and the Landsat-derived dataset can be attributed to decreases in PISC, as opposed to differences between mapping techniques. While the dataset produced by the automated Landsat mapping approach is not designed to serve as a conventional glacier inventory that provides glacier outlines and attribute information, it allows for an updated estimate of PISC for the conterminous U.S. as well as for smaller regions. Additionally, the new dataset highlights areas where decreases in PISC have been most significant over the past 25–50 years.

  12. Improving automated disturbance maps using snow-covered landsat time series stacks

    Science.gov (United States)

    Kirk M. Stueve; Ian W. Housman; Patrick L. Zimmerman; Mark D. Nelson; Jeremy Webb; Charles H. Perry; Robert A. Chastain; Dale D. Gormanson; Chengquan Huang; Sean P. Healey; Warren B. Cohen

    2012-01-01

    Snow-covered winter Landsat time series stacks are used to develop a nonforest mask to enhance automated disturbance maps produced by the Vegetation Change Tracker (VCT). This method exploits the enhanced spectral separability between forested and nonforested areas that occurs with sufficient snow cover. This method resulted in significant improvements in Vegetation...

  13. Towards automated statewide land cover mapping in Wisconsin using satellite remote sensing and GIS techniques

    International Nuclear Information System (INIS)

    Cosentino, B.L.; Lillesand, T.M.

    1991-01-01

    Attention is given to an initial research project being performed by the University of Wisconsin-Madison, Environmental Remote Sensing Center in conjunction with seven local, state, and federal agencies to implement automated statewide land cover mapping using satellite remote sensing and geographical information system (GIS) techniques. The basis, progress, and future research needs for this mapping program are presented. The research efforts are directed toward strategies that integrate satellite remote sensing and GIS techniques in the generation of land cover data for multiple users of land cover information. The project objectives are to investigate methodologies that integrate satellite data with other imagery and spatial data resident in emerging GISs in the state for particular program needs, and to develop techniques that can improve automated land cover mapping efficiency and accuracy. 10 refs

  14. Comparison of Manual Mapping and Automated Object-Based Image Analysis of Non-Submerged Aquatic Vegetation from Very-High-Resolution UAS Images

    Directory of Open Access Journals (Sweden)

    Eva Husson

    2016-09-01

    Aquatic vegetation has important ecological and regulatory functions and should be monitored in order to detect ecosystem changes. Field data collection is often costly and time-consuming; remote sensing with unmanned aircraft systems (UASs) provides aerial images with sub-decimetre resolution and offers a potential data source for vegetation mapping. In a manual mapping approach, UAS true-colour images with 5-cm-resolution pixels allowed for the identification of non-submerged aquatic vegetation at the species level. However, manual mapping is labour-intensive, and while automated classification methods are available, they have rarely been evaluated for aquatic vegetation, particularly at the scale of individual vegetation stands. We evaluated classification accuracy and time-efficiency for mapping non-submerged aquatic vegetation at three levels of detail at five test sites (100 m × 100 m) differing in vegetation complexity. We used object-based image analysis and tested two classification methods (threshold classification and Random Forest) using eCognition®. The automated classification results were compared to results from manual mapping. Using threshold classification, overall accuracy at the five test sites ranged from 93% to 99% for the water-versus-vegetation level and from 62% to 90% for the growth-form level. Using Random Forest classification, overall accuracy ranged from 56% to 94% for the growth-form level and from 52% to 75% for the dominant-taxon level. Overall classification accuracy decreased with increasing vegetation complexity. In test sites with more complex vegetation, automated classification was more time-efficient than manual mapping. This study demonstrated that automated classification of non-submerged aquatic vegetation from true-colour UAS images was feasible, indicating good potential for operative mapping of aquatic vegetation. When choosing the preferred mapping method (manual versus automated), the desired level of

  15. An algorithm for automated layout of process description maps drawn in SBGN.

    Science.gov (United States)

    Genc, Begum; Dogrusoz, Ugur

    2016-01-01

    Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specialize on process description (PD) maps as defined by SBGN. We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes and extensively making use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. An implementation of our algorithm in Java is available within ChiLay library (https://github.com/iVis-at-Bilkent/chilay). ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
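    CoSE belongs to the force-directed (spring embedder) family that this algorithm builds on. A bare-bones Fruchterman-Reingold-style iteration conveys the base force scheme onto which the SBGN-specific forces and movement rules are layered (a generic sketch, not the ChiLay implementation; compound-node handling and SBGN rules are omitted):

```python
import numpy as np

def spring_layout(n_nodes, edges, iters=300, k=1.0, step=0.05, seed=0):
    """Generic force-directed layout: pairwise repulsion plus spring
    attraction along edges, with a capped displacement per iteration."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, size=(n_nodes, 2))
    for _ in range(iters):
        disp = np.zeros_like(pos)
        for i in range(n_nodes):                    # repulsion ~ k^2 / d
            d = pos[i] - pos
            dist = np.maximum(np.linalg.norm(d, axis=1), 1e-9)
            dist[i] = np.inf                        # skip self-interaction
            disp[i] += (d * (k**2 / dist**2)[:, None]).sum(axis=0)
        for a, b in edges:                          # attraction ~ d^2 / k
            d = pos[a] - pos[b]
            dist = np.linalg.norm(d) + 1e-9
            force = d * (dist / k)
            disp[a] -= force
            disp[b] += force
        length = np.linalg.norm(disp, axis=1, keepdims=True) + 1e-9
        pos += disp / length * np.minimum(length, step)
    return pos

# Tiny usage example: a 4-cycle settles into a rough square
print(spring_layout(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))
```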

  16. Automated, non-linear registration between 3-dimensional brain map and medical head image

    International Nuclear Information System (INIS)

    Mizuta, Shinobu; Urayama, Shin-ichi; Zoroofi, R.A.; Uyama, Chikao

    1998-01-01

    In this paper, we propose an automated, non-linear registration method between a 3-dimensional medical head image and a brain map in order to efficiently extract the regions of interest. In our method, the input 3-dimensional image is registered onto a reference image extracted from a brain map. The problems to be solved are the automated, non-linear image matching procedure and the cost function representing the similarity between two images. Non-linear matching is carried out by dividing the input image into connected partial regions, transforming the partial regions while preserving connectivity among the adjacent regions, evaluating the image similarity between the transformed regions of the input image and the corresponding regions of the reference image, and iteratively searching for the optimal transformation of the partial regions. In order to measure the voxelwise similarity of multi-modal images, a cost function based on mutual information is introduced. Experiments using MR images demonstrated the effectiveness of the proposed method. (author)
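    The cost function named in the abstract, mutual information, can be computed directly from a joint intensity histogram of the two images; the optimizer then searches transformations that maximize it. A minimal sketch of that similarity measure (the bin count is an arbitrary choice here; the paper's exact estimator is not specified):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two co-registered images, estimated
    from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)   # marginal of image B
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```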

  17. Fast and Accurate Approaches for Large-Scale, Automated Mapping of Food Diaries on Food Composition Tables

    Directory of Open Access Journals (Sweden)

    Marc Lamarine

    2018-05-01

    Aim of Study: The use of weighed food diaries in nutritional studies provides a powerful method to quantify food and nutrient intakes. Yet, mapping these records onto food composition tables (FCTs) is a challenging, time-consuming and error-prone process. Experts make this effort manually and no automation has been previously proposed. Our study aimed to assess automated approaches to map food items onto FCTs. Methods: We used food diaries (~170,000 records pertaining to 4,200 unique food items) from the DiOGenes randomized clinical trial. We attempted to map these items onto six FCTs available from the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English translation. Top matching pairs were reviewed manually to derive performance metrics: precision (the percentage of correctly mapped items) and recall (the percentage of mapped items). Results: The simpler approach, fuzzy matching, provided very good performance. Under a relaxed threshold (score > 50%), this approach enabled the remapping of 99.49% of the items with a precision of 88.75%. With a slightly more stringent threshold (score > 63%), the precision could be significantly improved to 96.81% while keeping a recall rate > 95% (i.e., only 5% of the queried items would not be mapped). The machine learning approach did not lead to any improvements compared to the fuzzy matching. However, it could substantially increase the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names, respectively). Our approaches have been implemented as R packages and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We
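    The simpler of the two approaches, fuzzy name matching against a score threshold, is easy to sketch with the Python standard library (difflib's ratio is a stand-in similarity function here, since the paper does not specify its scorer, and the 0.63 threshold simply mirrors the "score > 63%" operating point; the FCT entries in the example are hypothetical):

```python
from difflib import SequenceMatcher

def best_match(food_item, fct_names, threshold=0.63):
    """Return the best-scoring FCT entry for a diary item, or None if the
    score falls below the operating threshold."""
    scored = [(SequenceMatcher(None, food_item.lower(), name.lower()).ratio(), name)
              for name in fct_names]
    score, name = max(scored)
    return (name, score) if score >= threshold else None

# Hypothetical FCT entries for illustration
print(best_match("orange juice", ["Orange juice, fresh", "Apple juice", "Orange marmalade"]))
```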

  18. Assessing the Agreement Between Eo-Based Semi-Automated Landslide Maps with Fuzzy Manual Landslide Delineation

    Science.gov (United States)

    Albrecht, F.; Hölbling, D.; Friedl, B.

    2017-09-01

    Landslide mapping benefits from the ever increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, there arises the need for improved automated landslide information extraction processes from EO data while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for the fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  19. ASSESSING THE AGREEMENT BETWEEN EO-BASED SEMI-AUTOMATED LANDSLIDE MAPS WITH FUZZY MANUAL LANDSLIDE DELINEATION

    Directory of Open Access Journals (Sweden)

    F. Albrecht

    2017-09-01

    Landslide mapping benefits from the ever increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, there arises the need for improved automated landslide information extraction processes from EO data while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for the fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  20. Snow-covered Landsat time series stacks improve automated disturbance mapping accuracy in forested landscapes

    Science.gov (United States)

    Kirk M. Stueve; Ian W. Housman; Patrick L. Zimmerman; Mark D. Nelson; Jeremy B. Webb; Charles H. Perry; Robert A. Chastain; Dale D. Gormanson; Chengquan Huang; Sean P. Healey; Warren B. Cohen

    2011-01-01

    Accurate landscape-scale maps of forests and associated disturbances are critical to augment studies on biodiversity, ecosystem services, and the carbon cycle, especially in terms of understanding how the spatial and temporal complexities of damage sustained from disturbances influence forest structure and function. Vegetation change tracker (VCT) is a highly automated...

  21. Intelligent Machine Vision for Automated Fence Intruder Detection Using Self-organizing Map

    OpenAIRE

    Veldin A. Talorete Jr.; Sherwin A Guirnaldo

    2017-01-01

    This paper presents an intelligent machine vision for automated fence intruder detection. A series of still captured images that contain fence events using Internet Protocol cameras was used as input data to the system. Two classifiers were used; the first is to classify human posture and the second one will classify intruder location. The system classifiers were implemented using Self-Organizing Map after the implementation of several image segmentation processes. The human posture classifie...

  22. Mapping of brain activity by automated volume analysis of immediate early genes

    Science.gov (United States)

    Renier, Nicolas; Adams, Eliza L.; Kirst, Christoph; Wu, Zhuhao; Azevedo, Ricardo; Kohl, Johannes; Autry, Anita E.; Kadiri, Lolahon; Venkataraju, Kannan Umadevi; Zhou, Yu; Wang, Victoria X.; Tang, Cheuk Y.; Olsen, Olav; Dulac, Catherine; Osten, Pavel; Tessier-Lavigne, Marc

    2016-01-01

    Understanding how neural information is processed in physiological and pathological states would benefit from precise detection, localization and quantification of the activity of all neurons across the entire brain, which has not to date been achieved in the mammalian brain. We introduce a pipeline for high speed acquisition of brain activity at cellular resolution through profiling immediate early gene expression using immunostaining and light-sheet fluorescence imaging, followed by automated mapping and analysis of activity by an open-source software program we term ClearMap. We validate the pipeline first by analysis of brain regions activated in response to Haloperidol. Next, we report new cortical regions downstream of whisker-evoked sensory processing during active exploration. Lastly, we combine activity mapping with axon tracing to uncover new brain regions differentially activated during parenting behavior. This pipeline is widely applicable to different experimental paradigms, including animal species for which transgenic activity reporters are not readily available. PMID:27238021

  23. Automated Identification of Coronal Holes from Synoptic EUV Maps

    Science.gov (United States)

    Hamada, Amr; Asikainen, Timo; Virtanen, Ilpo; Mursula, Kalevi

    2018-04-01

    Coronal holes (CHs) are regions of open magnetic field lines in the solar corona and the source of the fast solar wind. Understanding the evolution of coronal holes is critical for solar magnetism as well as for accurate space weather forecasts. We study the extreme ultraviolet (EUV) synoptic maps at three wavelengths (195 Å/193 Å, 171 Å and 304 Å) measured by the Solar and Heliospheric Observatory/Extreme Ultraviolet Imaging Telescope (SOHO/EIT) and the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) instruments. The two datasets are first homogenized by scaling the SDO/AIA data to the SOHO/EIT level by means of histogram equalization. We then develop a novel automated method to identify CHs from these homogenized maps by determining the intensity threshold of CH regions separately for each synoptic map. This is done by identifying the best location and size of an image segment, which optimally contains portions of coronal holes and the surrounding quiet Sun allowing us to detect the momentary intensity threshold. Our method is thus able to adjust itself to the changing scale size of coronal holes and to temporally varying intensities. To make full use of the information in the three wavelengths we construct a composite CH distribution, which is more robust than distributions based on one wavelength. Using the composite CH dataset we discuss the temporal evolution of CHs during the Solar Cycles 23 and 24.
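    The homogenization step, scaling SDO/AIA intensities to the SOHO/EIT level by means of histogram equalization, amounts to matching the two instruments' cumulative intensity distributions. A standard numpy sketch of histogram matching (a generic implementation, not the authors' code):

```python
import numpy as np

def match_histogram(source, template):
    """Remap source pixel values so their cumulative histogram matches the
    template's (e.g. scale an AIA synoptic map to the EIT intensity level)."""
    s_vals, s_inv, s_cnt = np.unique(source.ravel(),
                                     return_inverse=True, return_counts=True)
    t_vals, t_cnt = np.unique(template.ravel(), return_counts=True)
    s_q = np.cumsum(s_cnt) / source.size      # source quantiles
    t_q = np.cumsum(t_cnt) / template.size    # template quantiles
    matched = np.interp(s_q, t_q, t_vals)     # map source quantiles to template values
    return matched[s_inv].reshape(source.shape)
```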

  24. Mind the gap! Automated concept map feedback supports students in writing cohesive explanations.

    Science.gov (United States)

    Lachner, Andreas; Burkhart, Christian; Nückles, Matthias

    2017-03-01

    Many students are challenged with the demand of writing cohesive explanations. To support students in writing cohesive explanations, we developed a computer-based feedback tool that visualizes cohesion deficits of students' explanations in a concept map. We conducted three studies to investigate the effectiveness of such feedback as well as the underlying cognitive processes. In Study 1, we found that the concept map helped students identify potential cohesion gaps in their drafts and plan remedial revisions. In Study 2, students with concept map feedback conducted revisions that resulted in more locally and globally cohesive, and also more comprehensible, explanations than the explanations of students who revised without concept map feedback. In Study 3, we replicated the findings of Study 2 by and large. More importantly, students who had received concept map feedback on a training explanation 1 week later wrote a transfer explanation without feedback that was more cohesive than the explanation of students who had received no feedback on their training explanation. The automated concept map feedback appears to particularly support the evaluation phase of the revision process. Furthermore, the feedback enabled novice writers to acquire sustainable skills in writing cohesive explanations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  25. LC-MS/MS Peptide Mapping with Automated Data Processing for Routine Profiling of N-Glycans in Immunoglobulins

    Science.gov (United States)

    Shah, Bhavana; Jiang, Xinzhao Grace; Chen, Louise; Zhang, Zhongqi

    2014-06-01

    Protein N-Glycan analysis is traditionally performed by high pH anion exchange chromatography (HPAEC), reversed phase liquid chromatography (RPLC), or hydrophilic interaction liquid chromatography (HILIC) on fluorescence-labeled glycans enzymatically released from the glycoprotein. These methods require time-consuming sample preparations and do not provide site-specific glycosylation information. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) peptide mapping is frequently used for protein structural characterization and, as a bonus, can potentially provide glycan profile on each individual glycosylation site. In this work, a recently developed glycopeptide fragmentation model was used for automated identification, based on their MS/MS, of N-glycopeptides from proteolytic digestion of monoclonal antibodies (mAbs). Experimental conditions were optimized to achieve accurate profiling of glycoforms. Glycan profiles obtained from LC-MS/MS peptide mapping were compared with those obtained from HPAEC, RPLC, and HILIC analyses of released glycans for several mAb molecules. Accuracy, reproducibility, and linearity of the LC-MS/MS peptide mapping method for glycan profiling were evaluated. The LC-MS/MS peptide mapping method with fully automated data analysis requires less sample preparation, provides site-specific information, and may serve as an alternative method for routine profiling of N-glycans on immunoglobulins as well as other glycoproteins with simple N-glycans.

  26. A new software routine that automates the fitting of protein X-ray crystallographic electron-density maps.

    Science.gov (United States)

    Levitt, D G

    2001-07-01

    The classical approach to building the amino-acid residues into the initial electron-density map requires days to weeks of a skilled investigator's time. Automating this procedure should not only save time, but has the potential to provide a more accurate starting model for input to refinement programs. The new software routine MAID builds the protein structure into the electron-density map in a series of sequential steps. The first step is the fitting of the secondary alpha-helix and beta-sheet structures. These 'fits' are then used to determine the local amino-acid sequence assignment. These assigned fits are then extended through the loop regions and fused with the neighboring sheet or helix. The program was tested on the unaveraged 2.5 A selenomethionine multiple-wavelength anomalous dispersion (SMAD) electron-density map that was originally used to solve the structure of the 291-residue protein human heart short-chain L-3-hydroxyacyl-CoA dehydrogenase (SHAD). Inputting just the map density and the amino-acid sequence, MAID fitted 80% of the residues with an r.m.s.d. error of 0.43 A for the main-chain atoms and 1.0 A for all atoms without any user intervention. When tested on a higher quality 1.9 A SMAD map, MAID correctly fitted 100% (418) of the residues. A major advantage of the MAID fitting procedure is that it maintains ideal bond lengths and angles and constrains phi/psi angles to the appropriate Ramachandran regions. Recycling the output of this new routine through a partial structure-refinement program may have the potential to completely automate the fitting of electron-density maps.

  27. Automated pattern recognition to support geological mapping and exploration target generation: a case study from southern Namibia

    CSIR Research Space (South Africa)

    Eberle, D

    2015-06-01

    This paper demonstrates a methodology for the automatic joint interpretation of high resolution airborne geophysical and space-borne remote sensing data to support geological mapping in a largely automated, fast and objective manner. At the request...

  28. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  29. Automated integration of genomic physical mapping data via parallel simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, EcoRI restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, BAC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects, by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps on the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
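    Framed as the abstract frames it, permuting the N cosmid-clone columns of an M x N membership matrix so each row's members become contiguous, the annealing core is small. A single-threaded sketch under those assumptions (the real system ran in parallel over a socket-based server/client network and enforced FISH ordering constraints, both omitted here):

```python
import math
import random

def total_gaps(matrix, order):
    """Sum over rows of (runs of membership - 1) under a column ordering."""
    total = 0
    for row in matrix:
        runs, prev = 0, 0
        for j in order:
            if row[j] and not prev:
                runs += 1
            prev = row[j]
        total += max(runs - 1, 0)
    return total

def anneal_order(matrix, iters=20000, t0=2.0, cooling=0.9995, seed=0):
    """Simulated annealing over column permutations of a 0/1 matrix."""
    rng = random.Random(seed)
    n = len(matrix[0])
    order = list(range(n))
    cost = total_gaps(matrix, order)
    temp = t0
    for _ in range(iters):
        i, j = rng.randrange(n), rng.randrange(n)
        order[i], order[j] = order[j], order[i]          # propose a column swap
        new_cost = total_gaps(matrix, order)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost                              # accept the move
        else:
            order[i], order[j] = order[j], order[i]      # reject: undo the swap
        temp *= cooling
    return order, cost
```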

  30. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Science.gov (United States)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  31. Automated 3-D Radiation Mapping

    International Nuclear Information System (INIS)

    Tarpinian, J. E.

    1991-01-01

    This work describes an automated radiation detection and imaging system which combines several state-of-the-art technologies to produce a portable but very powerful visualization tool for planning work in radiation environments. The system combines a radiation detection system, a computerized radiation imaging program, and computerized 3-D modeling to automatically locate and measure radiation fields. Measurements are automatically collected, and imaging techniques are used to produce colored 'isodose' images of the measured radiation fields. The isodose lines from the images are then superimposed over the 3-D model of the area. The final display shows the various components in a room and their associated radiation fields. The use of an automated radiation detection system increases the quality of the radiation survey measurements obtained. The additional use of a three-dimensional display allows easier visualization of the area and associated radiological conditions than two-dimensional sketches.

  32. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

    Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed free of charge under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  33. A Fully Automated Classification for Mapping the Annual Cropland Extent

    Science.gov (United States)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid region, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Space-borne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agriculture monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land for (i) time-series analysis for crop condition monitoring and (ii) investigating how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite these efforts, cropland is generally one of the classes with the poorest accuracy, which makes the maps difficult to use for agriculture. This research aims at improving cropland delineation from the local scale to the regional and global scales as well as allowing near real time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data is extracted from available baseline land cover maps. The method delivers cropland maps with a high accuracy over contrasted agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with a low uncertainty. The temporal features

  34. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
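    The abstract notes that the pipeline was automated in Python within a GIS framework. The hydrological core of most such flood-hazard mapping is a connectivity-aware "bathtub" test of the DEM against the surge level; a hedged sketch of that step only (the Baron/NOAA feed parsing, API layer, and SoVI overlay are not reproduced here, and the function below is a generic technique rather than the project's code):

```python
import numpy as np
from scipy import ndimage

def surge_inundation(dem_m, surge_level_m, sea_mask):
    """Flag cells at or below the surge level that are hydrologically
    connected to the open sea ('bathtub' model with connectivity).
    dem_m: 2-D elevation grid; sea_mask: True over open water."""
    below = dem_m <= surge_level_m
    labels, _ = ndimage.label(below)               # connected low-lying regions
    sea_ids = np.unique(labels[sea_mask & below])  # regions touching the sea
    return np.isin(labels, sea_ids[sea_ids != 0])
```

    The connectivity test is what distinguishes this from a plain elevation threshold: inland basins below the surge level but cut off from the coast are not flagged.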

  35. Fast and accurate approaches for large-scale, automated mapping of food diaries on food composition tables

    DEFF Research Database (Denmark)

    Lamarine, Marc; Hager, Jörg; Saris, Wim H M

    2018-01-01

    the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English… not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages… and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We demonstrate that both high precision and recall can be achieved. Our solutions can be used with any FCT and do not require any programming background...

  36. A multifaceted comparison of ArcGIS and MapMarker for automated geocoding

    Directory of Open Access Journals (Sweden)

    Sanjaya Kumar

    2012-11-01

    Geocoding is increasingly being used for public health surveillance and spatial epidemiology studies. Public health departments in the United States of America (USA) often use this approach to investigate disease outbreaks and clusters or assign health records to appropriate geographic units. We evaluated two commonly used geocoding software packages, ArcGIS and MapMarker, for automated geocoding of a large number of residential addresses from health administrative data in New York State, USA to better understand their features, performance and limitations. The comparison was based on three metrics of evaluation: completeness (or match rate), geocode similarity and positional accuracy. Of the 551,798 input addresses, 318,302 (57.7%) were geocoded by MapMarker and 420,813 (76.3%) by the ArcGIS composite address locator. High similarity between the geocodes assigned by the two methods was found, especially in suburban and urban areas. Among addresses with a distance of greater than 100 m between the geocodes assigned by the two packages, the point assigned by ArcGIS was closer to the associated parcel centroid (“true” location) compared with that assigned by MapMarker. In addition, the composite address locator in ArcGIS allows users to fully utilise available reference data, which consequently results in better geocoding results. However, the positional differences found were minimal, and a large majority of addresses were placed on the same locations by both geocoding packages. Using both methods and combining the results can maximise match rates and save the time needed for manual geocoding.

  37. A multifaceted comparison of ArcGIS and MapMarker for automated geocoding.

    Science.gov (United States)

    Kumar, Sanjaya; Liu, Ming; Hwang, Syni-An

    2012-11-01

    Geocoding is increasingly being used for public health surveillance and spatial epidemiology studies. Public health departments in the United States of America (USA) often use this approach to investigate disease outbreaks and clusters or assign health records to appropriate geographic units. We evaluated two commonly used geocoding software packages, ArcGIS and MapMarker, for automated geocoding of a large number of residential addresses from health administrative data in New York State, USA to better understand their features, performance and limitations. The comparison was based on three metrics of evaluation: completeness (or match rate), geocode similarity and positional accuracy. Of the 551,798 input addresses, 318,302 (57.7%) were geocoded by MapMarker and 420,813 (76.3%) by the ArcGIS composite address locator. High similarity between the geocodes assigned by the two methods was found, especially in suburban and urban areas. Among addresses with a distance of greater than 100 m between the geocodes assigned by the two packages, the point assigned by ArcGIS was closer to the associated parcel centroid ("true" location) compared with that assigned by MapMarker. In addition, the composite address locator in ArcGIS allows users to fully utilise available reference data, which consequently results in better geocoding results. However, the positional differences found were minimal, and a large majority of addresses were placed on the same locations by both geocoding packages. Using both methods and combining the results can maximise match rates and save the time needed for manual geocoding.
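    The positional-agreement metric in this study hinges on the distance between the two packages' geocodes for the same address (e.g. the 100 m screening threshold). Great-circle distance from two coordinate pairs is the usual computation; a small sketch with hypothetical example coordinates:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0                                  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Flag address pairs whose two geocodes disagree by more than 100 m
needs_review = haversine_m(42.6526, -73.7562, 42.6534, -73.7549) > 100.0
print(needs_review)
```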

  38. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    Science.gov (United States)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
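
    The pairwise tests driving steps (2) and (3) reduce to simple geometric predicates. A hedged sketch (the endpoint representation and numeric values below are illustrative; the paper's actual thresholds are the angle and distance values quoted above):

    ```python
    import numpy as np

    def can_connect(seg_a, seg_b, angle_max_deg=60.0, gap_max=0.75):
        """Toy trace-segment connection test: near-collinear and close enough.

        seg_a, seg_b: (2, 3) arrays of 3D segment endpoints; gap_max plays the
        role of the distance threshold (a multiple of mesh element size).
        """
        da, db = seg_a[1] - seg_a[0], seg_b[1] - seg_b[0]
        cosang = abs(da @ db) / (np.linalg.norm(da) * np.linalg.norm(db))
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        gap = min(np.linalg.norm(p - q) for p in seg_a for q in seg_b)
        return angle <= angle_max_deg and gap <= gap_max

    a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
    b = np.array([[1.3, 0.1, 0.0], [2.3, 0.2, 0.0]])
    print(can_connect(a, b))   # True: nearly collinear, small gap
    ```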

  19. Automated Analysis of 123I-beta-CIT SPECT Images with Statistical Probabilistic Anatomical Mapping

    International Nuclear Information System (INIS)

    Eo, Jae Seon; Lee, Hoyoung; Lee, Jae Sung; Kim, Yu Kyung; Jeon, Bumseok; Lee, Dong Soo

    2014-01-01

    Population-based statistical probabilistic anatomical maps have been used to generate probabilistic volumes of interest for analyzing perfusion and metabolic brain imaging. We investigated the feasibility of automated analysis for dopamine transporter images using this technique and evaluated striatal binding potentials in Parkinson's disease and Wilson's disease. We analyzed 2β-Carbomethoxy-3β-(4-123I-iodophenyl)tropane (123I-beta-CIT) SPECT images acquired from 26 people with Parkinson's disease (M:F=11:15, mean age=49±12 years), 9 people with Wilson's disease (M:F=6:3, mean age=26±11 years) and 17 normal controls (M:F=5:12, mean age=39±16 years). A SPECT template was created using striatal statistical probabilistic map images. All images were spatially normalized onto the template, and probability-weighted regional counts in striatal structures were estimated. The binding potential was calculated using the ratio of specific and nonspecific binding activities at equilibrium. Voxel-based comparisons between groups were also performed using statistical parametric mapping. Qualitative assessment showed that spatial normalizations of the SPECT images were successful for all images. The striatal binding potentials of participants with Parkinson's disease and Wilson's disease were significantly lower than those of normal controls. Statistical parametric mapping analysis found statistically significant differences only in striatal regions in both disease groups compared to controls. We successfully evaluated the regional 123I-beta-CIT distribution using the SPECT template and probabilistic map data automatically. This procedure allows an objective and quantitative comparison of the binding potential, which in this case showed a significantly decreased binding potential in the striata of patients with Parkinson's disease or Wilson's disease.

  20. Distributed GIS for automated natural hazard zonation mapping Internet-SMS warning towards sustainable society

    Directory of Open Access Journals (Sweden)

    Devanjan Bhattacharya

    2014-12-01

    Full Text Available Today, open systems are needed for real-time analysis and warnings on geo-hazards, and over time this can be achieved using an open-source Geographical Information System (GIS)-based platform such as GeoNode, which is being contributed to by developers around the world. Developing on an open-source platform is a vital component of better disaster information management as far as spatial data infrastructures are concerned, and this becomes especially important when huge databases, particularly of satellite images and maps of locations, are to be created and consulted regularly for city planning at different scales. There is a strong need for spatially referenced data creation, analysis, and management. Being an open-source platform, GeoNode lets this research contribute directly by facilitating the creation, sharing, and collaborative use of geospatial data. The objective is the development of an automated natural hazard zonation system with Internet-short message service (SMS) warning utilizing geomatics for sustainable societies. A concept of developing an internet-resident geospatial geohazard warning system has been put forward in this research, which can communicate alerts via SMS. There has been a need to develop an automated integrated system to categorize hazards and issue warnings that reach users directly. At present, no web-enabled warning system exists which can disseminate warnings after hazard evaluation in one go and in real time. The objective of this research work has been to formalize a notion of an integrated, independent, generalized, and automated geo-hazard warning system making use of geo-spatial data under a popular usage platform. In this paper, a model of an automated geo-spatial hazard warning system has been elaborated. The functionality is modular in architecture, having GIS-graphical user interface (GUI), input, understanding, rainfall prediction, expert, output, and warning modules. A

  1. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Full Text Available Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
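
    The peak-EVI scaling rule at the heart of the MSA can be sketched compactly; the season window and the EVI bounds below are illustrative placeholders, not the published calibration.

    ```python
    import numpy as np

    def winter_cropped_fraction(evi, season=slice(8, 20), lo=0.35, hi=0.65):
        """Toy EVI peak-scaling rule: per-pixel cropped fraction in [0, 1].

        evi: (time, rows, cols) composited EVI stack for one crop year;
        `season` indexes the winter growing season; lo/hi are assumed bounds.
        """
        peak = np.nanmax(evi[season], axis=0)           # peak-of-season EVI
        return np.clip((peak - lo) / (hi - lo), 0.0, 1.0)

    # Synthetic scene: the left half follows a winter-crop phenology.
    rng = np.random.default_rng(1)
    evi = rng.uniform(0.10, 0.20, (24, 50, 50))
    evi[8:20, :, :25] += 0.5 * np.sin(np.linspace(0, np.pi, 12))[:, None, None]
    frac = winter_cropped_fraction(evi)
    print(f"left half: {frac[:, :25].mean():.2f}, right half: {frac[:, 25:].mean():.2f}")
    ```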

  2. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

    Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences arising from the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time with the use of automated data processing techniques and statutory records. The statutory records, even though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, the integration of information from multiple sensors (raw data) with these qualitative maps and their visualization is challenging and requires the implementation of robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model, integrating the knowledge extracted from raw sensor data with the available statutory records. Statutory records were combined with the hypotheses from the sensors to form an initial estimate of what might be found underground and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation maximization (EM) algorithm), provided robust performance on various simulated as well as real sites in terms of predicting linear/non-linear segments and constructing refined 2D/3D maps.
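
    At its simplest, the segment-manhole connection step is a Bayes update that weighs a soft statutory-record prior against sensor evidence; a toy illustration with invented numbers:

    ```python
    def posterior_connected(prior: float, like_conn: float, like_not: float) -> float:
        """Bayes update for a segment-manhole connection hypothesis.

        prior: P(connected) suggested by statutory records (kept soft because
        the records are often inaccurate); like_conn / like_not: likelihood of
        the observed sensor evidence under each hypothesis.
        """
        num = like_conn * prior
        return num / (num + like_not * (1.0 - prior))

    # Records weakly suggest a pipe here; the sensors strongly support it.
    print(posterior_connected(0.4, like_conn=0.9, like_not=0.2))  # 0.75
    ```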

  3. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Science.gov (United States)

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km² with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.
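
    Model-based clustering in this sense is commonly a Gaussian mixture fitted to CPT-derived features, with the number of soil classes selected by an information criterion. A self-contained sketch on synthetic two-class data (features and values are invented, not the Belgian dataset):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Stand-in CPT features per reading: log cone resistance, log friction ratio.
    sand = rng.normal([1.2, -0.4], 0.15, (300, 2))
    clay = rng.normal([0.3, 0.6], 0.15, (200, 2))
    X = np.vstack([sand, clay])

    # Fit mixtures with 1..5 components and keep the one minimising BIC.
    models = [GaussianMixture(k, covariance_type="full", random_state=0).fit(X)
              for k in range(1, 6)]
    best = min(models, key=lambda m: m.bic(X))
    print(f"selected {best.n_components} soil classes;",
          "sizes:", np.bincount(best.predict(X)))
    ```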

  4. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Directory of Open Access Journals (Sweden)

    Bart Rogiers

    Full Text Available Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km² with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.

  5. Automated cropland mapping of continental Africa using Google Earth Engine cloud computing

    Science.gov (United States)

    Xiong, Jun N.; Thenkabail, Prasad S.; Gumma, Murali Krishna; Teluguntla, Pardhasaradhi G.; Poehnelt, Justin; Congalton, Russell G.; Yadav, Kamini; Thau, David

    2017-01-01

    The automation of agricultural mapping using satellite-derived remotely sensed data remains a challenge in Africa because of the heterogeneous and fragmented landscape, complex crop cycles, and limited access to local knowledge. Currently, consistent, continent-wide routine cropland mapping of Africa does not exist, with most studies focused either on certain portions of the continent or, at most, on one-time efforts at mapping the continent with coarse-resolution remote sensing. In this research, we addressed these limitations by applying an automated cropland mapping algorithm (ACMA) that captures extensive knowledge on the croplands of Africa available through: (a) ground-based training samples, (b) very high (sub-meter to five-meter) resolution imagery (VHRI), and (c) local knowledge captured during field visits and/or sourced from country reports and literature. The study used 16-day time-series of Moderate Resolution Imaging Spectroradiometer (MODIS) normalized difference vegetation index (NDVI) composited data at 250-m resolution for the entire African continent. Based on these data, the study first produced accurate reference cropland layers or RCLs (cropland extent/areas, irrigation versus rainfed, cropping intensities, crop dominance, and croplands versus cropland fallows) for the year 2014 that provided an overall accuracy of around 90% for crop extent in different agro-ecological zones (AEZs). The RCLs for the year 2014 (RCL2014) were then used in the development of the ACMA algorithm to create ACMA-derived cropland layers for 2014 (ACL2014). ACL2014, when compared pixel-by-pixel with the RCL2014, had an overall similarity greater than 95%. Based on the ACL2014, the African continent had 296 Mha of net cropland areas (260 Mha cultivated plus 36 Mha fallows) and 330 Mha of gross cropland areas. Of the 260 Mha of net cropland areas cultivated during 2014, 90.6% (236 Mha) was rainfed and just 9.4% (24 Mha) was irrigated. Africa has about 15% of the

  6. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Full Text Available Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  7. Automated Snow Extent Mapping Based on Orthophoto Images from Unmanned Aerial Vehicles

    Science.gov (United States)

    Niedzielski, Tomasz; Spallek, Waldemar; Witek-Kasprzak, Matylda

    2018-04-01

    The paper presents the application of the k-means clustering in the process of automated snow extent mapping using orthophoto images generated using the Structure-from-Motion (SfM) algorithm from oblique aerial photographs taken by unmanned aerial vehicle (UAV). A simple classification approach has been implemented to discriminate between snow-free and snow-covered terrain. The procedure uses the k-means clustering and classifies orthophoto images based on the three-dimensional space of red-green-blue (RGB) or near-infrared-red-green (NIRRG) or near-infrared-green-blue (NIRGB) bands. To test the method, several field experiments have been carried out, both in situations when snow cover was continuous and when it was patchy. The experiments have been conducted using three fixed-wing UAVs (swinglet CAM by senseFly, eBee by senseFly, and Birdie by FlyTech UAV) on 10/04/2015, 23/03/2016, and 16/03/2017 within three test sites in the Izerskie Mountains in southwestern Poland. The resulting snow extent maps, produced automatically using the classification method, have been validated against real snow extents delineated through a visual analysis and interpretation offered by human analysts. For the simplest classification setup, which assumes two classes in the k-means clustering, the extent of snow patches was estimated accurately, with areal underestimation of 4.6% (RGB) and overestimation of 5.5% (NIRGB). For continuous snow cover with sparse discontinuities at places where trees or bushes protruded from snow, the agreement between automatically produced snow extent maps and observations was better, i.e. 1.5% (underestimation with RGB) and 0.7-0.9% (overestimation, either with RGB or with NIRRG). Shadows on snow were found to be mainly responsible for the misclassification.
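
    The classification core, k-means on per-pixel band values followed by picking the snow cluster, fits in a few lines. In this sketch the snow cluster is taken to be the brighter centre, which is an assumption of the sketch rather than a detail from the paper; the image is synthetic.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def snow_mask_kmeans(rgb):
        """Two-class k-means on RGB pixels; the brighter cluster is called snow."""
        h, w, _ = rgb.shape
        pixels = rgb.reshape(-1, 3).astype(float)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
        snow_cluster = km.cluster_centers_.sum(axis=1).argmax()  # brightest centre
        return (km.labels_ == snow_cluster).reshape(h, w)

    # Synthetic orthophoto patch: dark ground with one bright snow patch.
    rng = np.random.default_rng(2)
    img = rng.normal(70, 15, (60, 60, 3))
    img[10:40, 10:40] = rng.normal(230, 10, (30, 30, 3))
    print(f"snow fraction: {snow_mask_kmeans(img).mean():.2f}")   # ~0.25
    ```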

  8. NEPR Benthic Habitat Map 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This benthic habitat map was created from a semi-automated habitat mapping process, using a combination of bathymetry, satellite imagery, aerial imagery and...

  9. Drawing subway maps : a survey

    NARCIS (Netherlands)

    Wolff, A.

    2007-01-01

    This paper deals with automating the drawing of subway maps. There are two features of schematic subway maps that make them different from drawings of other networks such as flow charts or organigrams. First, most schematic subway maps use not only horizontal and vertical lines, but also diagonals.

  10. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Directory of Open Access Journals (Sweden)

    Mutlu Ozdogan

    Full Text Available In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
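
    Step (ii) harvests training pixels from local-window statistics of the SWIR difference image; the sketch below is a simplified variant (window size, the k-sigma rule and the stable-pixel rule are illustrative assumptions):

    ```python
    import numpy as np

    def disturbance_training_pixels(swir_diff, win=64, k=2.0):
        """Flag candidate training pixels from a SWIR difference image.

        In each win x win window, pixels more than k standard deviations above
        the window mean are candidate disturbance (SWIR reflectance rises when
        canopy is removed); pixels well inside the spread are candidate stable
        forest. A real implementation would also apply the masks from step (i).
        """
        disturbed = np.zeros_like(swir_diff, dtype=bool)
        stable = np.zeros_like(swir_diff, dtype=bool)
        rows, cols = swir_diff.shape
        for r in range(0, rows, win):
            for c in range(0, cols, win):
                block = swir_diff[r:r + win, c:c + win]
                mu, sd = block.mean(), block.std()
                disturbed[r:r + win, c:c + win] = block > mu + k * sd
                stable[r:r + win, c:c + win] = np.abs(block - mu) < 0.5 * sd
        return disturbed, stable

    rng = np.random.default_rng(3)
    diff = rng.normal(0, 0.01, (128, 128))
    diff[40:60, 40:60] += 0.15                 # simulated harvest patch
    dist, stab = disturbance_training_pixels(diff)
    print(dist.sum(), "candidate disturbance pixels")
    ```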

  11. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.
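
    The "decision making as optimization" idea can be pictured as a composite score over candidate solutions: standardize each criterion across candidates and rank by the sum. The criteria and numbers below are invented for illustration only.

    ```python
    import numpy as np

    # Rows: candidate heavy-atom solutions; columns: scoring criteria
    # (e.g. map skew, NCS agreement, figure of merit); values are made up.
    scores = np.array([[0.31, 0.62, 0.48],
                       [0.45, 0.71, 0.55],
                       [0.28, 0.40, 0.33]])
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)  # per-criterion z-score
    total = z.sum(axis=1)                                    # composite score
    print("best candidate:", int(total.argmax()), total.round(2))
    ```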

  12. FOLD-EM: automated fold recognition in medium- and low-resolution (4-15 Å) electron density maps.

    Science.gov (United States)

    Saha, Mitul; Morais, Marc C

    2012-12-15

    Owing to the size and complexity of large multi-component biological assemblies, the most tractable approach to determining their atomic structure is often to fit high-resolution radiographic or nuclear magnetic resonance structures of isolated components into lower resolution electron density maps of the larger assembly obtained using cryo-electron microscopy (cryo-EM). This hybrid approach to structure determination requires that an atomic resolution structure of each component, or a suitable homolog, is available. If neither is available, then the amount of structural information regarding that component is limited by the resolution of the cryo-EM map. However, even if a suitable homolog cannot be identified using sequence analysis, a search for structural homologs should still be performed because structural homology often persists throughout evolution even when sequence homology is undetectable. As macromolecules can often be described as a collection of independently folded domains, one way of searching for structural homologs would be to systematically fit representative domain structures from a protein domain database into the medium/low resolution cryo-EM map and return the best fits. Taken together, the best fitting non-overlapping structures would constitute a 'mosaic' backbone model of the assembly that could aid map interpretation and illuminate biological function. Using the computational principles of the Scale-Invariant Feature Transform (SIFT), we have developed FOLD-EM, a computational tool that can identify folded macromolecular domains in medium to low resolution (4-15 Å) electron density maps and return a model of the constituent polypeptides in a fully automated fashion. As a by-product, FOLD-EM can also do flexible multi-domain fitting that may provide insight into conformational changes that occur in macromolecular assemblies.

  13. Mapping the Stacks: Sustainability and User Experience of Animated Maps in Library Discovery Interfaces

    Science.gov (United States)

    McMillin, Bill; Gibson, Sally; MacDonald, Jean

    2016-01-01

    Animated maps of the library stacks were integrated into the catalog interface at Pratt Institute and into the EBSCO Discovery Service interface at Illinois State University. The mapping feature was developed for optimal automation of the update process to enable a range of library personnel to update maps and call-number ranges. The development…

  14. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  15. Automating Mapping Production for the Enterprise: from Contract to Delivery

    Science.gov (United States)

    Uebbing, R.; Xie, C.; Beshah, B.; Welter, J.

    2012-07-01

    The ever-increasing volume and quality of geospatial data has created new challenges for mapping companies. Due to increased image resolution, fusion of different data sources and more frequent data update requirements, mapping production is forced to streamline the work flow to meet client deadlines. But the data volume alone is not the only barrier for an efficient production work flow. Processing geospatial information traditionally uses domain- and vendor-specific applications that do not interface with each other, often leading to data duplication and therefore creating sources for error. It also creates isolation between different departments within a mapping company, resulting in additional communication barriers. North West Geomatics has designed and implemented a data-centric enterprise solution for the flight acquisition and production work flow to combat the above challenges. A central data repository has been deployed at the company, containing not only geospatial data in the strictest sense, such as images, vector layers and 3D point clouds, but also other information such as product specifications, client requirements, flight acquisition data, production resource usage and much more. As there is only one instance of the database shared throughout the whole organization, it allows all employees, given they have been granted the appropriate permission, to view the current status of any project with a graphical and table-based interface through its life cycle from sales, through flight acquisition, production and product delivery. Not only can users track progress and status of various work flow steps, but the system also allows users and applications to actively schedule or start specific production steps such as data ingestion and triangulation, with many other steps (orthorectification, mosaicing, accounting, etc.) in the planning stages. While the complete system is exposed to the users through a web interface and therefore allowing outside customers to

  16. Moving from proprietary to open-source solutions for academic research in remote sensing: Example with semi-automated land cover mapping

    OpenAIRE

    Grippa, Taïs

    2017-01-01

    GRASS GIS has recently experienced significant improvements for Object-Based Image Analysis. At ULB the choice was made to combine GRASS GIS and Python in a semi-automated processing chain for land-cover mapping. The latter proved its ability to be quickly customized in order to match the requirements of different projects. In order to promote the OSGEO software, we decided to make it freely available, allowing anyone interested to review, reuse and/or enhance it for further studies.

  17. Intelligent Machine Vision for Automated Fence Intruder Detection Using Self-organizing Map

    Directory of Open Access Journals (Sweden)

    Veldin A. Talorete Jr.

    2017-03-01

    Full Text Available This paper presents an intelligent machine vision system for automated fence intruder detection. A series of still images containing fence events, captured using Internet Protocol cameras, was used as input data to the system. Two classifiers were used: the first classifies human posture and the second classifies intruder location. The system classifiers were implemented using a Self-Organizing Map after the application of several image segmentation processes. The human posture classifier is in charge of classifying the detected subject’s posture patterns from the subject’s silhouette. The intruder localization classifier, in turn, estimates the location of the intruder with respect to the fence using geometric features extracted from the images as inputs. The system is capable of activating the alarm, displaying the actual image and depicting the location of the intruder when an intruder is detected. In detecting intruder posture, the system achieved a success rate of 88%. Overall system accuracy is 83% for day-time intruder localization and 88% for night-time intruder localization.

  18. Advanced Map For Real-Time Process Control

    Science.gov (United States)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.

  19. Using knowledge rules for pharmacy mapping.

    Science.gov (United States)

    Shakib, Shaun C; Che, Chengjian; Lau, Lee Min

    2006-01-01

    The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) is used to encode and structure patient medication data for the Electronic Health Record (EHR) of the Department of Defense's (DoD's) Armed Forces Health Longitudinal Technology Application (AHLTA). HDD Subject Matter Experts (SMEs) are responsible for initial and maintenance mapping of disparate, standalone medication master files from all 100 DoD host sites worldwide to a single concept-based vocabulary, to accomplish semantic interoperability. To achieve higher levels of automation, SMEs began defining a growing set of knowledge rules. These knowledge rules were implemented in a pharmacy mapping tool, which enhanced consistency through automation and increased mapping rate by 29%.
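
    A knowledge rule in this setting is essentially a reusable normalization applied to a local medication description before dictionary lookup. A toy sketch in Python (the rules, concept name and identifier below are invented):

    ```python
    import re

    # Hypothetical knowledge rules: normalise a local drug description before
    # matching it against concept names in a central dictionary.
    RULES = [
        (r"\bHCL\b", "HYDROCHLORIDE"),
        (r"\bTAB(S)?\b", "TABLET"),
        (r"\b(\d+)\s*MG\b", r"\1 MG"),
    ]

    def normalise(desc: str) -> str:
        desc = desc.upper()
        for pattern, repl in RULES:
            desc = re.sub(pattern, repl, desc)
        return " ".join(desc.split())

    concepts = {"METFORMIN HYDROCHLORIDE 500 MG TABLET": "C0987654"}  # made-up ID
    print(concepts.get(normalise("Metformin HCl 500mg Tabs"), "manual review"))
    ```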

  20. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
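
    At its core, a node in such a Bayesian Belief Network is marginalized by enumerating parent states against a conditional probability table. A two-parent toy fragment (structure and probabilities are invented for illustration, not SME-elicited AvSP data):

    ```python
    # P(pilot over-reliance) and P(automation system failure): invented priors.
    p_complacency, p_sysfail = 0.2, 0.05
    # P(automation error | complacency, system failure): invented CPT.
    cpt = {(True, True): 0.9, (True, False): 0.3,
           (False, True): 0.4, (False, False): 0.02}

    p_error = sum(cpt[(c, f)]
                  * (p_complacency if c else 1 - p_complacency)
                  * (p_sysfail if f else 1 - p_sysfail)
                  for c in (True, False) for f in (True, False))
    print(f"P(automation error) = {p_error:.3f}")   # 0.097
    ```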

  1. Automated mapping of building facades by machine learning

    DEFF Research Database (Denmark)

    Höhle, Joachim

    2014-01-01

    Facades of buildings contain various types of objects which have to be recorded for information systems. The article describes a solution for this task focussing on automated classification by means of machine learning techniques. Stereo pairs of oblique images are used to derive 3D point clouds...

  2. Analysis of engineering drawings and raster map images

    CERN Document Server

    Henderson, Thomas C

    2013-01-01

    Presents up-to-date methods and algorithms for the automated analysis of engineering drawings and digital cartographic maps Discusses automatic engineering drawing and map analysis techniques Covers detailed accounts of the use of unsupervised segmentation algorithms to map images

  3. Necklace maps

    NARCIS (Netherlands)

    Speckmann, B.; Verbeek, K.A.B.

    2010-01-01

    Statistical data associated with geographic regions is nowadays globally available in large amounts and hence automated methods to visually display these data are in high demand. There are several well-established thematic map types for quantitative data on the ratio-scale associated with regions:

  4. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    Science.gov (United States)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; for example, diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. In order to analyze these vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for identification and separation of retinal vessel trees, i.e. structural mapping. Here we propose artery-venous classification based on structural mapping and on identification of color properties characteristic of each vessel type. The mean and standard deviation of green-channel intensity and of hue-channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using its vector of color properties, each centerline pixel is assigned to one of two clusters (artery and vein) obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is assigned a label of artery or vein. The classification results are compared with the manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in 88.28% of vessel pixels correctly classified. The automated classification results match well with the gold standard, suggesting its potential in artery-venous classification and the respective morphology analysis.
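
    Fuzzy C-means, the clustering step named above, is short enough to implement directly. The sketch below clusters invented (green, hue) feature vectors into two soft classes and hardens the memberships by argmax; all features and values are synthetic stand-ins.

    ```python
    import numpy as np

    def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
        """Minimal fuzzy C-means: returns memberships U (n, c) and centres."""
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))       # random soft start
        for _ in range(iters):
            W = U ** m                                   # fuzzified weights
            centres = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centres[None], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))
            U = inv / inv.sum(axis=1, keepdims=True)     # standard FCM update
        return U, centres

    # Synthetic colour features per centerline pixel: (green mean, hue mean).
    rng = np.random.default_rng(4)
    arteries = rng.normal([0.70, 0.10], 0.05, (100, 2))
    veins = rng.normal([0.45, 0.05], 0.05, (100, 2))
    U, _ = fuzzy_cmeans(np.vstack([arteries, veins]))
    print("cluster sizes:", np.bincount(U.argmax(axis=1)))   # ~[100 100]
    ```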

  5. Automated UAV-based mapping for airborne reconnaissance and video exploitation

    Science.gov (United States)

    Se, Stephen; Firoozfam, Pezhman; Goldstein, Norman; Wu, Linda; Dutkiewicz, Melanie; Pace, Paul; Naud, J. L. Pierre

    2009-05-01

    Airborne surveillance and reconnaissance are essential for successful military missions. Such capabilities are critical for force protection, situational awareness, mission planning, damage assessment and others. UAVs gather huge amounts of video data, but it is extremely labour-intensive for operators to analyse hours and hours of received data. At MDA, we have developed a suite of tools towards automated video exploitation including calibration, visualization, change detection and 3D reconstruction. Ongoing work aims to improve the robustness of these tools and automate the process as much as possible. Our calibration tool extracts and matches tie-points in the video frames incrementally to recover the camera calibration and poses, which are then refined by bundle adjustment. Our visualization tool stabilizes the video, expands its field-of-view and creates a geo-referenced mosaic from the video frames. It is important to identify anomalies in a scene, which may include detecting any improvised explosive devices (IEDs). However, it is tedious and difficult to compare video clips to look for differences manually. Our change detection tool allows the user to load two video clips taken from two passes at different times and flags any changes between them. 3D models are useful for situational awareness, as it is easier to understand the scene by visualizing it in 3D. Our 3D reconstruction tool creates calibrated photo-realistic 3D models from video clips taken from different viewpoints, using both semi-automated and automated approaches. The resulting 3D models also allow distance measurements and line-of-sight analysis.

  6. Automated Glacier Mapping using Object Based Image Analysis. Case Studies from Nepal, the European Alps and Norway

    Science.gov (United States)

    Vatle, S. S.

    2015-12-01

    Frequent and up-to-date glacier outlines are needed for many applications of glaciology: not only glacier area change analysis, but also masks for volume or velocity analysis, estimation of water resources, and model input data. Remote sensing offers a good option for creating glacier outlines over large areas, but manual correction is frequently necessary, especially in areas containing supraglacial debris. We show three different workflows for mapping clean ice and debris-covered ice within Object Based Image Analysis (OBIA). By working at the object level as opposed to the pixel level, OBIA facilitates using contextual, spatial and hierarchical information when assigning classes, and additionally permits the handling of multiple data sources. Our first example shows mapping debris-covered ice in the Manaslu Himalaya, Nepal. SAR coherence data are used in combination with optical and topographic data to classify debris-covered ice, obtaining an accuracy of 91%. Our second example uses a high-resolution LiDAR-derived DEM over the Hohe Tauern National Park in Austria. Breaks in surface morphology are used in creating image objects; debris-covered ice is then classified using a combination of spectral, thermal and topographic properties. Lastly, we show a completely automated workflow for mapping glacier ice in Norway. The NDSI and NIR/SWIR band ratio are used to map clean ice over the entire country, but the thresholds are calculated automatically based on a histogram of each image subset. This means that, in theory, any Landsat scene can be input and the clean ice extracted automatically. Debris-covered ice can be included semi-automatically using contextual and morphological information.
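
    The country-scale automation hinges on scene-adaptive thresholds. A common histogram-derived stand-in is Otsu's method, used in this sketch (band values are synthetic, and the choice of Otsu is this sketch's assumption rather than the authors' exact rule):

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def clean_ice_mask(green, swir, nir):
        """Threshold NDSI and the NIR/SWIR ratio using each scene's histogram."""
        ndsi = (green - swir) / (green + swir + 1e-6)
        ratio = nir / (swir + 1e-6)
        return (ndsi > threshold_otsu(ndsi)) & (ratio > threshold_otsu(ratio))

    rng = np.random.default_rng(5)
    shape = (100, 100)
    green = rng.uniform(0.05, 0.15, shape)
    swir = rng.uniform(0.04, 0.12, shape)
    nir = rng.uniform(0.10, 0.30, shape)
    green[:40], swir[:40], nir[:40] = 0.6, 0.05, 0.5   # bright ice, top rows
    print(f"ice fraction: {clean_ice_mask(green, swir, nir).mean():.2f}")  # ~0.40
    ```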

  7. Automated side-chain model building and sequence assignment by template matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2002-01-01

    A method for automated macromolecular side-chain model building and for aligning the sequence to the map is described. An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.
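
    The sequence-alignment step can be pictured as sliding the per-position probability matrix along the protein sequence and scoring each offset by summed log probabilities. A synthetic sketch (the probabilities are random stand-ins for the density-template scores):

    ```python
    import numpy as np

    AA = "ACDEFGHIKLMNPQRSTVWY"

    def alignment_scores(P, sequence):
        """Log-likelihood of each alignment offset of matrix P in sequence.

        P: (n_positions, 20) estimated probability that each built residue is
        each amino-acid type; rows correspond to a contiguous segment.
        """
        idx = np.array([AA.index(a) for a in sequence])
        n = len(P)
        return np.array([np.log(P[np.arange(n), idx[o:o + n]] + 1e-9).sum()
                         for o in range(len(idx) - n + 1)])

    rng = np.random.default_rng(6)
    seq = "".join(rng.choice(list(AA), 60))
    true_offset = 23
    P = np.full((8, 20), 0.01)
    for i, a in enumerate(seq[true_offset:true_offset + 8]):
        P[i, AA.index(a)] = 0.81            # peaked at the true residue type
    P /= P.sum(axis=1, keepdims=True)
    print("recovered offset:", int(alignment_scores(P, seq).argmax()))  # 23
    ```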

  8. A Semantically Automated Protocol Adapter for Mapping SOAP Web Services to RESTful HTTP Format to Enable the Web Infrastructure, Enhance Web Service Interoperability and Ease Web Service Migration

    Directory of Open Access Journals (Sweden)

    Frank Doheny

    2012-04-01

    Full Text Available Semantic Web Services (SWS) are Web Service (WS) descriptions augmented with semantic information. SWS enable intelligent reasoning and automation in areas such as service discovery, composition, mediation, ranking and invocation. This paper applies SWS to a previous protocol adapter which, operating within clearly defined constraints, maps SOAP Web Services to RESTful HTTP format. However, in the previous adapter, the configuration element is manual and the latency implications are locally based. This paper applies SWS technologies to automate the configuration element and the latency tests are conducted in a more realistic Internet based setting.

  9. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  10. High Throughput Petrochronology and Sedimentary Provenance Analysis by Automated Phase Mapping and LAICPMS

    Science.gov (United States)

    Vermeesch, Pieter; Rittner, Martin; Petrou, Ethan; Omma, Jenny; Mattinson, Chris; Garzanti, Eduardo

    2017-11-01

    The first step in most geochronological studies is to extract datable minerals from the host rock, which is time consuming, removes textural context, and increases the chance for sample cross contamination. We here present a new method to rapidly perform in situ analyses by coupling a fast scanning electron microscope (SEM) with Energy Dispersive X-ray Spectrometer (EDS) to a Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LAICPMS) instrument. Given a polished hand specimen, a petrographic thin section, or a grain mount, Automated Phase Mapping (APM) by SEM/EDS produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic analysis. We apply the APM + LAICPMS method to three igneous, metamorphic, and sedimentary case studies. In the first case study, a polished slab of granite from Guernsey was scanned for zircon, producing a 609 ± 8 Ma weighted mean age. The second case study investigates a paragneiss from an ultra-high-pressure terrane in the north Qaidam terrane (Qinghai, China). One hundred seven small (25 µm) metamorphic zircons were analyzed by LAICPMS to confirm a 419 ± 4 Ma age of peak metamorphism. The third and final case study uses APM + LAICPMS to generate a large provenance data set and trace the provenance of 25 modern sediments from Angola, documenting longshore drift of Orange River sediments over a distance of 1,500 km. These examples demonstrate that APM + LAICPMS is an efficient and cost effective way to improve the quantity and quality of geochronological data.
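
    The hand-off from phase map to laser is coordinate extraction: label connected regions of the target mineral and pass their centroids to the ablation system. A sketch with a synthetic phase map (class codes and grain layout are invented):

    ```python
    import numpy as np
    from scipy import ndimage

    # Hypothetical SEM/EDS phase map: 0 = matrix, 1 = quartz, 2 = zircon.
    rng = np.random.default_rng(7)
    phase = np.zeros((200, 200), dtype=int)
    for r, c in rng.integers(10, 190, (15, 2)):    # scatter 5x5-pixel grains
        phase[r - 2:r + 3, c - 2:c + 3] = 2

    grains, n = ndimage.label(phase == 2)          # connected zircon regions
    coords = ndimage.center_of_mass(phase == 2, grains, range(1, n + 1))
    print(f"{n} zircon targets; first (row, col): "
          f"({coords[0][0]:.1f}, {coords[0][1]:.1f})")
    ```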

  11. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
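
    Deconvolution here means undoing a known stutter pattern: if each true allele contributes decaying bands at shorter repeat lengths, the observed profile is a linear transform of the true allele vector, and nonnegative least squares inverts it. A toy sketch with an invented kernel:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Illustrative stutter kernel: a true allele at repeat length k also yields
    # bands at k-1 and k-2 with decaying intensity (weights are made up).
    kernel = {0: 1.00, -1: 0.30, -2: 0.10}

    def stutter_matrix(n_bins):
        A = np.zeros((n_bins, n_bins))
        for k in range(n_bins):                 # column k: true allele at bin k
            for off, w in kernel.items():
                if 0 <= k + off < n_bins:
                    A[k + off, k] = w
        return A

    true = np.zeros(12); true[[6, 8]] = 1.0     # two closely spaced alleles
    A = stutter_matrix(12)
    observed = A @ true                         # overlapping stutter pattern
    recovered, _ = nnls(A, observed)            # mathematically remove stutter
    print("called allele bins:", np.flatnonzero(recovered > 0.5))   # [6 8]
    ```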

  12. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33 … of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping…

  13. Towards automated electron holographic tomography for 3D mapping of electrostatic potentials

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, Daniel, E-mail: Daniel.Wolf@Triebenberg.de [Triebenberg Laboratory, Institute of Structure Physics, Technische Universitaet Dresden, 01062 Dresden (Germany); Lubk, Axel; Lichte, Hannes [Triebenberg Laboratory, Institute of Structure Physics, Technische Universitaet Dresden, 01062 Dresden (Germany); Friedrich, Heiner [Inorganic Chemistry and Catalysis, Debye Institute for Nanomaterials Science, Utrecht University, Sorbonnelaan 16, 3584 CA, Utrecht (Netherlands)

    2010-04-15

    Electron-holographic tomography (EHT), that is, the combination of off-axis electron holography with electron tomography, was successfully applied for the quantitative 3D mapping of electrostatic potentials at the nanoscale. Here we present the first software package (THOMAS) for semi-automated acquisition of holographic tilt series, a prerequisite for efficient data collection. Using THOMAS, the acquisition time for a holographic tilt series, consisting of object and reference holograms, is reduced by a factor of five on average, compared to the previous, completely manual approaches. Moreover, the existing software packages for retrieving amplitude and phase information from electron holograms have been extended, now including a one-step procedure for holographic tilt series reconstruction. Furthermore, a modified SIRT algorithm (WSIRT) was implemented for the quantitative 3D reconstruction of the electrostatic potential from the aligned phase tilt series. Finally, the application of EHT to a polystyrene latex sphere test specimen and a pn-doped Ge 'needle'-shaped specimen is presented, illustrating the quantitative character of EHT. For both specimens the mean inner potential (MIP) values were accurately determined from the reconstructed 3D potential. For the Ge specimen, the 'built-in' voltage of 0.5 V across the pn junction was additionally obtained.

  14. Automated Plantation Mapping in Indonesia Using Remote Sensing Data

    Science.gov (United States)

    Karpatne, A.; Jia, X.; Khandelwal, A.; Kumar, V.

    2017-12-01

    Plantation mapping is critical for understanding and addressing deforestation, a key driver of climate change and ecosystem degradation. Unfortunately, most plantation maps are limited to small areas for specific years because they rely on visual inspection of imagery. In this work, we propose a data-driven approach which automatically generates yearly plantation maps for large regions using MODIS multi-spectral data. While traditional machine learning algorithms face manifold challenges in this task, e.g. imperfect training labels, spatio-temporal data heterogeneity, noisy and high-dimensional data, and lack of evaluation data, we introduce a novel deep learning-based framework that combines existing imperfect plantation products as training labels and models the spatio-temporal relationships of land covers. We also explore post-processing steps based on a Hidden Markov Model that further improve the detection accuracy. We then conduct an extensive evaluation of the generated plantation maps. Specifically, by randomly sampling and comparing with high-resolution Digital Globe imagery, we demonstrate that the generated plantation maps achieve both high precision and high recall. When compared with existing plantation mapping products, our detection can avoid both false positives and false negatives. Finally, we utilize the generated plantation maps in analyzing the relationship between forest fires and growth of plantations, which assists in better understanding the cause of deforestation in Indonesia.
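
    The HMM post-processing can be sketched as Viterbi smoothing of a pixel's per-year class probabilities under a "sticky" transition matrix; all probabilities below are invented for illustration.

    ```python
    import numpy as np

    def viterbi(obs_probs, trans, init):
        """Most probable state sequence; obs_probs: (T, S), trans: (S, S)."""
        T, S = obs_probs.shape
        logd = np.log(init) + np.log(obs_probs[0])
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            cand = logd[:, None] + np.log(trans)    # (from_state, to_state)
            back[t] = cand.argmax(axis=0)
            logd = cand.max(axis=0) + np.log(obs_probs[t])
        path = [int(logd.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    # States: 0 = forest, 1 = plantation; transitions discourage flip-flopping.
    trans = np.array([[0.95, 0.05],
                      [0.02, 0.98]])
    init = np.array([0.8, 0.2])
    # Noisy yearly classifier scores for one pixel (the fourth year dips).
    obs = np.array([[0.9, 0.1], [0.8, 0.2], [0.3, 0.7],
                    [0.6, 0.4], [0.2, 0.8], [0.1, 0.9]])
    print(viterbi(obs, trans, init))   # [0, 0, 1, 1, 1, 1]
    ```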

  15. Developing a semi/automated protocol to post-process large volume, High-resolution airborne thermal infrared (TIR) imagery for urban waste heat mapping

    Science.gov (United States)

    Rahman, Mir Mustafizur

    In collaboration with The City of Calgary 2011 Sustainability Direction and as part of the HEAT (Heat Energy Assessment Technologies) project, the focus of this research is to develop a semi/automated 'protocol' to post-process large volumes of high-resolution (H-res) airborne thermal infrared (TIR) imagery to enable accurate urban waste heat mapping. HEAT is a free GeoWeb service, designed to help Calgary residents improve their home energy efficiency by visualizing the amount and location of waste heat leaving their homes and communities, as easily as clicking on their house in Google Maps. HEAT metrics are derived from 43 flight lines of TABI-1800 (Thermal Airborne Broadband Imager) data acquired on May 13–14, 2012 at night (11:00 pm–5:00 am) over The City of Calgary, Alberta (~825 km²) at a 50 cm spatial resolution and 0.05°C thermal resolution. At present, the only way to generate a large area, high-spatial resolution TIR scene is to acquire separate airborne flight lines and mosaic them together. However, the ambient sensed temperature within, and between, flight lines naturally changes during acquisition (due to varying atmospheric and local micro-climate conditions), resulting in mosaicked images with different temperatures for the same scene components (e.g. roads, buildings), and in mosaic join-lines that arbitrarily bisect many thousands of homes. In combination these effects result in reduced utility and classification accuracy, including poorly defined HEAT metrics, inaccurate hotspot detection and raw imagery that is difficult to interpret. In an effort to minimize these effects, three new semi/automated post-processing algorithms (the protocol) are described, which are then used to generate a 43 flight line mosaic of TABI-1800 data from which accurate Calgary waste heat maps and HEAT metrics can be generated. These algorithms (presented as four peer-reviewed papers) are: (a) Thermal Urban Road Normalization (TURN), used to mitigate the microclimatic

  16. Towards Automation in Landcover Mapping from LiDAR Data in Alpine Environment

    Science.gov (United States)

    Dorninger, Peter; Briese, Christian; Nothegger, Clemens; Klauser, Armin

    2010-05-01

    Digital terrain models derived from airborne LiDAR (often referred to as airborne laser scanning) are commonly used for various applications in geomorphology. The ongoing development in sensor technology makes flight campaigns with some 10 points per square meter economically feasible for large areas. Simultaneously, the achievable accuracy of the originally acquired points as well as that of the derived products increases due to improved measurement techniques. Additionally, full-waveform (FWF) laser scanning systems record the time-dependent strength of the backscattered signal. This allows for the determination of numerous points (i.e. echoes) for one emitted laser beam hitting multiple targets within its footprint. Practically, about five echoes may be determined from the digitized waveform. Furthermore, additional attributes can be determined for each echo. These are, for example, a reflectivity measure (amplitude), the widening of the echo (echo width), or the sequence of the echoes of a single shot. By considering the polar measurement range and atmospheric conditions, a physical calibration of such measurements is possible. The application of FWF information to increase the accuracy and the reliability of digital terrain models, especially in areas with dense vegetation, was shown by Doneus & Briese (2006). However, these additional attributes are rarely used for object or landcover classification. This is still the domain of automated image interpretation (e.g. Zebedin et al., 2006). Nevertheless, image interpretation has well-known deficiencies in areas with vegetation or where shadows occur. Therefore, we tested a hybrid approach which uses conventional first echo / last echo (FE/LE) airborne laser scanning data and an RGB orthophoto. The test site is located in an alpine area in Tyrol, Austria. For the classification, topographic models, a slope map, a local roughness measure and a penetration ratio were determined from the
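
    The classification inputs named above (slope, local roughness, penetration ratio) are easy to derive once first- and last-echo returns are gridded. A minimal sketch under assumed inputs; the authors' exact feature definitions may differ:

```python
import numpy as np
from scipy.ndimage import generic_filter

def landcover_features(first_echo, last_echo, n_penetrating, n_pulses):
    """Per-cell features from gridded first/last-echo (FE/LE) LiDAR data.

    first_echo, last_echo : gridded return elevations (m)
    n_penetrating         : pulses per cell whose last echo lies below
                            the first echo (i.e. penetrated the canopy)
    n_pulses              : total pulses per cell
    """
    canopy_height = first_echo - last_echo                  # vegetation proxy
    penetration = n_penetrating / np.maximum(n_pulses, 1)   # penetration ratio
    roughness = generic_filter(last_echo, np.std, size=3)   # local roughness
    gy, gx = np.gradient(last_echo)                         # cell size 1 m assumed
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))         # slope map (deg)
    return canopy_height, penetration, roughness, slope
```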

  17. Automated first-principles mapping for phase-change materials.

    Science.gov (United States)

    Esser, Marc; Maintz, Stefan; Dronskowski, Richard

    2017-04-05

    Plotting materials on bi-coordinate maps according to physically meaningful descriptors has a successful tradition in computational solid-state science spanning more than four decades. Equipped with new ab initio techniques introduced in this work, we generate an improved version of the treasure map for phase-change materials (PCMs) as introduced previously by Lencer et al. which, unlike its predecessor, charts all industrially used PCMs correctly. Furthermore, we suggest seven new PCM candidates, namely SiSb4Te7, Si2Sb2Te5, SiAs2Te4, PbAs2Te4, SiSb2Te4, Sn2As2Te5, and PbAs4Te7, to be used as synthetic targets. To realize the aforementioned maps based on orbital mixing (or "hybridization") and ionicity coordinates, structural information was first included into an ab initio numerical descriptor for sp3 orbital mixing and then generalized beyond high-symmetry structures. In addition, a simple, yet powerful quantum-mechanical ionization measure also including structural information was introduced. Taken together, these tools allow for (automatically) generating materials maps solely relying on first-principles calculations. © 2017 Wiley Periodicals, Inc.

  18. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against the ICP, considered the gold standard for automated rigid registration. Furthermore ... to point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.

  19. Northeast Puerto Rico and Culebra Island - Benthic Habitat Map 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This benthic habitat map was created from a semi-automated habitat mapping process, using a combination of bathymetry, satellite imagery, aerial imagery and...

  20. AUTOMATED MAPPING OF PERMANENT PRESERVATION AREAS ON HILLTOPS

    Directory of Open Access Journals (Sweden)

    Guilherme de Castro Oliveira

    2016-03-01

    Full Text Available Permanent Preservation Areas (PPAs) on hilltops are among the many areas protected by the New Forest Code in Brazil. Mapping them involves difficult interpretation and application of the Law, as well as the complex task of translating it into map algebra. This paper aims to present, in detail, a methodological model for the delimitation of PPAs on hilltops, according to the new Brazilian Forest Code (NFC, Law 12,651/2012). The model was developed in Model Builder for ArcGIS 10.2, and is able to map the PPAs on any digital terrain model. However, field validations are required to verify its efficiency. There is a need for legal standardization of criteria that may otherwise cause subjectivity in the delimitation. The organization of these data on a large scale is very important, for example, for the Rural Environmental Registry, which provides georeferencing of all rural properties and their protected areas in Brazil.

  1. MapFactory - Towards a mapping design pattern for big geospatial data

    Science.gov (United States)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
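
    Since the paper frames MapFactory as a software design pattern, the gist is a factory that consumes a design specification and returns the matching map product. A toy Python rendering of that idea follows; the class and field names are invented for illustration, and the real specification is drawn from ISO 19115-1 metadata elements rather than this simplified record:

```python
from dataclasses import dataclass

@dataclass
class MapSpec:
    """Toy stand-in for an ISO 19115-1-derived design specification."""
    theme: str          # e.g. "choropleth", "heatmap"
    title: str
    data_source: str

class ChoroplethMap:
    def __init__(self, spec):
        self.spec = spec
    def render(self):
        return f"choropleth '{self.spec.title}' from {self.spec.data_source}"

class HeatMap:
    def __init__(self, spec):
        self.spec = spec
    def render(self):
        return f"heatmap '{self.spec.title}' from {self.spec.data_source}"

class MapFactory:
    """Create the right map product for a given design specification."""
    _products = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

    @classmethod
    def create(cls, spec):
        try:
            return cls._products[spec.theme](spec)
        except KeyError:
            raise ValueError(f"no map product registered for theme {spec.theme!r}")

m = MapFactory.create(MapSpec("heatmap", "Taxi pickups", "nyc_taxi.parquet"))
print(m.render())
```

    New map products can then be registered without touching the calling code, which is the reuse and communication benefit the abstract attributes to the pattern.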

  2. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography has been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genome-scale structure determinations.

  3. Seasonal cultivated and fallow cropland mapping using MODIS-based automated cropland classification algorithm

    Science.gov (United States)

    Wu, Zhuoting; Thenkabail, Prasad S.; Mueller, Rick; Zakzeski, Audra; Melton, Forrest; Johnson, Lee; Rosevelt, Carolyn; Dwyer, John; Jones, Jeanine; Verdin, James P.

    2014-01-01

    Increasing drought occurrences and growing populations demand accurate, routine, and consistent cultivated and fallow cropland products to enable water and food security analysis. The overarching goal of this research was to develop and test an automated cropland classification algorithm (ACCA) that provides accurate, consistent, and repeatable information on seasonal cultivated as well as seasonal fallow cropland extents and areas based on Moderate Resolution Imaging Spectroradiometer (MODIS) remote sensing data. The seasonal ACCA development process involves writing a series of iterative decision tree codes to separate cultivated and fallow croplands from noncroplands, aiming to accurately mirror reliable reference data sources. A pixel-by-pixel accuracy assessment when compared with the U.S. Department of Agriculture (USDA) cropland data showed, on average, a producer's accuracy of 93% and a user's accuracy of 85% across all months. Further, ACCA-derived cropland maps agreed well with the USDA Farm Service Agency crop acreage-reported data for both cultivated and fallow croplands with R-square values over 0.7 and field surveys with an accuracy of ≥95% for cultivated croplands and ≥76% for fallow croplands. Our results demonstrated the ability of ACCA to generate cropland products, such as cultivated and fallow cropland extents and areas, accurately, automatically, and repeatedly throughout the growing season.
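
    The "series of iterative decision tree codes" amounts to rule chains over each pixel's seasonal time series. A toy sketch of such rules on an NDVI series follows; the thresholds are illustrative assumptions, not the published ACCA rules:

```python
import numpy as np

def classify_pixel(ndvi_series, ndvi_crop=0.5, ndvi_bare=0.25):
    """Toy ACCA-style decision rules for one pixel's seasonal NDVI series.

    The thresholds are illustrative assumptions, not the published ACCA
    rules; the real algorithm chains many such decision trees and uses
    multiple bands and reference data.
    """
    peak, low = np.max(ndvi_series), np.min(ndvi_series)
    if peak < ndvi_bare:
        return "noncropland"            # never greens up
    if peak >= ndvi_crop and (peak - low) > 0.2:
        return "cultivated cropland"    # strong green-up/senescence cycle
    if low < ndvi_bare:
        return "fallow cropland"        # field-like but stays near bare soil
    return "noncropland"

print(classify_pixel([0.2, 0.4, 0.7, 0.6, 0.3]))    # cultivated cropland
print(classify_pixel([0.15, 0.2, 0.3, 0.25, 0.2]))  # fallow cropland
```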

  5. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  6. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    Science.gov (United States)

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionary robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific evolutionary robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.
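
    The closing loop of the pipeline, characterize a few variants, fit an expression-flux model, then predict optimal expression levels, can be illustrated with a toy kinetic model. Everything below (the saturating two-enzyme flux model, the data, the fixed expression budget) is an assumption for illustration, not the Pathway Map Calculator's actual model:

```python
import numpy as np
from scipy.optimize import curve_fit

def flux_model(E, vmax, k1, k2):
    """Toy two-enzyme expression-flux model (illustrative assumption)."""
    e1, e2 = E
    return vmax * (e1 / (k1 + e1)) * (e2 / (k2 + e2))

# "Characterized variants": expression levels (a.u.) and measured fluxes.
e1 = np.array([0.1, 0.5, 1.0, 2.0, 0.5, 2.0])
e2 = np.array([0.1, 0.5, 1.0, 0.5, 2.0, 2.0])
measured = np.array([0.05, 0.45, 0.9, 0.8, 0.95, 1.45])

params, _ = curve_fit(flux_model, (e1, e2), measured, p0=[1.0, 1.0, 1.0])

# Predict the optimal expression split under a fixed total budget,
# mimicking the "predict optimal enzyme expression levels" step.
budget = 3.0
grid = np.linspace(0.05, budget - 0.05, 200)
best = max(grid, key=lambda x: flux_model((x, budget - x), *params))
print(f"fitted params={np.round(params, 2)}, best e1={best:.2f}, e2={budget - best:.2f}")
```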

  7. Registering pointclouds of polyhedral buildings to 2D maps

    NARCIS (Netherlands)

    Remondino, F.; El-Hakim, S.; Gonzo, L.; Khoshelham, K.; Gorte, B.G.

    2009-01-01

    This paper presents a method for automated pointcloud-to-map registration using a plane matching technique. The registration is based on estimating a transformation between a set of planes inferred from the map and their corresponding planes extracted from the pointcloud. A plane matching algorithm
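
    Although the record is truncated, the core computation, estimating a rigid transformation from matched plane pairs, has a standard closed form: rotation from corresponding unit normals (Kabsch/SVD) and translation from the plane offsets. The sketch below shows that generic construction, not necessarily the paper's exact formulation:

```python
import numpy as np

def transform_from_planes(map_planes, cloud_planes):
    """Estimate R, t mapping pointcloud planes onto 2D-map planes.

    Each plane is (n, d) with unit normal n and offset d such that
    n.x = d for points x on the plane. Needs >= 3 non-parallel pairs.
    """
    N_map = np.array([n for n, _ in map_planes])
    N_cld = np.array([n for n, _ in cloud_planes])
    # Rotation aligning cloud normals to map normals (Kabsch/SVD).
    U, _, Vt = np.linalg.svd(N_map.T @ N_cld)
    D = np.diag([1, 1, np.sign(np.linalg.det(U @ Vt))])  # avoid reflection
    R = U @ D @ Vt
    # Translation: for each pair, n_map . t = d_map - d_cloud.
    d_map = np.array([d for _, d in map_planes])
    d_cld = np.array([d for _, d in cloud_planes])
    t, *_ = np.linalg.lstsq(N_map, d_map - d_cld, rcond=None)
    return R, t

# Demo: three orthogonal walls offset by t = (1, 2, 3), no rotation.
planes_map = [(np.eye(3)[i], float([1, 2, 3][i])) for i in range(3)]
planes_cld = [(np.eye(3)[i], 0.0) for i in range(3)]
R, t = transform_from_planes(planes_map, planes_cld)
print(np.round(R, 3), np.round(t, 3))  # identity rotation, t = [1 2 3]
```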

  8. Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time

    Science.gov (United States)

    Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.

    2017-12-01

    Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-the-art data processing algorithms cannot meet the automation and quality standards of a near-real-time (NRT) system, quality-controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at the global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDFs) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual-polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of 1) self-optimized multi-threshold classification, 2) over-detection removal using land-cover information and change detection, 3) under-detection compensation, and 4) machine-learning based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the results from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (see Figure 1). Specifically, the over- and under-detection that is typically noted in automated methods is reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-the-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, Sentinel, etc. RAPID data can support many applications such as rapid assessment of damage
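
    Stage 1, thresholding water from SAR backscatter, is commonly bootstrapped with an automatic histogram threshold such as Otsu's method; RAPID's self-optimized multi-thresholding is a more elaborate relative of this idea. A minimal stand-in on synthetic backscatter:

```python
import numpy as np
from skimage.filters import threshold_otsu

# Synthetic sigma0 backscatter (dB): open water forms a dark, low-dB
# mode, land a brighter mode; real scenes overlap far more than this toy.
rng = np.random.default_rng(42)
water = rng.normal(-20.0, 1.5, 20_000)   # smooth water: low backscatter
land = rng.normal(-8.0, 2.5, 80_000)     # vegetation/urban: higher
sigma0 = np.concatenate([water, land])

t = threshold_otsu(sigma0)               # automatic histogram threshold
flood_mask = sigma0 < t                  # pixels darker than t -> water
print(f"threshold={t:.1f} dB, water fraction={flood_mask.mean():.2%}")
```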

  9. Automated voxel-based analysis of brain perfusion SPECT for vasospasm after subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Iwabuchi, S.; Yokouchi, T.; Hayashi, M.; Kimura, H.; Tomiyama, A.; Hirata, Y.; Saito, N.; Harashina, J.; Nakayama, H.; Sato, K.; Aoki, K.; Samejima, H.; Ueda, M.; Terada, H.; Hamazaki, K.

    2008-01-01

    We evaluated regional cerebral blood flow (rCBF) during vasospasm after subarachnoid haemorrhage (SAH) using automated voxel-based analysis of brain perfusion single-photon emission computed tomography (SPECT). Brain perfusion SPECT was performed 7 to 10 days after the onset of SAH. Automated voxel-based analysis of SPECT used a Z-score map that was calculated by comparing the patient's data with a control database. In cases where computed tomography (CT) scans detected an ischemic region due to vasospasm, automated voxel-based analysis of brain perfusion SPECT revealed dramatically reduced rCBF (Z-score ≤ -4). No patients with mildly or moderately diminished rCBF (Z-score > -3) progressed to cerebral infarction. Some patients with a Z-score < -4 did not progress to cerebral infarction after active treatment with angioplasty. Three-dimensional images provided detailed anatomical information and helped us to distinguish surgical sequelae from vasospasm. In conclusion, automated voxel-based analysis of brain perfusion SPECT using a Z-score map is helpful in evaluating decreased rCBF due to vasospasm. (author)
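
    The Z-score map underlying this analysis is a voxelwise standardization of the patient scan against a database of normal scans. A minimal sketch with synthetic volumes; a real pipeline would first spatially normalize every scan to a common template:

```python
import numpy as np

def z_score_map(patient, controls):
    """Voxelwise Z-score of a patient scan against a control database.

    patient  : 3D perfusion volume, already registered to the template
    controls : 4D stack (n_subjects, z, y, x) of normal scans
    """
    mu = controls.mean(axis=0)
    sd = controls.std(axis=0, ddof=1)
    return (patient - mu) / np.where(sd > 0, sd, np.inf)

rng = np.random.default_rng(1)
controls = rng.normal(50, 5, (20, 8, 64, 64))   # normal rCBF database
patient = rng.normal(50, 5, (8, 64, 64))
patient[2, 20:30, 20:30] -= 25                  # simulated vasospasm deficit
z = z_score_map(patient, controls)
print("voxels with Z <= -4:", int((z <= -4).sum()))
```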

  10. Nano-scale orientation mapping of graphite in cast irons

    International Nuclear Information System (INIS)

    Theuwissen, Koenraad; Lacaze, Jacques; Véron, Muriel; Laffont, Lydia

    2014-01-01

    A diametrical section of a graphite spheroid from a ductile iron sample was prepared using the focused ion beam lift-out technique. Characterization of this section was carried out through automated crystal orientation mapping in a transmission electron microscope. This new technique automatically collects electron diffraction patterns and matches them with precalculated templates. The results of this investigation are crystal orientation and phase maps of the specimen, which shed new light on the understanding of the growth mechanisms of this peculiar graphite morphology. This article shows that mapping the orientation of carbon-based materials such as graphite, which is difficult to achieve with conventional techniques, can be performed automatically and at high spatial resolution using automated crystal orientation mapping in a transmission electron microscope. - Highlights: • ACOM/TEM can be used to study the crystal orientation of carbon-based materials. • A spheroid is formed by conical sectors radiating from a central nucleus. • Misorientations exist within the conical sectors, defining various orientation domains

  11. Current trends in geomorphological mapping

    Science.gov (United States)

    Seijmonsbergen, A. C.

    2012-04-01

    Geomorphological mapping is a field currently in motion, driven by technological advances and the availability of new high-resolution data. As a consequence, classic (paper) geomorphological maps, which were the standard for more than 50 years, are rapidly being replaced by digital geomorphological information layers. This is witnessed by the following developments: 1. the conversion of classic paper maps into digital information layers, mainly performed in a digital mapping environment such as a Geographical Information System; 2. updating the location precision and the content of the converted maps, by adding more geomorphological details taken from high-resolution elevation data and/or high-resolution image data; 3. (semi-)automated extraction and classification of geomorphological features from digital elevation models, broadly separated into unsupervised and supervised classification techniques; and 4. new digital visualization/cartographic techniques and reading interfaces. New digital geomorphological information layers can be based on manual digitization of polygons using DEMs and/or aerial photographs, or prepared through (semi-)automated extraction and delineation of geomorphological features. DEMs are often used as a basis to derive Land Surface Parameter information which is used as input for (un)supervised classification techniques. Especially when using high-resolution data, object-based classification is used as an alternative to traditional pixel-based classifications, to cluster grid cells into homogeneous objects, which can be classified as geomorphological features. Classic map content can also be used as training material for the supervised classification of geomorphological features. In the classification process, rule-based protocols, including expert-knowledge input, are used to map specific geomorphological features or entire landscapes. Current (semi-)automated classification techniques are increasingly able to extract morphometric, hydrological

  12. Contour Detection for UAV-Based Cadastral Mapping

    Directory of Open Access Journals (Sweden)

    Sophie Crommelinck

    2017-02-01

    Full Text Available Unmanned aerial vehicles (UAVs) provide a flexible and low-cost solution for the acquisition of high-resolution data. The potential of high-resolution UAV imagery to create and update cadastral maps is being increasingly investigated. Existing procedures generally involve substantial fieldwork and many manual processes. Arguably, multiple parts of UAV-based cadastral mapping workflows could be automated. Specifically, as many cadastral boundaries coincide with visible boundaries, they could be extracted automatically using image analysis methods. This study investigates the transferability of gPb contour detection, a state-of-the-art computer vision method, to remotely sensed UAV images and UAV-based cadastral mapping. Results show that the approach is transferable to UAV data and automated cadastral mapping: object contours are comprehensively detected at completeness and correctness rates of up to 80%. The detection quality is optimal when the entire scene is covered with one orthoimage, due to the global optimization of gPb contour detection. However, a balance between high completeness and correctness is hard to achieve, so a combination with area-based segmentation and further object knowledge is proposed. The localization quality exhibits the usual dependency on ground resolution. The approach has the potential to accelerate the process of general boundary delineation during the creation and updating of cadastral maps.

  13. Advanced paratransit system : an application of digital map, automated vehicle scheduling and vehicle location systems

    Science.gov (United States)

    1997-05-01

    This report documents and evaluates an advanced Paratransit system demonstration project. The Santa Clara Valley Transportation Agency (SCVTA), via OUTREACH, implemented such a system, comprised of an automated trip scheduling system (ATSS) and autom...

  14. Automated Processing of 2-D Gel Electrophoretograms of Genomic DNA for Hunting Pathogenic DNA Molecular Changes.

    Science.gov (United States)

    Takahashi; Nakazawa; Watanabe; Konagaya

    1999-01-01

    We have developed automated processing algorithms for 2-dimensional (2-D) electrophoretograms of genomic DNA based on the RLGS (Restriction Landmark Genomic Scanning) method, which scans restriction enzyme recognition sites as landmarks and maps them onto a 2-D electrophoresis gel. Our processing algorithms enable automated spot recognition from RLGS electrophoretograms and automated comparison of large numbers of such images. In the final stage of the automated processing, a master spot pattern, on which all the spots in the RLGS images are mapped at once, can be obtained. Spot pattern variations which seem to be specific to pathogenic DNA molecular changes can be easily detected by simply looking over the master spot pattern. When we applied our algorithms to the analysis of 33 RLGS images derived from human colon tissues, we successfully detected several colon tumor specific spot pattern changes.

  15. Two techniques for mapping and area estimation of small grains in California using Landsat digital data

    Science.gov (United States)

    Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.

    1984-01-01

    Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures, and for offering automated alternatives to traditional inventory techniques, including those which currently employ Landsat imagery.
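
    Band Ratio Thresholding reduces to computing a two-band ratio per pixel and keeping pixels whose ratio falls in a crop-like window. A toy sketch follows; the band choice and thresholds are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def band_ratio_threshold(nir, red, t_low=1.8, t_high=4.0):
    """Toy Band Ratio Thresholding for small-grain mapping.

    nir, red       : Landsat band arrays (e.g. near-infrared and red)
    t_low / t_high : ratio window taken to indicate green crop canopies;
                     these thresholds are illustrative assumptions.
    """
    ratio = nir / np.maximum(red, 1e-6)   # guard against divide-by-zero
    return (ratio >= t_low) & (ratio <= t_high)

nir = np.array([[30.0, 80.0], [55.0, 20.0]])
red = np.array([[25.0, 20.0], [20.0, 18.0]])
print(band_ratio_threshold(nir, red))     # True where the ratio is crop-like
```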

  16. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  17. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Comparison of manual and automated quantification methods of 123I-ADAM

    International Nuclear Information System (INIS)

    Kauppinen, T.; Keski-Rahkonen, A.; Sihvola, E.; Helsinki Univ. Central Hospital

    2005-01-01

    123I-ADAM is a novel radioligand for imaging of the brain serotonin transporters (SERTs). Traditionally, the analysis of brain receptor studies has been based on observer-dependent manual region of interest definitions and visual interpretation. Our aim was to create a template for automated image registrations and volume of interest (VOI) quantification and to show that an automated quantification method of 123I-ADAM is more repeatable than the manual method. Patients, methods: A template and a predefined VOI map were created from 123I-ADAM scans done for healthy volunteers (n=15). Scans of another group of healthy persons (HS, n=12) and patients with bulimia nervosa (BN, n=10) were automatically fitted to the template and specific binding ratios (SBRs) were calculated by using the VOI map. Manual VOI definitions were done for the HS and BN groups by both one and two observers. The repeatability of the automated method was evaluated by using the BN group. Results: For the manual method, the interobserver coefficient of repeatability was 0.61 for the HS group and 1.00 for the BN group. The intra-observer coefficient of repeatability for the BN group was 0.70. For the automated method, the coefficient of repeatability was 0.13 for SBRs in the midbrain. Conclusion: An automated quantification gives valuable information in addition to visual interpretation, also decreasing the total image handling time and giving clear advantages for research work. An automated method for analysing 123I-ADAM binding to the brain SERT gives repeatable results for fitting the studies to the template and for calculating SBRs, and could therefore replace manual methods. (orig.)

  19. A physical map of the heterozygous grapevine 'Cabernet Sauvignon' allows mapping candidate genes for disease resistance

    Directory of Open Access Journals (Sweden)

    Scalabrin Simone

    2008-06-01

    Full Text Available Abstract Background Whole-genome physical maps facilitate genome sequencing, sequence assembly, mapping of candidate genes, and the design of targeted genetic markers. An automated protocol was used to construct a Vitis vinifera 'Cabernet Sauvignon' physical map. The quality of the result was addressed with regard to the effect of high heterozygosity on the accuracy of contig assembly. Its usefulness for the genome-wide mapping of genes for disease resistance, which is an important trait for grapevine, was then assessed. Results The physical map included 29,727 BAC clones assembled into 1,770 contigs, spanning 715,684 kbp, and corresponding to 1.5-fold the genome size. Map inflation was due to high heterozygosity, which caused either the separation of allelic BACs into two different contigs, or local mis-assembly in contigs containing BACs from the two haplotypes. Genetic markers anchored 395 contigs or 255,476 kbp to chromosomes. The fully automated assembly and anchorage procedures were validated by BAC-by-BAC BLAST of the end sequences against the grape genome sequence, unveiling 7.3% chimeric contigs. The distribution across the physical map of candidate genes for non-host and host resistance, and for defence signalling pathways was then studied. NBS-LRR and RLK genes for host resistance were found in 424 contigs; 133 of them (32%) were assigned to chromosomes, on which they are mostly organised in clusters. Non-host and defence signalling genes were found in 99 contigs dispersed without a discernable pattern across the genome. Conclusion Despite some limitations that interfere with the correct assembly of heterozygous clones into contigs, the 'Cabernet Sauvignon' physical map is a useful and reliable intermediary step between a genetic map and the genome sequence. This tool was successfully exploited for a quick mapping of complex families of genes, and it strengthened previous clues of co-localisation of major NBS-LRR clusters and

  20. Archaeological field survey automation: concurrent multisensor site mapping and automated analysis

    Science.gov (United States)

    Józefowicz, Mateusz; Sokolov, Oleksandr; Meszyński, Sebastian; Siemińska, Dominika; Kołosowski, Przemysław

    2016-04-01

    ABM SE develops mobile robots (rovers) used for analog research of Mars exploration missions. The rovers are all-terrain exploration platforms carrying third-party payloads: scientific instrumentation. The "Wisdom" ground penetrating radar for the ExoMars mission has been tested onboard, as well as an electrical resistivity module and other devices. The robot has operated in various environments, such as the Central European countryside, the Dachstein ice caves and the Sahara, Morocco (controlled remotely via satellite from Toruń, Poland). Currently ABM SE works on a local and global positioning system for a Mars rover based on image and IMU data. This is performed under a project from ESA. In the next Mars rover missions a Mars GIS model will be built, including an acquired GPR profile, DEM and regular image data, integrated into a concurrent 3D terrain model. It is proposed to use a similar approach in surveys of archaeological sites, especially those where solid architecture remains can be expected at shallow depths or are partially exposed. It is possible to deploy a rover that will concurrently map a selected site with GPR and 2D and 3D cameras to create a site model. The rover image processing algorithms are capable of automatic tracing of distinctive features (such as exposed structure remains on a desert ground, differences in the color of the ground, etc.) and of marking regularities on a created map. It is also possible to correlate the 3D map with an aerial photo taken under any angle to achieve interpretation synergy. Currently the algorithms are an interpretation aid and their results must be confirmed by a human. The advantages of a rover over traditional approaches, such as a manual cart or a drone, include: a) long hours of continuous work or work in unfavorable environments, such as high desert, frozen water pools or large areas, b) concurrent multisensory data acquisition, c) working from the ground level, which enables capturing of sites obstructed from the air (trees), d) it is possible to

  1. Automated vocabulary discovery for geo-parsing online epidemic intelligence.

    Science.gov (United States)

    Keller, Mikaela; Freifeld, Clark C; Brownstein, John S

    2009-11-24

    Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such a surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need of human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments, that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.

  2. Automated detection of submerged navigational obstructions in freshwater impoundments with hull mounted sidescan sonar

    Science.gov (United States)

    Morris, Phillip A.

    The prevalence of low-cost side scanning sonar systems mounted on small recreational vessels has created improved opportunities to identify and map submerged navigational hazards in freshwater impoundments. However, these economical sensors also present unique challenges for automated techniques. This research explores related literature in automated sonar imagery processing and mapping technology, proposes and implements a framework derived from these sources, and evaluates the approach with video collected from a recreational grade sonar system. Image analysis techniques including optical character recognition and an unsupervised computer automated detection (CAD) algorithm are employed to extract the transducer GPS coordinates and slant range distance of objects protruding from the lake bottom. The retrieved information is formatted for inclusion into a spatial mapping model. Specific attributes of the sonar sensors are modeled such that probability profiles may be projected onto a three dimensional gridded map. These profiles are computed from multiple points of view as sonar traces crisscross or come near each other. As lake levels fluctuate over time so do the elevation points of view. With each sonar record, the probability of a hazard existing at certain elevations at the respective grid points is updated with Bayesian mechanics. As reinforcing data is collected, the confidence of the map improves. Given a lake's current elevation and a vessel draft, a final generated map can identify areas of the lake that have a high probability of containing hazards that threaten navigation. The approach is implemented in C/C++ utilizing OpenCV, Tesseract OCR, and QGIS open source software and evaluated in a designated test area at Lake Lavon, Collin County, Texas.
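
    The Bayesian map update described here is essentially an occupancy-grid update: each sonar pass adjusts the log-odds that a hazard reaches a given elevation at each grid cell, so repeated crisscrossing passes reinforce or suppress detections. A compact sketch of that mechanic, with assumed (illustrative) detector probabilities:

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

class HazardGrid:
    """Log-odds occupancy grid for submerged hazards at one elevation band."""

    def __init__(self, shape, p_prior=0.05):
        self.L = np.full(shape, logit(p_prior))

    def update(self, cell, detected, p_hit=0.8, p_false=0.1):
        """Bayesian log-odds update of one cell from a sonar pass.

        p_hit / p_false are assumed detector characteristics: the
        probability of flagging a cell given a hazard is / is not there.
        """
        if detected:
            self.L[cell] += np.log(p_hit / p_false)
        else:
            self.L[cell] += np.log((1 - p_hit) / (1 - p_false))

    def probability(self):
        return 1.0 / (1.0 + np.exp(-self.L))

grid = HazardGrid((100, 100))
for _ in range(3):                 # three crisscrossing passes agree
    grid.update((42, 17), detected=True)
print(f"P(hazard) after 3 hits: {grid.probability()[42, 17]:.3f}")
```

    Thresholding the final probability map at a vessel's draft-dependent confidence level then yields the navigational hazard map the abstract describes.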

  3. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible.

  4. Comparison of CT perfusion summary maps to early diffusion-weighted images in suspected acute middle cerebral artery stroke

    Energy Technology Data Exchange (ETDEWEB)

    Benson, John; Payabvash, Seyedmehdi [Hennepin County and University of Minnesota Medical Centers, Department of Radiology, Minneapolis, MN (United States); Salazar, Pascal [Vital Images, A Division of Toshiba Medical, Minnetonka, MN (United States); Jagadeesan, Bharathi; Palmer, Christopher S.; Truwit, Charles L. [Hennepin County and University of Minnesota Medical Centers, Department of Radiology, Minneapolis, MN (United States); McKinney, Alexander M., E-mail: mckinrad@umn.edu [Hennepin County and University of Minnesota Medical Centers, Department of Radiology, Minneapolis, MN (United States)

    2015-04-15

    Objectives: To assess the accuracy and reliability of one vendor's (Vital Images, Toshiba Medical, Minnetonka, MN) automated CT perfusion (CTP) summary maps in identification and volume estimation of infarcted tissue in patients with acute middle cerebral artery (MCA) distribution infarcts. Subjects and methods: From 1085 CTP examinations over 5.5 years, 43 diffusion-weighted imaging (DWI)-positive patients were included who underwent both CTP and DWI <12 h after symptom onset, with another 43 age-matched patients as controls (DWI-negative). Automated delay-corrected postprocessing software (DC-SVD) generated both infarct “core only” and “core + penumbra” CTP summary maps. Three reviewers independently tabulated Alberta Stroke Program Early CT scores (ASPECTS) of both CTP summary maps and coregistered DWI. Results: Of 86 included patients, 36 had DWI infarct volumes ≤70 ml, 7 had volumes >70 ml, and 43 were negative; the automated CTP “core only” map correctly classified each as >70 ml or ≤70 ml, while the “core + penumbra” map misclassified 4 as >70 ml. There were strong correlations between DWI volume with both summary map-based volumes: “core only” (r = 0.93), and “core + penumbra” (r = 0.77) (both p < 0.0001). Agreement between ASPECTS scores of infarct core on DWI with summary maps was 0.65–0.74 for “core only” map, and 0.61–0.65 for “core + penumbra” (both p < 0.0001). Using DWI-based ASPECTS scores as the standard, the accuracy of the CTP-based maps were 79.1–86.0% for the “core only” map, and 83.7–88.4% for “core + penumbra.” Conclusion: Automated CTP summary maps appear to be relatively accurate in both the detection of acute MCA distribution infarcts, and the discrimination of volumes using a 70 ml threshold.

  5. A semi-automated approach for mapping geomorphology of El Bardawil Lake, Northern Sinai, Egypt, using integrated remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    Nabil Sayed Embabi

    2014-06-01

    Full Text Available Among the coastal lakes of the Mediterranean northern coast of Egypt, Bardawil Lake is a unique lagoon, as it is fed only by seawater. The lagoon is composed of two main basins, and several other internal small basins interconnected to one another. Although the general geomorphologic characteristics are treated in some regional studies, we used a semi-automated approach based on a wide variety of digital image processing techniques for mapping the major geomorphological landforms of the lake at a medium scale of 1:250,000. The approach is based primarily on data fusion of a Landsat ETM+ image, and validated by other ancillary spatial data (e.g. topographic maps, Google Earth images and GPS in situ data). Interpretations of high-resolution space images by Google Earth and large-scale topographic maps (1:25,000), in particular, revealed new microforms and some detailed geomorphologic aspects with the aid of GPS measurements. Small sand barriers, submerged sand dunes, tidal channels, fans and flats, and micro-lagoons are the recurrent forms in the lake. The approach used in this study could be widely applied to study the low-lying coastal lands along the Nile Delta. It is concluded from geological data and geomorphologic aspects that Bardawil Lake is of tectonic origin; it was much deeper than it is currently, and has been filled with sediments mostly since the Flandrian transgression (∼8–6 ka BP).

  6. Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.

    Science.gov (United States)

    Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S

    2013-03-01

    Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.

  8. Automated side-chain model building and sequence assignment by template matching.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

    An algorithm is described for automated building of side chains in an electron-density map once a main-chain model is built and for alignment of the protein sequence to the map. The procedure is based on a comparison of electron density at the expected side-chain positions with electron-density templates. The templates are constructed from average amino-acid side-chain densities in 574 refined protein structures. For each contiguous segment of main chain, a matrix with entries corresponding to an estimate of the probability that each of the 20 amino acids is located at each position of the main-chain model is obtained. The probability that this segment corresponds to each possible alignment with the sequence of the protein is estimated using a Bayesian approach and high-confidence matches are kept. Once side-chain identities are determined, the most probable rotamer for each side chain is built into the model. The automated procedure has been implemented in the RESOLVE software. Combined with automated main-chain model building, the procedure produces a preliminary model suitable for refinement and extension by an experienced crystallographer.
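
    The procedure has two stages that are easy to prototype: score each main-chain position against the 20 side-chain templates, then slide the known protein sequence along the segment and pick the most probable offset. The sketch below uses a softmax over template correlations as a stand-in for RESOLVE's Bayesian scoring, with synthetic "density" vectors:

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def position_probabilities(density_vectors, templates):
    """P(amino acid | density) per main-chain position via template match.

    density_vectors : (n_positions, d) sampled density at each position
    templates       : (20, d) average side-chain density templates
    Softmax over correlation is an illustrative stand-in for the
    Bayesian scoring used in RESOLVE.
    """
    dv = density_vectors - density_vectors.mean(axis=1, keepdims=True)
    tp = templates - templates.mean(axis=1, keepdims=True)
    corr = (dv @ tp.T) / (np.linalg.norm(dv, axis=1)[:, None]
                          * np.linalg.norm(tp, axis=1)[None, :] + 1e-9)
    e = np.exp(corr * 10.0)          # sharpen; temperature is arbitrary
    return e / e.sum(axis=1, keepdims=True)

def best_alignment(probs, sequence):
    """Most probable offset of the built segment within the full sequence."""
    n = len(probs)
    scores = []
    for off in range(len(sequence) - n + 1):
        idx = [AA.index(a) for a in sequence[off:off + n]]
        scores.append(np.sum(np.log(probs[np.arange(n), idx])))
    return int(np.argmax(scores)), scores

# Demo: a 6-residue segment cut from a known sequence, plus noise.
rng = np.random.default_rng(3)
templates = rng.normal(size=(20, 16))
seq = "MKTAYIAKQRQISFVK"
true_off = 4
dens = templates[[AA.index(a) for a in seq[true_off:true_off + 6]]] \
       + rng.normal(0, 0.3, (6, 16))
off, _ = best_alignment(position_probabilities(dens, templates), seq)
print("recovered offset:", off)  # expected: 4
```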

  9. Automating the Fireshed Assessment Process with ArcGIS

    Science.gov (United States)

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  10. National Scale Rainfall Map Based on Linearly Interpolated Data from Automated Weather Stations and Rain Gauges

    Science.gov (United States)

    Alconis, Jenalyn; Eco, Rodrigo; Mahar Francisco Lagmay, Alfredo; Lester Saddi, Ivan; Mongaya, Candeze; Figueroa, Kathleen Gay

    2014-05-01

    In response to the slew of disasters that devastate the Philippines on a regular basis, the national government put in place a program to address this problem. The Nationwide Operational Assessment of Hazards, or Project NOAH, consolidates the diverse scientific research being done and pushes the knowledge gained to the forefront of disaster risk reduction and management. Current activities of the project include installing rain gauges and water level sensors, conducting LIDAR surveys of critical river basins, geo-hazard mapping, and running information education campaigns. Approximately 700 automated weather stations and rain gauges installed in strategic locations in the Philippines provide the groundwork for the rainfall visualization system in the Project NOAH web portal at http://noah.dost.gov.ph. The system uses near real-time data from these stations installed in critical river basins. The sensors record the amount of rainfall in a particular area as point data updated every 10 to 15 minutes. Each sensor sends its data to a central server either via the GSM network or via satellite data transfer for redundancy. The web portal displays the sensors as a placemarks layer on a map. When a placemark is clicked, it displays a graph of the rainfall data for the past 24 hours. The rainfall data are harvested in batches determined by a one-hour time frame. The program uses linear interpolation as the methodology to visually represent a near real-time rainfall map. The algorithm allows very fast processing, which is essential in near real-time systems. As more sensors are installed, precision is improved. This visualized dataset enables users to quickly discern where heavy rainfall is concentrated. It has proven invaluable on numerous occasions, such as in August 2013 when intense to torrential rains brought about by the enhanced Southwest Monsoon caused massive flooding in Metro Manila. Coupled with observations from Doppler imagery and water level sensors along the
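
    The mapping step itself, linear interpolation of scattered gauge readings onto a regular grid, is a few lines with SciPy's Delaunay-based interpolator. A minimal sketch with synthetic station data standing in for the ~700 gauges:

```python
import numpy as np
from scipy.interpolate import griddata

# Synthetic stations: lon, lat, rainfall over the past hour (mm).
rng = np.random.default_rng(7)
lon = rng.uniform(117.0, 127.0, 700)     # roughly Philippine longitudes
lat = rng.uniform(5.0, 19.0, 700)
rain = rng.gamma(shape=1.5, scale=4.0, size=700)

# Target grid for the rainfall map.
glon, glat = np.meshgrid(np.linspace(117, 127, 400),
                         np.linspace(5, 19, 560))

# Delaunay-based linear interpolation; NaN outside the convex hull.
rain_map = griddata((lon, lat), rain, (glon, glat), method="linear")
print(rain_map.shape, np.nanmax(rain_map).round(1))
```

    Linear interpolation trades smoothness for speed, which matches the abstract's emphasis on fast processing for a near real-time display.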

  11. A Cognitive Support Framework for Ontology Mapping

    Science.gov (United States)

    Falconer, Sean M.; Storey, Margaret-Anne

    Ontology mapping is the key to data interoperability in the semantic web. This problem has received a lot of research attention; however, the emphasis has been mostly devoted to automating the mapping process, even though the creation of mappings often involves the user. As industry interest in semantic web technologies grows and the number of widely adopted semantic web applications increases, we must begin to support the user. In this paper, we combine data gathered from background literature, theories of cognitive support and decision making, and an observational case study to propose a theoretical framework for cognitive support in ontology mapping tools. We also describe a tool called CogZ that is based on this framework.

  13. EzMap: a simple pipeline for reproducible analysis of the human virome.

    Science.gov (United States)

    Czeczko, Patrick; Greenway, Steven C; de Koning, A P Jason

    2017-08-15

    In solid-organ transplant recipients, a delicate balance between immunosuppression and immunocompetence must be achieved, which can be difficult to monitor in real-time. Shotgun sequencing of cell-free DNA (cfDNA) has been recently proposed as a new way to indirectly assess immune function in transplant recipients through analysis of the status of the human virome. To facilitate exploration of the utility of the human virome as an indicator of immune status, and to enable rapid, straightforward analyses by clinicians, we developed a fully automated computational pipeline, EzMap, for performing metagenomic analysis of the human virome. EzMap combines a number of tools to clean, filter, and subtract WGS reads by mapping to a reference human assembly. The relative abundance of each virus present is estimated using a maximum likelihood approach that accounts for genome size, and results are presented with interactive visualizations and taxonomy-based summaries that enable rapid insights. The pipeline is automated to run on both workstations and computing clusters for all steps. EzMap automates an otherwise tedious and time-consuming protocol and aims to facilitate rapid and reproducible insights from cfDNA. EzMap is freely available at https://github.com/dekoning-lab/ezmap.
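    To make the genome-size correction concrete, the toy calculation below (Python) normalizes mapped read counts by genome length before rescaling to proportions; EzMap itself uses a maximum likelihood estimator, and the virus names, counts, and sizes here are invented.

      # Larger genomes attract more reads per genome copy, so raw counts
      # overstate their abundance; divide by genome size first.
      mapped_reads = {"HHV-6": 900, "Anellovirus": 300, "Adenovirus": 120}
      genome_size_bp = {"HHV-6": 160_000, "Anellovirus": 3_800, "Adenovirus": 36_000}

      rate = {v: mapped_reads[v] / genome_size_bp[v] for v in mapped_reads}
      total = sum(rate.values())
      abundance = {v: rate[v] / total for v in rate}
      print(abundance)   # Anellovirus dominates once genome size is accounted for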

  14. Automated lineament mapping from remotely sensed data: case ...

    African Journals Online (AJOL)

    The automatic procedures of PCI LINE and Imagine Objective Line Extraction were adopted to extract lineaments from Landsat OLI imagery (band 6) and Digital Elevation Models (SPOT DEM and ASTER DEM) of Osun Drainage Basin, Southwestern Nigeria. This was with a view to optimally map lineaments within the basin ...

  15. Automated delineation and characterization of drumlins using a localized contour tree approach

    Science.gov (United States)

    Wang, Shujie; Wu, Qiusheng; Ward, Dylan

    2017-10-01

    Drumlins are ubiquitous landforms in previously glaciated regions, formed through a series of complex subglacial processes operating underneath paleo-ice sheets. Accurate delineation and characterization of drumlins are essential for understanding their formation mechanism as well as the flow behaviors and basal conditions of paleo-ice sheets. Automated mapping of drumlins is particularly important for examining their distribution patterns across large spatial scales. This paper presents an automated vector-based approach to mapping drumlins from high-resolution light detection and ranging (LiDAR) data. The rationale is to extract sets of concentric contours by building localized contour trees and establishing topological relationships. This automated method can overcome the shortcomings of previous manual and automated methods for mapping drumlins, for instance the azimuthal biases introduced during the generation of shaded relief images. A case study was carried out over a portion of the New York Drumlin Field. Overall, 1181 drumlins were identified from the LiDAR-derived DEM across the study region, a count that had been underestimated in previous literature. The delineation results were visually and statistically compared to manual digitization results. The morphology of the drumlins was characterized by quantifying length, width, elongation ratio, height, area, and volume. Statistical and spatial analyses were conducted to examine the distribution pattern and spatial variability of drumlin size and form. The drumlins and their morphologic characteristics exhibit significant spatial clustering rather than random patterns. The form of the drumlins varies from ovoid to spindle shapes toward the downstream direction of paleo-ice flow, along with decreases in width, area, and volume. This observation is in line with previous studies, and may be explained by variations in sediment thickness and/or the velocity increases of ice flows
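    The core "concentric contour" test can be caricatured in a few lines; the fragment below (Python) generates closed contours from a DEM and keeps those that enclose at least two higher, nested closed contours. The DEM file, contour interval, and nesting threshold are assumptions, and the full method builds proper localized contour trees rather than this brute-force containment check.

      import numpy as np
      import matplotlib.pyplot as plt
      from shapely.geometry import Polygon

      dem = np.load("lidar_dem.npy")                    # hypothetical gridded LiDAR DEM
      levels = np.arange(dem.min(), dem.max(), 0.5)     # assumed 0.5 m contour interval
      cs = plt.contour(dem, levels=levels)

      closed = []
      for level, segs in zip(cs.levels, cs.allsegs):
          for seg in segs:
              if len(seg) > 3 and np.allclose(seg[0], seg[-1]):   # keep closed contours only
                  closed.append((level, Polygon(seg)))

      # Candidate drumlin: a closed contour enclosing >= 2 higher nested contours
      candidates = [(lvl, poly) for lvl, poly in closed
                    if sum(p2.within(poly) for l2, p2 in closed if l2 > lvl) >= 2]
      print(len(candidates), "candidate drumlin outlines")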

  16. An automated dose tracking system for adaptive radiation therapy.

    Science.gov (United States)

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient images were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by the integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient

  17. High resolution hybrid optical and acoustic sea floor maps (Invited)

    Science.gov (United States)

    Roman, C.; Inglis, G.

    2013-12-01

    This abstract presents a method for creating hybrid optical and acoustic sea floor reconstructions at centimeter-scale grid resolutions with robotic vehicles. Multibeam sonar and stereo vision are two common sensing modalities with complementary strengths that are well suited for data fusion. We have recently developed an automated two-stage pipeline to create such maps; the steps can be broken down as navigation refinement and map construction. During navigation refinement, a graph-based optimization algorithm is used to align 3D point clouds created with both the multibeam sonar and the stereo cameras. This combats the typical growth in navigation error, which has a detrimental effect on map fidelity and typically introduces artifacts at small grid sizes. During this process we are able to automatically register local point clouds created by each sensor to themselves and to each other where they overlap in a survey pattern. The process also estimates the sensor offsets, such as heading, pitch and roll, that describe how each sensor is mounted to the vehicle. The end result of the navigation step is a refined vehicle trajectory that ensures the point clouds from each sensor are consistently aligned, along with the individual sensor offsets. In the mapping step, grid cells in the map are selectively populated by choosing data points from each sensor in an automated manner. The selection process is designed to pick points that preserve the best characteristics of each sensor and honor specific map quality criteria to reduce outliers and ghosting. In general, the algorithm selects dense 3D stereo points in areas of high texture and point density. In areas where the stereo vision is poor, such as in a scene with low contrast or texture, multibeam sonar points are inserted into the map. This process is automated and results in a hybrid map populated with data from both sensors. Additional cross-modality checks are made to reject outliers in a robust manner. The final
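    A stripped-down version of such a per-cell selection rule is sketched below (Python); the gridded inputs, the stereo point-count threshold, and the fallback logic are assumptions standing in for the authors' full quality criteria.

      import numpy as np

      stereo_depth = np.load("stereo_grid.npy")    # gridded stereo depths, NaN = no data (toy input)
      stereo_count = np.load("stereo_count.npy")   # stereo points per cell
      sonar_depth = np.load("sonar_grid.npy")      # gridded multibeam depths

      MIN_STEREO_PTS = 20                          # assumed density criterion
      use_stereo = (stereo_count >= MIN_STEREO_PTS) & ~np.isnan(stereo_depth)

      hybrid = np.where(use_stereo, stereo_depth, sonar_depth)   # fused map
      source = np.where(use_stereo, 1, 2)                        # provenance: 1 = stereo, 2 = sonar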

  18. Development of an automated processing system for potential fishing zone forecast

    Science.gov (United States)

    Ardianto, R.; Setiawan, A.; Hidayat, J. J.; Zaky, A. R.

    2017-01-01

    The Institute for Marine Research and Observation (IMRO) of the Ministry of Marine Affairs and Fisheries, Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast using satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fisheries marine ports and 3-day averages for national areas. The accuracy in determining the PFZ and the processing time of the maps depend heavily on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZs are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data to detect the presence of upwelling, thermal fronts and enhanced biological productivity, the integration of these phenomena generally representing the PFZ. The whole process involves data download, map geoprocessing and layout, all carried out automatically with Python and ArcPy. The results showed that the automated processing system can be used to reduce dependence on operator experience in determining the PFZ and to speed up processing time.
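    The detection logic can be caricatured as a pair of per-pixel tests, as in the fragment below (Python); the input arrays and the gradient and chlorophyll thresholds are invented placeholders, not the operational IMRO values.

      import numpy as np

      sst = np.load("modis_sst.npy")        # sea surface temperature, deg C (toy scene)
      chl = np.load("modis_chl.npy")        # chlorophyll-a, mg m-3

      gy, gx = np.gradient(sst)
      sst_front = np.hypot(gx, gy) > 0.5    # strong horizontal SST gradient = thermal front
      productive = chl > 0.3                # biologically enhanced water

      pfz_mask = sst_front & productive     # candidate potential fishing zones
      print(f"PFZ pixels: {pfz_mask.sum()}")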

  19. SoilGrids1km--global soil information based on automated mapping.

    Directory of Open Access Journals (Sweden)

    Tomislav Hengl

    Full Text Available BACKGROUND: Soils are widely recognized as a non-renewable natural resource and as biophysical carbon sinks. As such, there is a growing requirement for global soil information. Although several global soil information systems already exist, these tend to suffer from inconsistencies and limited spatial detail. METHODOLOGY/PRINCIPAL FINDINGS: We present SoilGrids1km--a global 3D soil information system at 1 km resolution--containing spatial predictions, at six standard depths, for a selection of soil properties: soil organic carbon (g kg-1), soil pH, sand, silt and clay fractions (%), bulk density (kg m-3), cation-exchange capacity (cmol+ kg-1), coarse fragments (%), soil organic carbon stock (t ha-1), depth to bedrock (cm), World Reference Base soil groups, and USDA Soil Taxonomy suborders. Our predictions are based on global spatial prediction models which we fitted, per soil variable, using a compilation of major international soil profile databases (ca. 110,000 soil profiles) and a selection of ca. 75 global environmental covariates representing soil forming factors. Results of regression modeling indicate that the most useful covariates for modeling soils at the global scale are climatic and biomass indices (based on MODIS images), lithology, and taxonomic mapping units derived from conventional soil survey (Harmonized World Soil Database). Prediction accuracies assessed using 5-fold cross-validation were between 23% and 51%. CONCLUSIONS/SIGNIFICANCE: SoilGrids1km provides an initial set of examples of soil spatial data for input into global models at a resolution and consistency not previously available. Some of the main limitations of the current version of SoilGrids1km are: (1) weak relationships between soil properties/classes and explanatory variables due to scale mismatches, (2) difficulty in obtaining covariates that capture soil forming factors, and (3) low sampling density and spatial clustering of soil profile locations. However, as the SoilGrids system is

  20. Evaluation of using digital gravity field models for zoning map creation

    Science.gov (United States)

    Loginov, Dmitry

    2018-05-01

    At the present time, digital cartographic models of geophysical fields are taking on special significance in geophysical mapping. One important direction for their application is the creation of zoning maps, which take the morphology of a geophysical field into account when implementing automated selection of contour intervals. The purpose of this work is the comparative evaluation of various digital models for the creation of an integrated gravity field zoning map. For comparison, we chose the digital model of the gravity field of Russia, created from the analog map at a scale of 1 : 2 500 000, and the open global model of the Earth's gravity field, WGM2012. As a result of the experimental work, four integrated gravity field zoning maps were obtained using raw and processed data from each gravity field model. The study demonstrates that open data can be used to create integrated zoning maps, provided the noise component of the model is eliminated by processing in specialized software systems. In this case, for solving the problem of automated selection of contour intervals, the open digital models are not inferior to regional gravity field models created for individual countries. This demonstrates the universality and independence of integrated zoning map creation, regardless of the detail of the underlying digital cartographic model of geophysical fields.

  1. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique.
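    The final artery/vein decision reduces to comparing Mahalanobis distances against ROI statistics; the sketch below (Python) illustrates that step with invented feature vectors and a pooled within-class covariance, and is not the paper's full multi-feature pipeline.

      import numpy as np

      def mahalanobis(x, mean, cov_inv):
          d = x - mean
          return float(np.sqrt(d @ cov_inv @ d))

      # Toy ROI samples: rows are voxels, columns are temporal features
      artery_feats = np.random.randn(200, 3) + [1.0, 0.0, 0.0]
      vein_feats = np.random.randn(200, 3) + [0.0, 1.0, 0.0]

      # Pooled sample covariance over the two ROIs
      n_a, n_v = len(artery_feats), len(vein_feats)
      pooled = ((n_a - 1) * np.cov(artery_feats.T) + (n_v - 1) * np.cov(vein_feats.T)) / (n_a + n_v - 2)
      cov_inv = np.linalg.inv(pooled)
      mu_a, mu_v = artery_feats.mean(0), vein_feats.mean(0)

      voxel = np.array([0.9, 0.1, 0.05])   # feature vector of one voxel (toy)
      label = "artery" if mahalanobis(voxel, mu_a, cov_inv) < mahalanobis(voxel, mu_v, cov_inv) else "vein"
      print(label)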

  2. The Buccaneer software for automated model building. 1. Tracing protein chains.

    Science.gov (United States)

    Cowtan, Kevin

    2006-09-01

    A new technique for the automated tracing of protein chains in experimental electron-density maps is described. The technique relies on the repeated application of an oriented electron-density likelihood target function to identify likely C(alpha) positions. This function is applied both in the location of a few promising 'seed' positions in the map and to grow those initial C(alpha) positions into extended chain fragments. Techniques for assembling the chain fragments into an initial chain trace are discussed.

  3. 2nd Road Vehicle Automation Workshop

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation

    2014-01-01

    This contributed volume covers all relevant aspects of road vehicle automation including societal impacts, legal matters, and technology innovation from the perspectives of a multitude of public and private actors. It is based on an expert workshop organized by the Transportation Research Board at Stanford University in July 2013. The target audience primarily comprises academic researchers, but the book may also be of interest to practitioners and professionals. Higher levels of road vehicle automation are considered beneficial for road safety, energy efficiency, productivity, convenience, and social inclusion. The necessary key technologies in the fields of object-recognition systems, data processing, and infrastructure communication have been consistently developed over the recent years and are mostly available on the market today. However, there is still a need for substantial research and development, e.g. with interactive maps, data processing, functional safety, and the fusion of different data sources...

  4. Automated georeference of the 1: 20,000 Romanian maps under Lambert-Cholesky (1916-1959) projection system

    Science.gov (United States)

    Rus, I.; Balint, C.; Craciunescu, V.; Constantinescu, S.; Ovejanu, I.; Bartos-Elekes, Zs.

    2009-04-01

    ), into the Lambert-Cholesky projection, were made by the Romanian officers. The basic map, called „Plan Director de Tragere", was drafted at 1:20,000 scale in 2118 sheets covering the entire Romanian territory. Graphically, each sheet measured 75 cm (the equivalent of 15 km of terrain) by 50 cm (the equivalent of 12 km of terrain). Usually at the upper part of the map, most often to the left side, the sheet nomenclature appeared, constructed on the following principle: the first two characters gave the column number and the last two the row number. Thus, a sheet whose south-west corner had the Cartesian coordinates 10 km, 20 km would receive the code 1020. Manual georeferencing of the entire map sheet database is a meticulous and time-consuming process. To overcome these disadvantages and to increase rectification precision, an automated procedure was created. The whole process of raster sheet georeferencing is done by a specially developed tool which relies on the Radon transform to extract all the straight lines and form a graticule network, even under the degraded and noisy conditions of the original rasters. Then, with each sheet's spatial position known from its labeling scheme, all intersection points of the graticule are labeled with correct coordinates; in this way sheets are rapidly batch-georeferenced in the most accurate fashion.
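    For illustration, the line-extraction step might look like the fragment below (Python, scikit-image): the Radon transform of a binarized sheet concentrates each straight graticule line into a sinogram peak, which is robust to the noise and degradation of the scanned originals. The file name, binarization cutoff, and peak threshold are invented.

      import numpy as np
      from skimage.io import imread
      from skimage.transform import radon

      sheet = imread("plan_director_1020.png", as_gray=True)   # hypothetical scanned sheet
      binary = (sheet < 0.3).astype(float)                     # dark ink on light paper

      theta = np.linspace(0.0, 180.0, 361)
      sinogram = radon(binary, theta=theta, circle=False)

      # Straight lines show up as strong isolated peaks in (offset, angle) space
      offsets, angles = np.where(sinogram > 0.8 * sinogram.max())
      for off, ang in zip(offsets, angles):
          print(f"line candidate at theta = {theta[ang]:.1f} deg, offset bin {off}")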

  5. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping.

  6. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  7. A Software Architecture for Simulation Support in Building Automation

    Directory of Open Access Journals (Sweden)

    Sergio Leal

    2014-07-01

    Full Text Available Building automation integrates the active components in a building and thus has to connect components from different industries. The goal is to provide reliable and efficient operation. This paper describes how simulation can support building automation and how the deployment process of simulation-assisted building control systems can be structured. We look at the process as a whole and map it to a set of formally described workflows that can partly be automated. A workbench environment supports the process execution by means of improved planning, collaboration and deployment. This framework allows integration of existing tools as well as manual tasks, and is therefore far more intricate than regular software deployment tools. The complex environment of building commissioning requires expertise in different domains, especially lighting, heating, ventilation, air conditioning, measurement and control technology, as well as energy efficiency; we therefore present a framework for building commissioning and describe a deployment process capable of supporting the various phases of this approach.

  8. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  9. Global Rapid Flood Mapping System with Spaceborne SAR Data

    Science.gov (United States)

    Yun, S. H.; Owen, S. E.; Hua, H.; Agram, P. S.; Fattahi, H.; Liang, C.; Manipon, G.; Fielding, E. J.; Rosen, P. A.; Webb, F.; Simons, M.

    2017-12-01

    As part of the Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards at NASA's Jet Propulsion Laboratory and California Institute of Technology, we have developed an automated system that produces derived products for flood extent map generation using spaceborne SAR data. The system takes as user input area-of-interest polygons and a time window for SAR data search (pre- and post-event). It then automatically searches and downloads SAR data, processes it to produce coregistered SAR image pairs, and generates log amplitude ratio images from each pair. Currently the system is automated to support SAR data from the European Space Agency's Sentinel-1A/B satellites. We have used the system to produce flood extent maps from Sentinel-1 SAR data for the May 2017 Sri Lanka floods, which killed more than 200 people and displaced about 600,000. Our flood extent maps were delivered to the Red Cross to support response efforts. Earlier, we also responded to the historic August 2016 Louisiana floods in the United States, which claimed 13 lives and caused over $10 billion in property damage. For this event, we made synchronized observations from space, air, and ground in close collaboration with USGS and NOAA. The USGS field crews acquired ground observation data, and NOAA acquired high-resolution airborne optical imagery within a window of +/-2 hours of the SAR data acquisition by JAXA's ALOS-2 satellite. The USGS coordinates of flood water boundaries were used to calibrate our flood extent map derived from the ALOS-2 SAR data, and the map was delivered to FEMA for estimating the number of households affected. Based on the lessons learned from this response effort, we customized the ARIA system automation for rapid flood mapping and developed a mobile-friendly web app that can easily be used in the field for data collection. Rapid automatic generation of SAR-based global flood maps calibrated with independent observations from
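    The core derived product is simple to state; the fragment below (Python) forms the log amplitude ratio of coregistered pre/post-event images and thresholds it for newly radar-dark (flooded) pixels. The input arrays and the threshold value are illustrative assumptions.

      import numpy as np

      pre = np.load("s1_pre_amplitude.npy")     # coregistered pre-event amplitude (toy input)
      post = np.load("s1_post_amplitude.npy")   # coregistered post-event amplitude

      eps = 1e-6                                # avoid division by zero
      log_ratio = np.log10((post + eps) / (pre + eps))

      # Smooth open water is specular, so new flooding strongly lowers backscatter
      flood_mask = log_ratio < -0.5
      print(f"flagged flood pixels: {flood_mask.sum()}")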

  10. Corrosion mapping in ducts using the automated ultrasonic technique C-Scan - correlation with results given by pig inspection

    Energy Technology Data Exchange (ETDEWEB)

    Feres Filho, Pedro; Moura, Nestor Carlos de [Physical Acoustics South America (PASA), Sao Paulo, SP (Brazil)

    2005-07-01

    In-service inspection has benefited from diverse technologies and documents aimed at maximizing equipment availability and minimizing inadequate repairs. Among the available technologies are automated ultrasound tests, in B- and C-scan versions. This paper describes an evaluation methodology based on the correlation between two test techniques, the instrumented electromagnetic PIG and automated ultrasound, both applied with the purpose of detecting and mapping areas with corrosion in pipes for oil transport. The main objective of applying the C-scan methodology in this case was the measuring and detailing of the corroded area, thus supporting an adequate maintenance plan through the substitution or installation of a double gutter. The results demonstrate the correlation between the measurements taken by the PIG and the sizing of the regions done using the C-scan method, comprising the length, width and thickness values at the points affected by corrosion. (author)

  11. Automated mapping of mineral groups and green vegetation from Landsat Thematic Mapper imagery with an example from the San Juan Mountains, Colorado

    Science.gov (United States)

    Rockwell, Barnaby W.

    2013-01-01

    Multispectral satellite data acquired by the ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) and Landsat 7 Enhanced Thematic Mapper Plus (TM) sensors are being used to populate an online Geographic Information System (GIS) of the spatial occurrence of mineral groups and green vegetation across the western conterminous United States and Alaska. These geospatial data are supporting U.S. Geological Survey national-scale mineral deposit database development and other mineral resource and geoenvironmental research as a means of characterizing mineral exposures related to mined and unmined hydrothermally altered rocks and mine waste. This report introduces a new methodology for the automated analysis of Landsat TM data that has been applied to more than 180 scenes covering the western United States. A map of mineral groups and green vegetation produced using this new methodology that covers the western San Juan Mountains, Colorado, and the Four Corners Region is presented. The map is provided as a layered GeoPDF and in GIS-ready digital format. TM data analysis results from other well-studied and mineralogically characterized areas with strong hydrothermal alteration and (or) supergene weathering of near-surface sulfide minerals are also shown and compared with results derived from ASTER data analysis.
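    While the report's specific algorithm is not reproduced here, the classic band-ratio logic such mapping builds on can be sketched as below (Python): for Landsat TM reflectance, the 3/1 ratio highlights ferric iron minerals, the 5/7 ratio highlights clay/sulfate (hydroxyl-bearing) alteration, and an NDVI screen removes green vegetation. The inputs and thresholds are illustrative assumptions.

      import numpy as np

      b1 = np.load("tm_band1.npy").astype(float)   # blue (toy reflectance grids)
      b3 = np.load("tm_band3.npy").astype(float)   # red
      b4 = np.load("tm_band4.npy").astype(float)   # near infrared
      b5 = np.load("tm_band5.npy").astype(float)   # SWIR1
      b7 = np.load("tm_band7.npy").astype(float)   # SWIR2

      eps = 1e-6
      iron_oxide = (b3 / (b1 + eps)) > 2.0         # ferric iron absorption in the blue
      clays = (b5 / (b7 + eps)) > 2.2              # hydroxyl/clay absorption near 2.2 um
      green_veg = ((b4 - b3) / (b4 + b3 + eps)) > 0.3   # NDVI screen

      altered = (iron_oxide | clays) & ~green_veg  # candidate altered-rock pixels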

  12. Rapid, automated mosaicking of the human corneal subbasal nerve plexus.

    Science.gov (United States)

    Vaishnav, Yash J; Rucker, Stuart A; Saharia, Keshav; McNamara, Nancy A

    2017-11-27

    Corneal confocal microscopy (CCM) is an in vivo technique used to study corneal nerve morphology. The largest proportion of nerves innervating the cornea lie within the subbasal nerve plexus, where their morphology is altered by refractive surgery, diabetes and dry eye. The main limitations to clinical use of CCM as a diagnostic tool are the small field of view of CCM images and the lengthy time needed to quantify nerves in collected images. Here, we present a novel, rapid, fully automated technique to mosaic individual CCM images into wide-field maps of corneal nerves. We implemented an OpenCV image stitcher that accounts for corneal deformation and uses feature detection to stitch CCM images into a montage. The method takes 3-5 min to process and stitch 40-100 frames on an Amazon EC2 Micro instance. The speed, automation and ease of use conferred by this technique are the first step toward point-of-care evaluation of wide-field subbasal plexus (SBP) maps in a clinical setting.
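    A minimal OpenCV equivalent of the mosaicking call is sketched below (Python); the frame directory is invented, and the authors' pipeline adds corneal-deformation handling on top of the stock stitcher.

      import glob
      import cv2

      frames = [cv2.imread(p) for p in sorted(glob.glob("ccm_frames/*.png"))]

      # SCANS mode suits flat, translating fields of view better than PANORAMA
      stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
      status, mosaic = stitcher.stitch(frames)

      if status == cv2.Stitcher_OK:
          cv2.imwrite("subbasal_plexus_map.png", mosaic)
      else:
          print("stitching failed, status code", status)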

  13. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias.

  14. BikeMaps.org: A Global Tool for Collision and Near Miss Mapping.

    Science.gov (United States)

    Nelson, Trisalyn A; Denouden, Taylor; Jestico, Benjamin; Laberee, Karen; Winters, Meghan

    2015-01-01

    There are many public health benefits to cycling, such as chronic disease reduction and improved air quality. Real and perceived concerns about safety are primary barriers to new ridership. Due to limited forums for official reporting of cycling incidents, the lack of comprehensive data is limiting our ability to study cycling safety and conduct surveillance. Our goal is to introduce BikeMaps.org, a new website developed by the authors for crowd-sourced mapping of cycling collisions and near misses. BikeMaps.org is a global mapping system that allows citizens to map locations of cycling incidents and report on the nature of the event. The attributes collected are designed for spatial modeling research on predictors of safety and risk, and to aid surveillance and planning. Released in October 2014, within 2 months the website had more than 14,000 visitors and mapping in 14 countries. Collisions represent 38% of reports (134/356) and near misses 62% (222/356). In our pilot city, Victoria, Canada, citizens mapped data equivalent to about 1 year of official cycling collision reports within 2 months via BikeMaps.org. Using report completeness as an indicator, early reports indicate that data are of high quality, with 50% being fully attributed and another 10% having only one missing attribute. We are advancing this technology with the development of a mobile app, improved data visualization, real-time alerting of hazard reports, and automated open-source tools for data sharing. Researchers and citizens interested in utilizing the BikeMaps.org technology can get involved by encouraging citizen mapping in their region.

  15. aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data

    Science.gov (United States)

    Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.

    2016-01-01

    The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127

  16. Fast semi-automated lesion demarcation in stroke

    Directory of Open Access Journals (Sweden)

    Bianca de Haan

    2015-01-01

    Full Text Available Lesion–behaviour mapping analyses require the demarcation of the brain lesion on each (usually transverse) slice of the individual stroke patient's brain image. To date, this is generally thought to be most precise when done manually, which is, however, both time-consuming and potentially observer-dependent. Fully automated lesion demarcation methods have been developed to address these issues, but these are often not practicable in acute stroke research, where for each patient only a single image modality is available and the available image modality differs over patients. In the current study, we evaluated a semi-automated lesion demarcation approach, the so-called Clusterize algorithm, in acute stroke patients scanned in a range of common image modalities. Our results suggest that, compared to the standard of manual lesion demarcation, the semi-automated Clusterize algorithm is capable of significantly speeding up lesion demarcation in the most commonly used image modalities, without loss of either lesion demarcation precision or reproducibility. For the three investigated acute datasets (CT, DWI, T2-FLAIR), containing a total of 44 patient images obtained in a regular clinical setting at patient admission, the reduction in processing time was on average 17.8 min per patient, and this advantage increased with increasing lesion volume (up to 60 min per patient for the largest lesion volumes in our datasets). Additionally, our results suggest that performance of the Clusterize algorithm in a chronic dataset with 11 T1 images was comparable to its performance in the acute datasets. We thus advocate the use of the Clusterize algorithm, integrated into a simple, freely available SPM toolbox, for the precise, reliable and fast preparation of imaging data for lesion–behaviour mapping analyses.

  17. Automated landmark-guided deformable image registration.

    Science.gov (United States)

    Kearney, Vasant; Chen, Susie; Gu, Xuejun; Chiu, Tsuicheng; Liu, Honghuan; Jiang, Lan; Wang, Jing; Yordy, John; Nedzi, Lucien; Mao, Weihua

    2015-01-07

    The purpose of this work is to develop an automated landmark-guided deformable image registration (LDIR) algorithm between the planning CT and daily cone-beam CT (CBCT) with low image quality. This method uses an automated landmark generation algorithm in conjunction with a local small volume gradient matching search engine to map corresponding landmarks between the CBCT and the planning CT. The landmarks act as stabilizing control points in the following Demons deformable image registration. LDIR is implemented on graphics processing units (GPUs) for parallel computation to achieve ultra fast calculation. The accuracy of the LDIR algorithm has been evaluated on a synthetic case in the presence of different noise levels and data of six head and neck cancer patients. The results indicate that LDIR performed better than rigid registration, Demons, and intensity corrected Demons for all similarity metrics used. In conclusion, LDIR achieves high accuracy in the presence of multimodality intensity mismatch and CBCT noise contamination, while simultaneously preserving high computational efficiency.

  18. Automated landmark-guided deformable image registration

    International Nuclear Information System (INIS)

    Kearney, Vasant; Chen, Susie; Gu, Xuejun; Chiu, Tsuicheng; Liu, Honghuan; Jiang, Lan; Wang, Jing; Yordy, John; Nedzi, Lucien; Mao, Weihua

    2015-01-01

    The purpose of this work is to develop an automated landmark-guided deformable image registration (LDIR) algorithm between the planning CT and daily cone-beam CT (CBCT) with low image quality. This method uses an automated landmark generation algorithm in conjunction with a local small volume gradient matching search engine to map corresponding landmarks between the CBCT and the planning CT. The landmarks act as stabilizing control points in the following Demons deformable image registration. LDIR is implemented on graphics processing units (GPUs) for parallel computation to achieve ultra fast calculation. The accuracy of the LDIR algorithm has been evaluated on a synthetic case in the presence of different noise levels and data of six head and neck cancer patients. The results indicate that LDIR performed better than rigid registration, Demons, and intensity corrected Demons for all similarity metrics used. In conclusion, LDIR achieves high accuracy in the presence of multimodality intensity mismatch and CBCT noise contamination, while simultaneously preserving high computational efficiency. (paper)

  19. Automated T2 relaxometry of the hippocampus for temporal lobe epilepsy.

    Science.gov (United States)

    Winston, Gavin P; Vos, Sjoerd B; Burdett, Jane L; Cardoso, M Jorge; Ourselin, Sebastien; Duncan, John S

    2017-09-01

    Hippocampal sclerosis (HS), the most common cause of refractory temporal lobe epilepsy, is associated with hippocampal volume loss and increased T2 signal. These can be identified on quantitative imaging with hippocampal volumetry and T2 relaxometry. Although hippocampal segmentation for volumetry has been automated, T2 relaxometry currently involves subjective and time-consuming manual delineation of regions of interest. In this work, we develop and validate an automated technique for hippocampal T2 relaxometry. Fifty patients with unilateral or bilateral HS and 50 healthy controls underwent T1-weighted and dual-echo fast recovery fast spin echo scans. Hippocampi were automatically segmented using a multi-atlas-based segmentation algorithm (STEPS) and a template database. Voxelwise T2 maps were determined using a monoexponential fit. The hippocampal segmentations were registered to the T2 maps and eroded to reduce partial volume effects. Voxels with T2 > 170 msec were excluded to minimize cerebrospinal fluid (CSF) contamination. Manual determination of T2 values was performed twice in each subject. Twenty controls underwent repeat scans to assess interscan reproducibility. Hippocampal T2 values were reliably determined using the automated method, with a significant ipsilateral increase in T2 values in HS (p < 0.05).
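    For a dual-echo acquisition, the monoexponential fit has a closed form per voxel, T2 = (TE2 - TE1) / ln(S1/S2); the fragment below (Python) applies it with invented echo times and file names, mirroring the CSF exclusion described above.

      import numpy as np

      te1, te2 = 30.0, 120.0                   # echo times in ms (assumed values)
      s1 = np.load("echo1.npy").astype(float)  # signal at TE1
      s2 = np.load("echo2.npy").astype(float)  # signal at TE2

      with np.errstate(divide="ignore", invalid="ignore"):
          t2_map = (te2 - te1) / np.log(s1 / s2)   # ms, voxelwise

      # Mirror the paper's CSF exclusion: discard implausible or overly long T2 values
      t2_map[(t2_map <= 0) | (t2_map > 170.0)] = np.nan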

  20. Automated pulmonary lobar ventilation measurements using volume-matched thoracic CT and MRI

    Science.gov (United States)

    Guo, F.; Svenningsen, S.; Bluemke, E.; Rajchl, M.; Yuan, J.; Fenster, A.; Parraga, G.

    2015-03-01

    Objectives: To develop and evaluate an automated registration and segmentation pipeline for regional lobar pulmonary structure-function measurements, using volume-matched thoracic CT and MRI in order to guide therapy. Methods: Ten subjects underwent pulmonary function tests and volume-matched 1H and 3He MRI and thoracic CT during a single 2-hr visit. CT was registered to 1H MRI using an affine method that incorporated block-matching, followed by a deformable step using free-form deformation. The resultant deformation field was used to deform the associated CT lobe mask, which was generated using commercial software. 3He-1H image registration used the same two-step registration method, and 3He ventilation was segmented using hierarchical k-means clustering. Whole-lung and lobar 3He ventilation and ventilation defect percent (VDP) were generated by mapping ventilation defects to CT-defined whole-lung and lobe volumes. Target CT-3He registration accuracy was evaluated using region-, surface-distance- and volume-based metrics. Automated whole-lung and lobar VDP was compared with semi-automated and manual results using paired t-tests. Results: The proposed pipeline yielded regional spatial agreement of 88.0+/-0.9% and a surface distance error of 3.9+/-0.5 mm. Automated and manual whole-lung and lobar ventilation and VDP were not significantly different and were significantly correlated (r = 0.77, p < 0.05), yielding pulmonary structural-functional maps with high accuracy and robustness and providing an important tool for image-guided pulmonary interventions.

  1. Whole-brain activity mapping onto a zebrafish brain atlas

    Science.gov (United States)

    Randlett, Owen; Wee, Caroline L.; Naumann, Eva A.; Nnaemeka, Onyeka; Schoppik, David; Fitzgerald, James E.; Portugues, Ruben; Lacoste, Alix M.B.; Riegler, Clemens; Engert, Florian; Schier, Alexander F.

    2015-01-01

    In order to localize the neural circuits involved in generating behaviors, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable open source atlas containing molecular labels and anatomical region definitions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated-Extracellular signal-regulated kinase (ERK/MAPK) as a readout of neural activity, we have developed a system to create and contextualize whole brain maps of stimulus- and behavior-dependent neural activity. This MAP-Mapping (Mitogen Activated Protein kinase – Mapping) assay is technically simple, fast, inexpensive, and data analysis is completely automated. Since MAP-Mapping is performed on fish that are freely swimming, it is applicable to nearly any stimulus or behavior. We demonstrate the utility of our high-throughput approach using hunting/feeding, pharmacological, visual and noxious stimuli. The resultant maps outline hundreds of areas associated with behaviors. PMID:26778924

  2. Automated high resolution mapping of coffee in Rwanda using an expert Bayesian network

    Science.gov (United States)

    Mukashema, A.; Veldkamp, A.; Vrieling, A.

    2014-12-01

    African highland agro-ecosystems are dominated by small-scale agricultural fields that often contain a mix of annual and perennial crops. This makes such systems difficult to map by remote sensing. We developed an expert Bayesian network model to extract the small-scale coffee fields of Rwanda from very high resolution data. The model was subsequently applied to aerial orthophotos covering more than 99% of Rwanda and to one QuickBird image for the remaining part. The method consists of a stepwise adjustment of pixel probabilities, which incorporates expert knowledge on the size of coffee trees and fields, and on their location. The initial naive Bayesian network, which is a spectral-based classification, yielded a coffee map with an overall accuracy of around 50%. This confirms that standard spectral variables alone cannot accurately identify coffee fields in high resolution images. The combination of spectral and ancillary data (a DEM and a forest map) allowed mapping of coffee fields and associated uncertainties with an overall accuracy of 87%. Aggregated to district units, the mapped coffee areas demonstrated a high correlation with the coffee areas reported in the detailed national coffee census of 2009 (R2 = 0.92). Unlike the census data, our map provides high spatial resolution detail on coffee area patterns in Rwanda. The proposed method has potential for mapping other perennial small-scale cropping systems in the East African Highlands and elsewhere.
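    The stepwise adjustment of pixel probabilities can be illustrated with a toy Bayesian update, as below (Python); the input arrays, evidence layers, and likelihood ratios are invented stand-ins for the expert-elicited values in the paper.

      import numpy as np

      p_coffee = np.load("spectral_prob.npy")           # initial P(coffee | spectra), in [0, 1]
      steep = np.load("slope_pct.npy") > 55             # ancillary evidence: very steep terrain
      forest = np.load("forest_mask.npy").astype(bool)  # ancillary evidence: forest cover

      def bayes_update(prior, likelihood_ratio):
          # Convert to odds, apply the evidence's likelihood ratio, convert back
          odds = prior / (1.0 - prior + 1e-9) * likelihood_ratio
          return odds / (1.0 + odds)

      # Coffee is rarely planted on very steep slopes or under forest (assumed ratios)
      p_coffee = np.where(steep, bayes_update(p_coffee, 0.2), p_coffee)
      p_coffee = np.where(forest, bayes_update(p_coffee, 0.05), p_coffee)

      coffee_mask = p_coffee > 0.5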

  3. Automatically Generated Vegetation Density Maps with LiDAR Survey for Orienteering Purpose

    Science.gov (United States)

    Petrovič, Dušan

    2018-05-01

    The focus of our research was to automatically generate the most adequate vegetation density maps for orienteering purposes. The application Karttapullautin, which requires LiDAR data as input, was used to generate vegetation density maps automatically. A part of the orienteering map of the Kazlje-Tomaj area was used to compare the graphical display of vegetation density. With different settings of the parameters in the Karttapullautin application we changed how the vegetation density of the automatically generated map was presented, and tried to match it as closely as possible with the orienteering map of Kazlje-Tomaj. By comparing the maps of vegetation density thus created, the most suitable parameter settings for automatically generating maps of other areas were proposed as well.

  4. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false understanding of the system as infallible may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  5. Automated identification of crystallographic ligands using sparse-density representations

    International Nuclear Information System (INIS)

    Carolan, C. G.; Lamzin, V. S.

    2014-01-01

    A novel procedure for identifying ligands in macromolecular crystallographic electron-density maps is introduced. Density clusters in such maps can be rapidly attributed to one of 82 different ligands in an automated manner. A novel procedure for the automatic identification of ligands in macromolecular crystallographic electron-density maps is introduced. It is based on the sparse parameterization of density clusters and the matching of the pseudo-atomic grids thus created to conformationally variant ligands using mathematical descriptors of molecular shape, size and topology. In large-scale tests on experimental data derived from the Protein Data Bank, the procedure could quickly identify the deposited ligand within the top-ranked compounds from a database of candidates. This indicates the suitability of the method for the identification of binding entities in fragment-based drug screening and in model completion in macromolecular structure determination

  6. Characterizing Grain-Oriented Silicon Steel Sheet Using Automated High-Resolution Laue X-ray Diffraction

    Science.gov (United States)

    Lynch, Peter; Barnett, Matthew; Stevenson, Andrew; Hutchinson, Bevis

    2017-11-01

    Controlling texture in grain-oriented (GO) silicon steel sheet is critical for optimization of its magnetization performance. A new automated laboratory system, based on X-ray Laue diffraction, is introduced as a rapid method for large scale grain orientation mapping and texture measurement in these materials. Wide area grain orientation maps are demonstrated for both macroetched and coated GO steel sheets. The large secondary grains contain uniform lattice rotations, the origins of which are discussed.

  7. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela Iorio

    2013-12-01

    Full Text Available White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map, and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t tests, and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2) and were highly correlated (R² = 0.921, F(1,29) = 155.54, p < 0.001).

  8. Acoustic mapping of shallow water gas releases using shipborne multibeam systems

    Science.gov (United States)

    Urban, Peter; Köser, Kevin; Weiß, Tim; Greinert, Jens

    2015-04-01

    Water column imaging (WCI) shipborne multibeam systems are effective tools for investigating marine free gas (bubble) release. Like single- and splitbeam systems they are very sensitive to gas bubbles in the water column, and they have the advantage of a wide swath opening angle of 120° or more, allowing better mapping and possible 3D investigation of targets in the water column. On the downside, WCI data are degraded by specific noise from side-lobe effects and are usually not calibrated for target backscattering strength analysis. Most approaches so far have concentrated on manual investigation of bubbles in the water column data. Such investigations allow the detection of bubble streams (flares) and make it possible to get an impression of the strength of detected flares and of the gas release. Because of the subjective character of these investigations, it is difficult to assess how well an area has been covered by a flare mapping survey, and subjective impressions of flare strength can easily be fooled by the many acoustic effects multibeam systems create. Here we present a semi-automated approach that uses the behavior of bubble streams in varying water currents to detect and map their exact source positions. The focus of the method is the application of objective rules for flare detection, which makes it possible to extract information about the quality of the seepage mapping survey, perform automated noise reduction, and create acoustic maps with quality discriminators indicating how well an area has been mapped.

  9. Regional Landslide Mapping Aided by Automated Classification of SqueeSAR™ Time Series (Northern Apennines, Italy)

    Science.gov (United States)

    Iannacone, J.; Berti, M.; Allievi, J.; Del Conte, S.; Corsini, A.

    2013-12-01

    Spaceborne InSAR has proven to be very valuable for landslide detection. In particular, extremely slow landslides (Cruden and Varnes, 1996) can now be clearly identified, thanks to the millimetric precision reached by recent multi-interferometric algorithms. The typical approach in radar interpretation for landslide mapping is based on the average annual velocity of the deformation, calculated over the entire time series. The Hotspot and Cluster Analysis (Lu et al., 2012) and the PSI-based matrix approach (Cigna et al., 2013) are examples of landslide mapping techniques based on average annual velocities. However, slope movements can be affected by non-linear deformation trends (i.e. reactivation of dormant landslides, deceleration due to natural or man-made slope stabilization, seasonal activity, etc.). Therefore, analyzing deformation time series is crucial in order to fully characterize slope dynamics. While this is relatively simple to carry out manually when dealing with small datasets, time series analysis over regional-scale datasets requires automated classification procedures. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The analysis classifies the time series into six distinctive target trends (0 = uncorrelated; 1 = linear; 2 = quadratic; 3 = bilinear; 4 = discontinuous without constant velocity; 5 = discontinuous with change in velocity) which are likely to represent different slope processes. The analysis also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. All the classification algorithms were integrated into a Graphical User Interface called PSTime. We investigated an area of about 2000 km2 in the Northern Apennines of Italy by using the SqueeSAR™ algorithm (Ferretti et al., 2011). Two Radarsat-1 data stacks, comprising 112 scenes in descending orbit and 124 scenes in ascending orbit
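    A compact stand-in for this kind of trend classification is sketched below (Python): competing polynomial trend models are fitted to one displacement series and compared by BIC. PSTime itself uses a sequence of statistical tests and also covers the discontinuous classes; the series here is synthetic.

      import numpy as np

      t = np.arange(100, dtype=float)                     # acquisition index (synthetic)
      y = 0.4 * t + np.random.normal(0.0, 2.0, t.size)    # synthetic linear motion + noise

      def bic(y, y_hat, k):
          n = y.size
          rss = np.sum((y - y_hat) ** 2)
          return n * np.log(rss / n) + k * np.log(n)

      models = {
          "uncorrelated": (np.full_like(y, y.mean()), 1),
          "linear": (np.polyval(np.polyfit(t, y, 1), t), 2),
          "quadratic": (np.polyval(np.polyfit(t, y, 2), t), 3),
      }
      best = min(models, key=lambda name: bic(y, *models[name]))
      print("trend class:", best)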

  10. Draw-in Map - A Road Map for Simulation-Guided Die Tryout and Stamping Process Control

    International Nuclear Information System (INIS)

    Wang Chuantao; Zhang, Jimmy J.; Goan, Norman

    2005-01-01

    Sheet metal forming is a displacement- or draw-in-controlled manufacturing process in which a flat blank is drawn into a die cavity to form an automotive body panel. Draw-in amount is the single most important stamping manufacturing index: it controls all forming characteristics (strains, stresses, thinning, etc.), stamping failures (splits, wrinkles, surface distortion, etc.) and line die operations and automation. The Draw-in Map is engineered for math-based die development via advanced stamping simulation technology. The Draw-in Map is then provided to die makers in plants as a road map for math-guided die tryout, in which tryout workers follow the engineered tryout conditions and match the engineered draw-in amount, so that tryout time and cost are greatly reduced and quality is ensured. The Map can also be used as a math-based trouble-shooting tool to identify the causes of formability problems in stamping production. The engineered Draw-in Map has been applied to all draw die tryouts for GM vehicle programs since 1998. A minimum 50% reduction in both lead time and cost and significant improvement in panel quality during tryout have been reported. This paper presents the concept and the process of applying the engineered Draw-in Map in die tryout.

  11. Brain-wide mapping of axonal connections: workflow for automated detection and spatial analysis of labeling in microscopic sections

    Directory of Open Access Journals (Sweden)

    Eszter Agnes ePapp

    2016-04-01

    Full Text Available Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data.
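    As a rough illustration of the detection step, automatic thresholding of section images into binary labeling maps might look like the following sketch; the Otsu threshold, speck-size filter, and file names are assumptions, since the workflow's actual parameters are derived from representative images.

```python
# Minimal sketch of automated labeling detection in section images.
from skimage import io, filters, morphology

def binary_label_map(path, min_object_px=20):
    img = io.imread(path, as_gray=True)
    # Global threshold tuned automatically; real labeling detection
    # adds colour-channel logic and artifact filtering.
    mask = img > filters.threshold_otsu(img)
    # Remove specks unlikely to be genuine axonal labeling.
    return morphology.remove_small_objects(mask, min_size=min_object_px)

# Hypothetical usage over a series of section images:
# maps = [binary_label_map(p) for p in ["sec_001.png", "sec_002.png"]]
```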

  12. Object-based analysis of multispectral airborne laser scanner data for land cover classification and map updating

    Science.gov (United States)

    Matikainen, Leena; Karila, Kirsi; Hyyppä, Juha; Litkey, Paula; Puttonen, Eetu; Ahokas, Eero

    2017-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with passive multispectral information from aerial images, has shown its high feasibility for automated mapping processes. The main benefits have been achieved in the mapping of elevated objects such as buildings and trees. Recently, the first multispectral airborne laser scanners have been launched, and active multispectral information is for the first time available for 3D ALS point clouds from a single sensor. This article discusses the potential of this new technology in map updating, especially in automated object-based land cover classification and change detection in a suburban area. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from an object-based random forests analysis suggest that the multispectral ALS data are very useful for land cover classification, considering both elevated classes and ground-level classes. The overall accuracy of the land cover classification results with six classes was 96% compared with validation points. The classes under study included building, tree, asphalt, gravel, rocky area and low vegetation. Compared to classification of single-channel data, the main improvements were achieved for ground-level classes. According to feature importance analyses, multispectral intensity features based on several channels were more useful than those based on one channel. Automatic change detection for buildings and roads was also demonstrated by utilising the new multispectral ALS data in combination with old map vectors. In change detection of buildings, an old digital surface model (DSM) based on single-channel ALS data was also used. Overall, our analyses suggest that the new data have high potential for further increasing the automation level in mapping. Unlike passive aerial imaging commonly used in mapping, the multispectral ALS technology is independent of external illumination conditions, and there are
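    A sketch of the kind of object-based random forest classification described, with mock segment features; the feature set (per-channel intensity statistics plus height) and the hyperparameters are assumptions.

```python
# Illustrative random forest land cover classification on segment features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

CLASSES = ["building", "tree", "asphalt", "gravel", "rocky area", "low vegetation"]

X = np.random.rand(1000, 8)        # e.g. mean/std intensity per channel + nDSM height
y = np.random.randint(0, 6, 1000)  # placeholder class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("accuracy:", rf.score(X_te, y_te))
print("feature importances:", rf.feature_importances_)
```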

  13. Whole-brain activity mapping onto a zebrafish brain atlas.

    Science.gov (United States)

    Randlett, Owen; Wee, Caroline L; Naumann, Eva A; Nnaemeka, Onyeka; Schoppik, David; Fitzgerald, James E; Portugues, Ruben; Lacoste, Alix M B; Riegler, Clemens; Engert, Florian; Schier, Alexander F

    2015-11-01

    In order to localize the neural circuits involved in generating behaviors, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable open-source atlas containing molecular labels and definitions of anatomical regions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated extracellular signal–regulated kinase (ERK) as a readout of neural activity, we have developed a system to create and contextualize whole-brain maps of stimulus- and behavior-dependent neural activity. This mitogen-activated protein kinase (MAP)-mapping assay is technically simple, and data analysis is completely automated. Because MAP-mapping is performed on freely swimming fish, it is applicable to studies of nearly any stimulus or behavior. Here we demonstrate our high-throughput approach using pharmacological, visual and noxious stimuli, as well as hunting and feeding. The resultant maps outline hundreds of areas associated with behaviors.

  14. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  15. Automated Seat Cushion for Pressure Ulcer Prevention Using Real-Time Mapping, Offloading, and Redistribution of Interface Pressure

    Science.gov (United States)

    2016-10-01

    MATLAB image processing routines were used to identify, sort, and track the locations of the intended marker points (Fig. 12(c)). Automation and control testing was completed on a 5x5 array of bubble actuators to verify pressure modulation, alongside finite element simulation. Subject terms: pressure ulcer prevention, automated seat cushion, bubble actuator, pressure modulation, pressure offloading, wheelchair cushion, spinal cord injury.

  16. Identification and Mapping of Simple Sequence Repeat Markers from Common Bean (Phaseolus vulgaris L.) Bacterial Artificial Chromosome End Sequences for Genome Characterization and Genetic–Physical Map Integration

    Directory of Open Access Journals (Sweden)

    Juana M. Córdoba

    2010-11-01

    Full Text Available Microsatellite markers or simple sequence repeat (SSR) loci are useful for diversity characterization and genetic–physical mapping. Different in silico microsatellite search methods have been developed for mining bacterial artificial chromosome (BAC) end sequences for SSRs. The overall goal of this study was genome characterization based on SSRs in 89,017 BAC end sequences (BESs) from the G19833 common bean (Phaseolus vulgaris L.) library. Another objective was to identify new SSRs using three tandem motif identification programs (Automated Microsatellite Marker Development [AMMD], Tandem Repeats Finder [TRF], and SSRLocator [SSRL]). Among the microsatellite search engines, SSRL identified the highest number of SSRs; however, when primer design was attempted, the number dropped due to poor primer design regions. Automated Microsatellite Marker Development software identified many SSRs with valuable AT/TA or AG/TC motifs, while TRF found fewer SSRs and produced no primers. A subgroup of 323 AT-rich, di-, and trinucleotide SSRs were selected from the AMMD results and used in a parental survey with DOR364 and G19833, of which 75 could be mapped in the corresponding population; these represented 4052 BAC clones. Together with 92 previously mapped BES-derived and 114 non-BES-derived markers, a total of 280 SSRs were included in the polymerase chain reaction (PCR)-based map, integrating a total of 8232 BAC clones in 162 contigs from the physical map.

  17. Innovative technology summary report: Mobile automated characterization system

    International Nuclear Information System (INIS)

    1999-01-01

    The key results of the Mobile Automated Characterization System (MACS) demonstration are that MACS' greatest application would be in large open areas that would need to be surveyed repeatedly. In addition, the color graphic capability of the MACS to illustrate contamination locations is one of the system's greatest assets. It is easier to visually identify contaminated areas by looking at the color maps than by scanning through pages of coordinate survey data. The color map provides all of the data obtained on one page for easy reference. A technician is less likely to miss something by utilizing the color map. MACS is very suitable for repetitive Surveillance and Maintenance applications. MACS does not require a full time operator. It can be pre-programmed to conduct surveys. Based on the demonstration, MACS will have to improve its reliability to take full advantage of its capabilities. Downtime was experienced during the demonstration due to numerous survey and hardware errors

  18. Cerebral perfusion and automated individual analysis using SPECT among an obsessive-compulsive population

    Directory of Open Access Journals (Sweden)

    Euclides Timóteo da Rocha

    2011-01-01

    Full Text Available OBJECTIVE: To make individual assessments using automated quantification methodology in order to screen for perfusion abnormalities in cerebral SPECT examinations among a sample of subjects with OCD. METHODS: Statistical parametric mapping (SPM was used to compare 26 brain SPECT images from patients with OCD individually with an image bank of 32 normal subjects, using the statistical threshold of p < 0.05 (corrected for multiple comparisons at the level of individual voxels or clusters. The maps were analyzed, and regions presenting voxels that remained above this threshold were sought. RESULTS: Six patients from a sample of 26 OCD images showed abnormalities at cluster or voxel level, considering the criteria described above, which represented 23.07%. However, seven images from the normal group of 32 were also indicated as cases of perfusional abnormality, representing 21.8% of the sample. CONCLUSION: The automated quantification method was not considered to be a useful tool for clinical practice, for analyses complementary to visual inspection.

  19. Spatiotemporal High-Resolution Cloud Mapping with a Ground-Based IR Scanner

    NARCIS (Netherlands)

    Brede, Benjamin; Thies, Boris; Bendix, Jörg; Feister, Uwe

    2017-01-01

    The high spatiotemporal variability of clouds requires automated monitoring systems. This study presents a retrieval algorithm that evaluates observations of a hemispherically scanning thermal infrared radiometer, the NubiScope, to produce georeferenced, spatially explicit cloud maps. The algorithm

  20. COMPARING IMAGE-BASED METHODS FOR ASSESSING VISUAL CLUTTER IN GENERALIZED MAPS

    Directory of Open Access Journals (Sweden)

    G. Touya

    2015-08-01

    Full Text Available Map generalization abstracts and simplifies geographic information to derive maps at smaller scales. The automation of map generalization requires techniques to evaluate the global quality of a generalized map. The quality and legibility of a generalized map are related to the complexity of the map, or the amount of clutter in the map, i.e. the excessive amount of information and its disorganization. Computer vision research is highly interested in measuring clutter in images, and this paper proposes to compare some of the existing techniques from computer vision, applied to generalized map evaluation. Four techniques from the literature are described and tested on a large set of maps generalized at different scales: edge density, subband entropy, quadtree complexity, and segmentation clutter. The results are analyzed against several criteria related to generalized maps: the identification of cluttered areas, the preservation of the global amount of information, the handling of occlusions and overlaps, foreground vs. background, and blank space reduction.
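    Of the four measures, edge density is the simplest to illustrate: the share of pixels an edge detector flags in the rendered map image. The Canny detector and its sigma below are assumptions, not necessarily the paper's exact operator.

```python
# Edge density as a visual clutter metric for a rendered map image.
from skimage import io, feature, color

def edge_density(map_image_path, sigma=1.0):
    gray = color.rgb2gray(io.imread(map_image_path))
    edges = feature.canny(gray, sigma=sigma)
    return edges.mean()  # proportion of edge pixels; higher = more cluttered
```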

  1. US Topo Maps 2014: Program updates and research

    Science.gov (United States)

    Fishburn, Kristin A.

    2014-01-01

    The U. S. Geological Survey (USGS) US Topo map program is now in year two of its second three-year update cycle. Since the program was launched in 2009, the product and the production system tools and processes have undergone enhancements that have made the US Topo maps a popular success story. Research and development continues with structural and content product enhancements, streamlined and more fully automated workflows, and the evaluation of a GIS-friendly US Topo GIS Packet. In addition, change detection methodologies are under evaluation to further streamline product maintenance and minimize resource expenditures for production in the future. The US Topo map program will continue to evolve in the years to come, providing traditional map users and Geographic Information System (GIS) analysts alike with a convenient, freely available product incorporating nationally consistent data that are quality assured to high standards.

  2. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book describes how to build an automation plan and design automation facilities. It covers automation of cutting and chip processes (basics of cutting, NC machining, and chip handling); automation units such as drilling, tapping, boring, milling and slide units; hydraulics, including characteristics and basic hydraulic circuits; pneumatics; and the kinds and applications of automation in processing, assembly, transportation, automatic machines and factory automation.

  3. Application of FLIA to the evaluation of newly incorporated control panel. 2. Determination of balance of manipulated/automated phase by cluster analysis

    International Nuclear Information System (INIS)

    Sano, Norihide; Takahashi, Ryoichi.

    1996-01-01

    Human reliability in complex systems has been studied to establish safety systems by analyzing operator performance in the control room of a nuclear power plant. In this paper, results of a mathematical model and a questionnaire given to plant designers and operators led to the proposal of a fuzzy tool for evaluating the quality of recent automated control systems. The first report described a method capable of calculating human performance by summing the weighted utility of attributes. The modified fuzzy measures learning identification algorithm (FLIA) reduces a set of attributes until human tasks are represented clearly. A change in performance is illustrated on a two-dimensional map of the dominant attributes as a function of the automation level. The designers and the operators determined the balance of the manipulated/automated phase on the map after careful individual interviews. In the present paper, we attempt to interpret the boundary with cluster-analysis theory, applying the squared Euclidean distance and the nearest-neighbor method. The evaluated boundary on the map divides it into the manipulated and automated phases. It is shown that the calculated boundary coincides with the perpendicular bisector between the centers of gravity of the clusters. The analytical boundary agrees precisely with the questionnaire result. (author)

  4. Automated damage test facilities for materials development and production optic quality assurance at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Battersby, C.; Dickson, R.; Jennings, R.; Kimmons, J.; Kozlowski, M. R.; Maricle, S.; Mouser, R.; Runkel, M.; Schwartz, S.; Sheehan, L. M.; Weinzapfel, C.

    1998-01-01

    The Laser Program at LLNL has developed automated facilities for damage testing optics up to 1 meter in diameter. The systems were developed to characterize the statistical distribution of localized damage performance across large-aperture National Ignition Facility optics. Full aperture testing is a key component of the quality assurance program for several of the optical components. The primary damage testing methods used are R:1 mapping and raster scanning. Automation of these test methods was required to meet the optics manufacturing schedule. The automated activities include control and diagnosis of the damage-test laser beam as well as detection and characterization of damage events

  5. Expert-driven semi-automated geomorphological mapping for a mountainous area using a laser DTM

    NARCIS (Netherlands)

    van Asselen, S.; Seijmonsbergen, A.C.

    2006-01-01

    In this paper a semi-automated method is presented to recognize and spatially delineate geomorphological units in mountainous forested ecosystems, using statistical information extracted from a 1-m resolution laser digital elevation dataset. The method was applied to a mountainous area in Austria.

  6. Towards Automated Reasoning on ORM Schemes

    Science.gov (United States)

    Jarrar, Mustafa

    The goal of this article is to formalize Object Role Modeling (ORM) using the DLR description logic. This enables automated reasoning on the formal properties of ORM diagrams, such as detecting constraint contradictions and implications. In addition, the expressive, methodological, and graphical capabilities of ORM make it a good candidate for use as a graphical notation for most description logic languages. In this way, industrial experts who are not IT savvy will still be able to build and view axiomatized theories (such as ontologies, business rules, etc.) without needing to know the logic or reasoning foundations underpinning them. Our formalization in this paper is structured as 29 formalization rules that map all ORM primitives and constraints into DLR, plus 2 exceptions for complex cases. To this end, we illustrate the implementation of our formalization as an extension to DogmaModeler, which automatically maps ORM into DIG and uses Racer as a background reasoning engine to reason about ORM diagrams.

  7. Automated method for measuring the extent of selective logging damage with airborne LiDAR data

    Science.gov (United States)

    Melendy, L.; Hagen, S. C.; Sullivan, F. B.; Pearson, T. R. H.; Walker, S. M.; Ellis, P.; Kustiyo; Sambodo, Ari Katmoko; Roswintiarti, O.; Hanson, M. A.; Klassen, A. W.; Palace, M. W.; Braswell, B. H.; Delgado, G. M.

    2018-05-01

    Selective logging has an impact on the global carbon cycle, as well as on the forest micro-climate, and longer-term changes in erosion, soil and nutrient cycling, and fire susceptibility. Our ability to quantify these impacts is dependent on methods and tools that accurately identify the extent and features of logging activity. LiDAR-based measurements of these features offers significant promise. Here, we present a set of algorithms for automated detection and mapping of critical features associated with logging - roads/decks, skid trails, and gaps - using commercial airborne LiDAR data as input. The automated algorithm was applied to commercial LiDAR data collected over two logging concessions in Kalimantan, Indonesia in 2014. The algorithm results were compared to measurements of the logging features collected in the field soon after logging was complete. The automated algorithm-mapped road/deck and skid trail features match closely with features measured in the field, with agreement levels ranging from 69% to 99% when adjusting for GPS location error. The algorithm performed most poorly with gaps, which, by their nature, are variable due to the unpredictable impact of tree fall versus the linear and regular features directly created by mechanical means. Overall, the automated algorithm performs well and offers significant promise as a generalizable tool useful to efficiently and accurately capture the effects of selective logging, including the potential to distinguish reduced impact logging from conventional logging.
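    Gap detection, the weakest-performing feature class, can be illustrated with a simple canopy-height-model rule; the height and area thresholds below are assumptions, and the published algorithm adds further rules to separate linear features (roads/decks, skid trails) from gaps.

```python
# Toy gap detection from a LiDAR-derived canopy height model (CHM).
import numpy as np
from scipy import ndimage

def detect_gaps(chm, pixel_area_m2, height_thresh_m=5.0, min_gap_m2=25.0):
    low = chm < height_thresh_m                 # low-canopy candidate pixels
    labels, n = ndimage.label(low)              # connected regions
    sizes = ndimage.sum(low, labels, index=np.arange(1, n + 1)) * pixel_area_m2
    keep_ids = np.nonzero(sizes >= min_gap_m2)[0] + 1
    return np.isin(labels, keep_ids)            # boolean mask of gap pixels
```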

  8. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h of symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area from motion-corrected MTT maps and compared them with time-to-peak (TTP) maps generated using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
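    One named building block, gamma-variate fitting of the AIF, can be sketched with a standard least-squares fit; the parameterization, bounds, and synthetic data below are assumptions, and the full pipeline additionally performs BAT estimation, deconvolution, and motion correction.

```python
# Gamma-variate fit to a mock arterial input function (AIF) curve.
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, t0, A, alpha, beta):
    dt = np.clip(t - t0, 0, None)          # zero before bolus arrival
    return A * dt ** alpha * np.exp(-dt / beta)

t = np.arange(0, 60, 1.5)                  # acquisition times (s)
true = gamma_variate(t, 10, 5.0, 2.0, 3.0)
noisy = true + np.random.normal(0, 0.1, t.size)

popt, _ = curve_fit(gamma_variate, t, noisy, p0=(8, 1.0, 1.5, 2.0),
                    bounds=([0, 0, 0.1, 0.1], [30, 100, 10, 20]))
print("fitted (t0, A, alpha, beta):", popt)
```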

  9. Integration of linkage maps for the Amphidiploid Brassica napus and comparative mapping with Arabidopsis and Brassica rapa

    Directory of Open Access Journals (Sweden)

    Delourme Régine

    2011-02-01

    Full Text Available Abstract Background The large number of genetic linkage maps representing Brassica chromosomes constitutes a potential platform for studying crop traits and genome evolution within Brassicaceae. However, the alignment of existing maps remains a major challenge. The integration of these genetic maps will enhance genetic resolution and provide a means to navigate between sequence-tagged loci and contiguous genome sequences as these become available. Results We report the first genome-wide integration of Brassica maps based on an automated pipeline which involved collation of genome-wide genotype data for sequence-tagged markers scored on three extensively used amphidiploid Brassica napus (2n = 38) populations. Representative markers were selected from consolidated maps for each population, and skeleton bin maps were generated. The skeleton maps for the three populations were then combined to generate an integrated map for each LG, comparing two different approaches, one encapsulated in JoinMap and the other in MergeMap. The BnaWAIT_01_2010a integrated genetic map was generated using JoinMap, and includes 5,162 genetic markers mapped onto 2,196 loci, with a total genetic length of 1,792 cM. The map density of one locus every 0.82 cM, corresponding to 515 kbp, increases the locus and marker density within the original maps by at least three-fold. Within the B. napus integrated map we identified 103 conserved collinearity blocks relative to Arabidopsis, including five previously unreported blocks. The BnaWAIT_01_2010a map was used to investigate the integrity and conservation of the order proposed for genome sequence scaffolds generated from the constituent A genome of Brassica rapa. Conclusions Our results provide a comprehensive genetic integration of the B. napus genome from a range of sources, which we anticipate will provide valuable information for rapeseed and Canola research.

  10. Autonomous Driving in the iCity—HD Maps as a Key Challenge of the Automotive Industry

    Directory of Open Access Journals (Sweden)

    Heiko G. Seif

    2016-06-01

    Full Text Available This article provides in-depth insights into the necessary technologies for automated driving in future cities. State of science is reflected from different perspectives such as in-car computing and data management, road side infrastructure, and cloud solutions. Especially the challenges for the application of HD maps as core technology for automated driving are depicted in this article.

  11. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  12. Automated genotyping of dinucleotide repeat markers

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Hoffman, E.P. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Univ. of Pittsburgh, PA (United States)]

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
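    The deconvolution model described, an observed profile equal to the allele bands convolved with a marker-specific stutter pattern, can be illustrated with a toy nonnegative least-squares inversion; the stutter kernel and band positions below are invented for illustration.

```python
# Toy stutter deconvolution: recover allele bands from a convolved profile.
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import nnls

n = 20
stutter = np.array([1.0, 0.45, 0.20, 0.08])    # hypothetical per-band stutter decay
alleles = np.zeros(n)
alleles[[7, 9]] = 1.0                          # two closely spaced alleles

# Convolution as a banded Toeplitz matrix acting on the allele vector.
col = np.r_[stutter, np.zeros(n - len(stutter))]
A = toeplitz(col, np.r_[stutter[0], np.zeros(n - 1)])
observed = A @ alleles + np.random.normal(0, 0.02, n)

recovered, _ = nnls(A, observed)               # nonnegative least squares
print("allele band indices:", np.nonzero(recovered > 0.5)[0])
```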

  13. Generating Clustered Journal Maps : An Automated System for Hierarchical Classification

    NARCIS (Netherlands)

    Leydesdorff, L.; Bornmann, L.; Wagner, C.S.

    2017-01-01

    Journal maps and classifications for 11,359 journals listed in the combined Journal Citation Reports 2015 of the Science and Social Sciences Citation Indexes are provided at https://leydesdorff.github.io/journals/ and http://www.leydesdorff.net/jcr15. A routine using VOSviewer for integrating the

  14. Automating Ontological Annotation with WordNet

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.

  15. SoilGrids1km — Global Soil Information Based on Automated Mapping

    Science.gov (United States)

    Hengl, Tomislav; de Jesus, Jorge Mendes; MacMillan, Robert A.; Batjes, Niels H.; Heuvelink, Gerard B. M.; Ribeiro, Eloi; Samuel-Rosa, Alessandro; Kempen, Bas; Leenaars, Johan G. B.; Walsh, Markus G.; Gonzalez, Maria Ruiperez

    2014-01-01

    Background Soils are widely recognized as a non-renewable natural resource and as biophysical carbon sinks. As such, there is a growing requirement for global soil information. Although several global soil information systems already exist, these tend to suffer from inconsistencies and limited spatial detail. Methodology/Principal Findings We present SoilGrids1km — a global 3D soil information system at 1 km resolution — containing spatial predictions for a selection of soil properties (at six standard depths): soil organic carbon (g kg−1), soil pH, sand, silt and clay fractions (%), bulk density (kg m−3), cation-exchange capacity (cmol+/kg), coarse fragments (%), soil organic carbon stock (t ha−1), depth to bedrock (cm), World Reference Base soil groups, and USDA Soil Taxonomy suborders. Our predictions are based on global spatial prediction models which we fitted, per soil variable, using a compilation of major international soil profile databases (ca. 110,000 soil profiles), and a selection of ca. 75 global environmental covariates representing soil forming factors. Results of regression modeling indicate that the most useful covariates for modeling soils at the global scale are climatic and biomass indices (based on MODIS images), lithology, and taxonomic mapping units derived from conventional soil survey (Harmonized World Soil Database). Prediction accuracies assessed using 5-fold cross-validation were between 23% and 51%. Conclusions/Significance SoilGrids1km provide an initial set of examples of soil spatial data for input into global models at a resolution and consistency not previously available. Some of the main limitations of the current version of SoilGrids1km are: (1) weak relationships between soil properties/classes and explanatory variables due to scale mismatches, (2) difficulty to obtain covariates that capture soil forming factors, (3) low sampling density and spatial clustering of soil profile locations. However, as the Soil

  16. Intro to Google Maps and Google Earth

    Directory of Open Access Journals (Sweden)

    Jim Clifford

    2013-12-01

    Full Text Available Google My Maps and Google Earth provide an easy way to start creating digital maps. With a Google Account you can create and edit personal maps by clicking on My Places. In My Maps you can choose between several different base maps (including satellite, terrain, and standard maps) and add points, lines and polygons. It is also possible to import data from a spreadsheet if you have columns with geographical information (i.e. longitudes and latitudes, or place names). This automates a formerly complex task known as geocoding. Not only is this one of the easiest ways to begin plotting your historical data on a map, but it also has the power of Google's search engine. As you read about unfamiliar places in historical documents, journal articles or books, you can search for them using Google Maps. It is then possible to mark numerous locations and explore how they relate to each other geographically. Your personal maps are saved by Google (in their cloud), meaning you can access them from any computer with an internet connection. You can keep them private or embed them in your website or blog. Finally, you can export your points, lines, and polygons as KML files and open them in Google Earth or Quantum GIS.

  17. Automated peroperative assessment of stents apposition from OCT pullbacks.

    Science.gov (United States)

    Dubuisson, Florian; Péry, Emilie; Ouchchane, Lemlih; Combaret, Nicolas; Kauffmann, Claude; Souteyrand, Géraud; Motreff, Pascal; Sarry, Laurent

    2015-04-01

    This study's aim was to assess stent apposition by automatically analyzing endovascular optical coherence tomography (OCT) sequences. The lumen is detected using threshold, morphological and gradient operators feeding a Dijkstra shortest-path algorithm. Wrong detections, tagged by the user and caused by bifurcations, strut presence, thrombotic lesions or dissections, can be corrected using a morphing algorithm. Struts are also segmented using symmetry and morphological operators. The Euclidean distance between detected struts and the artery wall initializes a complete distance map of the stent, and missing data are interpolated with thin-plate spline functions. Rejection of detected outliers, regularization of parameters by generalized cross-validation, and use of the one-sided cyclic property of the map further optimize accuracy. Several indices computed from the map provide quantitative measures of malapposition. The algorithm was run on four in-vivo OCT sequences covering different cases of incomplete stent apposition. Comparison with manual expert measurements validates the segmentation's accuracy and shows an almost perfect concordance with the automated results. Copyright © 2014 Elsevier Ltd. All rights reserved.
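    The thin-plate spline completion of the strut-to-wall distance map can be sketched with scipy's radial basis function interpolator; the sample measurements, grid, and malapposition threshold below are assumptions, and the paper's outlier rejection, cross-validated regularization, and cyclic handling are omitted.

```python
# Thin-plate spline interpolation of sparse strut-to-wall distances.
import numpy as np
from scipy.interpolate import Rbf

# Hypothetical sparse measurements: (angle_deg, frame_index, distance_mm).
ang = np.array([10, 80, 150, 220, 290, 30, 120, 260])
frm = np.array([0, 0, 0, 0, 0, 5, 5, 5])
dist = np.array([0.05, 0.10, 0.02, 0.30, 0.12, 0.07, 0.04, 0.25])

tps = Rbf(ang, frm, dist, function="thin_plate")

# Dense malapposition map over the unrolled stent surface.
A, F = np.meshgrid(np.arange(0, 360, 5), np.arange(0, 6))
distance_map = tps(A, F)
malapposed_fraction = (distance_map > 0.2).mean()  # e.g. >0.2 mm threshold
```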

  18. Current trends in satellite based emergency mapping - the need for harmonisation

    Science.gov (United States)

    Voigt, Stefan

    2013-04-01

    During the past years, the availability and use of satellite image data to support disaster management and humanitarian relief organisations has increased greatly. Automation and data processing techniques are improving, and the capacity for accessing and processing satellite imagery is getting better globally. More and more global activities, via the internet and through organisations like the United Nations or the International Charter Space and Major Disasters, engage in the topic, while at the same time more and more national or local centres undertake rapid mapping operations and activities. In order to make even more effective use of this very positive increase in capacity (for the operational provision of analysis results, for fast validation of satellite-derived damage assessments, for better cooperation in the joint inter-agency generation of rapid mapping products, and for general scientific use), rapid mapping results need to be better harmonized, if not standardized. In this presentation, experiences from many years of rapid mapping gained by the DLR Center for Satellite Based Crisis Information (ZKI) within the context of national activities, the International Charter Space and Major Disasters, GMES/Copernicus etc. are reported. Furthermore, an overview is given of how automation, quality assurance and optimization can be achieved through standard operating procedures within a rapid mapping workflow. Building on this long-term rapid mapping experience, and on the DLR initiative to set in place an "International Working Group on Satellite Based Emergency Mapping", current trends in rapid mapping are discussed, along with thoughts on how the sharing of rapid mapping information can be optimized by harmonizing analysis results and data structures. Such a harmonization of analysis procedures, nomenclatures and representations of data, as well as metadata, is the basis to better cooperate within

  19. Analysis of car shredder polymer waste with Raman mapping and chemometrics

    Directory of Open Access Journals (Sweden)

    B. Vajna

    2012-02-01

    Full Text Available A novel evaluation method was developed for Raman microscopic quantitative characterization of polymer waste. Car shredder polymer waste was divided into different density fractions by the magnetic density separation (MDS) technique, and each fraction was investigated by Raman mapping, which is capable of detecting components present even at low concentrations. Previously, the only method available for evaluating the mapping results was to assign each pixel to a component visually and to count the number of different polymers on the Raman map. An automated method is proposed here for pixel classification, which helps to detect the different polymers present and enables rapid assignment of each pixel to the appropriate polymer. Six chemometric methods were tested as a basis for the pixel classification, among which multivariate curve resolution-alternating least squares (MCR-ALS) provided the best results. The MCR-ALS based pixel identification method was then used for the quantitative characterization of each waste density fraction, where it was found that the automated method yields accurate results in a very short time, as opposed to the manual pixel counting method, which may take hours of human work per dataset.
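    MCR-ALS itself is not in the standard Python stack, so the sketch below uses sklearn's NMF, another non-negative alternating least squares factorization, as a stand-in to show the pixel-classification step; the mock spectra and component count are assumptions.

```python
# Pixel classification of a Raman map via non-negative factorization.
import numpy as np
from sklearn.decomposition import NMF

n_pixels, n_wavenumbers, n_components = 400, 600, 4
spectra = np.random.rand(n_pixels, n_wavenumbers)   # one spectrum per map pixel

model = NMF(n_components=n_components, init="nndsvd", max_iter=500)
concentrations = model.fit_transform(spectra)       # pixel x component weights
pure_spectra = model.components_                    # component x wavenumber

# Assign each pixel to its dominant component (polymer) and count pixels.
pixel_class = concentrations.argmax(axis=1)
print("pixels per polymer component:", np.bincount(pixel_class, minlength=n_components))
```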

  20. Complex Genetics of Behavior: BXDs in the Automated Home-Cage.

    Science.gov (United States)

    Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B

    2017-01-01

    This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility for gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening we describe: (1) how to assess mouse behavior systematically in a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.

  1. PIXE-quantified AXSIA: Elemental mapping by multivariate spectral analysis

    International Nuclear Information System (INIS)

    Doyle, B.L.; Provencio, P.P.; Kotula, P.G.; Antolak, A.J.; Ryan, C.G.; Campbell, J.L.; Barrett, K.

    2006-01-01

    Automated, nonbiased, multivariate statistical analysis techniques are useful for converting very large amounts of data into a smaller, more manageable number of chemical components (spectra and images) that are needed to describe the measurement. We report the first use of the multivariate spectral analysis program AXSIA (Automated eXpert Spectral Image Analysis) developed at Sandia National Laboratories to quantitatively analyze micro-PIXE data maps. AXSIA implements a multivariate curve resolution technique that reduces the spectral image data sets into a limited number of physically realizable and easily interpretable components (including both spectra and images). We show that the principal component spectra can be further analyzed using conventional PIXE programs to convert the weighting images into quantitative concentration maps. A common elemental data set has been analyzed using three different PIXE analysis codes and the results compared to the cases when each of these codes is used to separately analyze the associated AXSIA principal component spectral data. We find that these comparisons are in good quantitative agreement with each other

  2. An open-source java platform for automated reaction mapping.

    Science.gov (United States)

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  3. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project, designed and implemented to automate a house's electricity and provide a security system that detects the presence of unexpected behavior.

  4. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
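    A percentile-based screen for candidate hot and cold pixels illustrates the starting point of such automation; the thresholds and mock rasters below are assumptions, and the paper's method layers machine-learning filters and search algorithms on top.

```python
# Candidate endmember pixels from NDVI and land surface temperature (LST).
import numpy as np

def candidate_endmembers(ndvi, lst):
    # Cold: well-watered, fully vegetated (high NDVI, low LST).
    cold = (ndvi > np.nanpercentile(ndvi, 95)) & (lst < np.nanpercentile(lst, 5))
    # Hot: dry, bare soil (low NDVI, high LST).
    hot = (ndvi < np.nanpercentile(ndvi, 5)) & (lst > np.nanpercentile(lst, 95))
    return np.argwhere(cold), np.argwhere(hot)

ndvi = np.random.uniform(0.1, 0.9, (100, 100))          # mock rasters
lst = 320 - 30 * ndvi + np.random.normal(0, 2, ndvi.shape)
cold_px, hot_px = candidate_endmembers(ndvi, lst)
```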

  5. Pseudo natural colour aerial imagery for urban and suburban mapping

    DEFF Research Database (Denmark)

    Knudsen, Thomas

    2005-01-01

    Due to their near-infrared data channel, digital airborne four-channel imagers provide a potentially good discrimination between vegetation and human-made materials, which is very useful in automated mapping. Due to their red, green and blue data channels, they also provide natural colour images, which are very useful in traditional (manual) mapping. In this paper, an algorithm is described which provides an approximation to the spectral capabilities of the four-channel imagers by using a colour-infrared aerial photo as input. The algorithm is tailored to urban/suburban surroundings, where the quality of the generated (pseudo) natural colour images is fully acceptable for manual mapping. This brings the combined availability of near-infrared and (pseudo) natural colours within reach for mapping projects based on traditional photogrammetry, which is valuable since traditional analytical cameras
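    The paper's algorithm is not reproduced in this record; as a loose illustration, one crude heuristic for synthesizing a natural-colour image from a CIR photo (channels ordered NIR, red, green) is to reuse the red and green bands and guess a blue band, with invented coefficients.

```python
# Crude pseudo natural colour from a colour-infrared (CIR) image,
# assuming channel order (NIR, red, green) and values scaled to [0, 1].
import numpy as np

def pseudo_natural(cir):
    nir, red, green = cir[..., 0], cir[..., 1], cir[..., 2]
    blue = np.clip(1.2 * green - 0.4 * nir, 0, 1)  # invented linear guess
    return np.dstack([red, green, blue])
```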

  6. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    Science.gov (United States)

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we have implemented this approach in an R package called MareyMap, which includes many functionalities useful for obtaining reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler, user-friendly web service version of MareyMap, called MareyMap Online, which allows users to obtain recombination rates, in a few clicks, either from their own data or from a publicly available database that we provide. When the analysis is done, users are asked whether their curated data can be placed in the database and shared with other users, which we hope will make future meta-analyses of recombination rates across many species easy. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
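    The Marey map idea itself is easy to sketch: fit a smooth curve of genetic position against physical position for mapped markers, then differentiate it to obtain local recombination rates in cM/Mb. The spline choice, smoothing factor, and mock marker data below are assumptions, not MareyMap's implementation.

```python
# Marey map: recombination rate as the slope of genetic vs. physical position.
import numpy as np
from scipy.interpolate import UnivariateSpline

phys_mb = np.sort(np.random.uniform(0, 50, 80))     # marker physical positions (Mb)
gen_cm = np.cumsum(np.random.exponential(1.0, 80))  # marker genetic positions (cM)

# Smoothing spline through the Marey map (x must be strictly increasing).
spline = UnivariateSpline(phys_mb, gen_cm, k=3, s=len(phys_mb))
x = np.linspace(phys_mb.min(), phys_mb.max(), 500)
rec_rate = spline.derivative()(x)                   # local rate in cM/Mb
```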

  7. Towards an EO-based Landslide Web Mapping and Monitoring Service

    Science.gov (United States)

    Hölbling, Daniel; Weinke, Elisabeth; Albrecht, Florian; Eisank, Clemens; Vecchiotti, Filippo; Friedl, Barbara; Kociu, Arben

    2017-04-01

    National and regional authorities and infrastructure maintainers in mountainous regions require accurate knowledge of the location and spatial extent of landslides for hazard and risk management. Information on landslides is often collected by a combination of ground surveying and manual image interpretation following landslide triggering events. However, the high workload and limited time for data acquisition result in a trade-off between completeness, accuracy and detail. Remote sensing data offers great potential for mapping and monitoring landslides in a fast and efficient manner. While facing an increased availability of high-quality Earth Observation (EO) data and new computational methods, there is still a lack in science-policy interaction and in providing innovative tools and methods that can easily be used by stakeholders and users to support their daily work. Taking up this issue, we introduce an innovative and user-oriented EO-based web service for landslide mapping and monitoring. Three central design components of the service are presented: (1) the user requirements definition, (2) the semi-automated image analysis methods implemented in the service, and (3) the web mapping application with its responsive user interface. User requirements were gathered during semi-structured interviews with regional authorities. The potential users were asked if and how they employ remote sensing data for landslide investigation and what their expectations to a landslide web mapping service regarding reliability and usability are. The interviews revealed the capability of our service for landslide documentation and mapping as well as monitoring of selected landslide sites, for example to complete and update landslide inventory maps. In addition, the users see a considerable potential for landslide rapid mapping. The user requirements analysis served as basis for the service concept definition. Optical satellite imagery from different high resolution (HR) and very high

  8. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as its measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the reduction in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the proportion of automation among all work processes or facilities. Such expressions readily capture the inclusion proportion of automation and are expected to indicate the degree to which human performance is enhanced. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and cognitive automation rates were proposed as quantitative measures based on these benefits. To quantify the required human cognitive task loads and thus derive the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  9. Detection of myocardial ischemia by automated, motion-corrected, color-encoded perfusion maps compared with visual analysis of adenosine stress cardiovascular magnetic resonance imaging at 3 T: a pilot study.

    Science.gov (United States)

    Doesch, Christina; Papavassiliu, Theano; Michaely, Henrik J; Attenberger, Ulrike I; Glielmi, Christopher; Süselbeck, Tim; Fink, Christian; Borggrefe, Martin; Schoenberg, Stefan O

    2013-09-01

    The purpose of this study was to compare automated, motion-corrected, color-encoded (AMC) perfusion maps with qualitative visual analysis of adenosine stress cardiovascular magnetic resonance imaging for detection of flow-limiting stenoses. Myocardial perfusion measurements applying the standard adenosine stress imaging protocol and a saturation-recovery temporal generalized autocalibrating partially parallel acquisition (t-GRAPPA) turbo fast low angle shot (Turbo FLASH) magnetic resonance imaging sequence were performed in 25 patients using a 3.0-T MAGNETOM Skyra (Siemens Healthcare Sector, Erlangen, Germany). Perfusion studies were analyzed using AMC perfusion maps and qualitative visual analysis. Angiographically detected coronary artery (CA) stenoses greater than 75% or 50% or more with a myocardial perfusion reserve index less than 1.5 were considered as hemodynamically relevant. Diagnostic performance and time requirement for both methods were compared. Interobserver and intraobserver reliability were also assessed. A total of 29 CA stenoses were included in the analysis. Sensitivity, specificity, positive predictive value, negative predictive value, and accuracy for detection of ischemia on a per-patient basis were comparable using the AMC perfusion maps compared to visual analysis. On a per-CA territory basis, the attribution of an ischemia to the respective vessel was facilitated using the AMC perfusion maps. Interobserver and intraobserver reliability were better for the AMC perfusion maps (concordance correlation coefficient, 0.94 and 0.93, respectively) compared to visual analysis (concordance correlation coefficient, 0.73 and 0.79, respectively). In addition, in comparison to visual analysis, the AMC perfusion maps were able to significantly reduce analysis time from 7.7 (3.1) to 3.2 (1.9) minutes (P < 0.0001). The AMC perfusion maps yielded a diagnostic performance on a per-patient and on a per-CA territory basis comparable with the visual analysis

  10. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), that can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI, 65.4-95.0), a specificity of 87.5% (95% CI, 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI, 73.8-93.8) for detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported for human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer between a user-independent, MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.
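
    For readers who want to reproduce the arithmetic, the sketch below derives the reported metrics from a 2x2 confusion matrix; the counts are hypothetical but chosen so that a 45-patient table reproduces the quoted 85.7% / 87.5% / 86.7% figures:

    ```python
    # Illustrative sketch: diagnostic metrics from a 2x2 confusion matrix.
    # The counts are invented for the example, not taken from the study.
    def diagnostic_metrics(tp, fp, tn, fn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + tn + fn),
        }

    # Hypothetical 45-patient table matching 85.7% sensitivity / 87.5% specificity:
    print(diagnostic_metrics(tp=18, fp=3, tn=21, fn=3))
    ```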

  11. Spectral matching techniques (SMTs) and automated cropland classification algorithms (ACCAs) for mapping croplands of Australia using MODIS 250-m time-series (2000–2015) data

    Science.gov (United States)

    Teluguntla, Pardhasaradhi G.; Thenkabail, Prasad S.; Xiong, Jun N.; Gumma, Murali Krishna; Congalton, Russell G.; Oliphant, Adam; Poehnelt, Justin; Yadav, Kamini; Rao, Mahesh N.; Massey, Richard

    2017-01-01

    Mapping croplands, including fallow areas, is an important measure to determine the quantity of food that is produced, where it is produced, and when it is produced (e.g. seasonality). Furthermore, croplands are known as water guzzlers, consuming anywhere between 70% and 90% of all human water use globally. Given these facts and the increase in global population to nearly 10 billion by the year 2050, the need for routine, rapid, and automated cropland mapping year-after-year and/or season-after-season is of great importance. The overarching goal of this study was to generate standard and routine cropland products, year-after-year, over very large areas through the use of two novel methods: (a) quantitative spectral matching techniques (QSMTs) applied at continental level and (b) a rule-based Automated Cropland Classification Algorithm (ACCA) with the ability to hind-cast, now-cast, and future-cast. Australia was chosen for the study given its extensive croplands, rich history of agriculture, and yet nonexistent routine yearly generated cropland products using multi-temporal remote sensing. This research produced three distinct cropland products using Moderate Resolution Imaging Spectroradiometer (MODIS) 250-m normalized difference vegetation index 16-day composite time-series data for 16 years: 2000 through 2015. The products consisted of: (1) cropland extent/areas versus cropland fallow areas, (2) irrigated versus rainfed croplands, and (3) cropping intensities: single, double, and continuous cropping. An accurate reference cropland product (RCP) for the year 2014 (RCP2014) produced using QSMT was used as a knowledge base to train and develop the ACCA algorithm that was then applied to the MODIS time-series data for the years 2000–2015. A comparison between the ACCA-derived cropland products (ACPs) for the year 2014 (ACP2014) versus RCP2014 provided an overall agreement of 89.4% (kappa = 0.814) with six classes: (a) producer’s accuracies varying
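
    A rule-based cropland classifier in the spirit of ACCA might look like the sketch below; the thresholds, peak-counting rule and irrigation proxy are invented placeholders, not the published algorithm (which was trained against the QSMT reference product):

    ```python
    # Sketch of an ACCA-style rule-based classifier on one year of 16-day
    # NDVI composites (23 values per pixel). All thresholds are illustrative.
    import numpy as np

    def classify_pixel(ndvi_series, irrigation_proxy):
        ndvi = np.asarray(ndvi_series, float)
        if ndvi.max() < 0.35:                     # no green-up: fallow or barren
            return "cropland fallow" if ndvi.mean() > 0.2 else "non-cropland"
        # count growing-season peaks as a cropping-intensity proxy
        peaks = sum(1 for i in range(1, len(ndvi) - 1)
                    if ndvi[i] > ndvi[i-1] and ndvi[i] > ndvi[i+1] and ndvi[i] > 0.5)
        if peaks == 0:
            return "non-cropland"
        intensity = {1: "single crop", 2: "double crop"}.get(peaks, "continuous cropping")
        water = "irrigated" if irrigation_proxy > 0.6 else "rainfed"
        return f"{water}, {intensity}"

    season = 0.2 + 0.4 * np.sin(np.linspace(0, np.pi, 23)) ** 2   # one NDVI season
    print(classify_pixel(season, irrigation_proxy=0.7))           # irrigated, single crop
    ```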

  12. Automated fault extraction and classification using 3-D seismic data for the Ekofisk field development

    Energy Technology Data Exchange (ETDEWEB)

    Signer, C.; Nickel, M.; Randen, T.; Saeter, T.; Soenneland, H.H.

    1998-12-31

    Mapping of fractures is important for the prediction of fluid flow in many reservoir types. The fluid flow depends mainly on the efficiency of the reservoir seals. Improved spatial mapping of the open and closed fracture systems will allow a better prediction of the fluid flow pattern. The primary objective of this paper is to present fracture characterization at the reservoir scale combined with seismic facies mapping. The complexity of the giant Ekofisk field on the Norwegian continental shelf provides an ideal framework for testing the validity and the applicability of an automated seismic fault and fracture detection and mapping tool. The mapping of the faults can be based on seismic attribute grids, which means that attribute responses related to faults are extracted along key horizons which were interpreted in the reservoir interval. 3 refs., 3 figs.

  13. Walk and learn: an empirical framework for assessing spatial knowledge acquisition during mobile map use

    OpenAIRE

    Brügger, Annina; Richter, Kai-Florian; Fabrikant, Sara I

    2016-01-01

    We gladly use automated technology (e.g., smart devices) to extend our hard working minds. But what if such technology turns into mind crutches we cannot do without? Understanding how varying levels of automation in mobile maps might impact navigation performance and spatial knowledge acquisition will provide important insights for the ongoing debate on the potentially detrimental effects of using navigation systems on human spatial cognition. We need to identify the right balance between sys...

  14. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. Automated Fresnel lens tester system

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, G.S.

    1981-07-01

    An automated data collection system controlled by a desktop computer has been developed for testing Fresnel concentrators (lenses) intended for solar energy applications. The system maps the two-dimensional irradiance pattern (image) formed in a plane parallel to the lens, while the lens and detector assembly track the sun. A point-detector silicon diode (0.5-mm-dia active area) measures the irradiance at each point of an operator-defined rectilinear grid of data positions. Comparison with a second detector measuring solar insolation levels yields solar concentration ratios over the image plane. Summation of image-plane energies allows calculation of lens efficiencies for various solar cell sizes. Various graphical plots of concentration ratio data help to visualize energy distribution patterns.
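
    The data reduction described above can be sketched in a few lines: divide the gridded point-detector irradiance by the insolation reading to get concentration ratios, then sum image-plane power over a candidate cell footprint. Function names and numbers below are illustrative, not the original desktop-computer code:

    ```python
    # Minimal sketch of the described data reduction; all values invented.
    import numpy as np

    def concentration_map(irradiance_wm2, insolation_wm2):
        return np.asarray(irradiance_wm2) / insolation_wm2

    def lens_efficiency(conc_map, grid_step_m, cell_side_m, lens_area_m2):
        """Fraction of incident power captured by a centered square solar cell."""
        n = conc_map.shape[0]
        half = round(cell_side_m / (2 * grid_step_m))
        c = n // 2
        cell = conc_map[c - half:c + half, c - half:c + half]
        # power on cell / power on lens aperture, both normalized by insolation
        return cell.sum() * grid_step_m**2 / lens_area_m2

    x = np.linspace(-0.02, 0.02, 81)                      # 0.5 mm grid, 4 cm image
    X, Y = np.meshgrid(x, x)
    conc = 400 * np.exp(-(X**2 + Y**2) / (2 * 0.004**2))  # Gaussian focal spot
    print(f"efficiency for 1 cm cell: {lens_efficiency(conc, 5e-4, 0.01, 0.09):.2f}")
    ```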

  16. Automated detection and mapping of crown discolouration caused by jack pine budworm with 2.5 m resolution multispectral imagery

    Science.gov (United States)

    Leckie, Donald G.; Cloney, Ed; Joyce, Steve P.

    2005-05-01

    Jack pine budworm (Choristoneura pinus pinus (Free.)) is a native insect defoliator of mainly jack pine (Pinus banksiana Lamb.) in North America east of the Rocky Mountains. Periodic outbreaks of this insect, which generally last two to three years, can cause growth loss and mortality and have an important impact ecologically and economically in terms of timber production and harvest. The jack pine budworm prefers to feed on current year needles. Their characteristic feeding habits cause discolouration or reddening of the canopy. This red colouration is used to map the distribution and intensity of defoliation that has taken place that year (current defoliation). An accurate and consistent map of the distribution and intensity of budworm defoliation (as represented by the red discolouration) at the stand and within-stand level is desirable. Automated classification of multispectral imagery, such as is available from airborne and new high resolution satellite systems, was explored as a viable tool for objectively classifying current discolouration. Airborne multispectral imagery was acquired at a 2.5 m resolution with the Multispectral Electro-optical Imaging Sensor (MEIS). It recorded imagery in six nadir-looking spectral bands specifically designed to detect discolouration caused by budworm, and a near-infrared band viewing forward at 35° was also used. A 2200 nm middle infrared image was acquired with a Daedalus scanner. Training and test areas of different levels of discolouration were created based on field observations, and a maximum likelihood supervised classification was used to estimate four classes of discolouration (nil-trace, light, moderate and severe). Good discrimination was achieved with an overall accuracy of 84% for the four discolouration levels. The moderate discolouration class was the poorest at 73%, because of confusion with both the severe and light classes. Accuracy on a stand basis was also good, and regional and within stand
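
    The maximum likelihood step can be illustrated as follows: each pixel is assigned the discolouration class whose multivariate Gaussian, fitted from the training areas, gives the highest likelihood. The band statistics here are invented toy values:

    ```python
    # Sketch of maximum likelihood supervised classification; per-class
    # means/covariances in two spectral bands are invented placeholders.
    import numpy as np
    from scipy.stats import multivariate_normal

    classes = {
        "nil-trace": ([0.05, 0.40], np.diag([1e-4, 4e-4])),
        "light":     ([0.08, 0.35], np.diag([1e-4, 4e-4])),
        "moderate":  ([0.12, 0.30], np.diag([2e-4, 4e-4])),
        "severe":    ([0.18, 0.22], np.diag([2e-4, 4e-4])),
    }

    def ml_classify(pixel):
        """Assign the class with the highest Gaussian log-likelihood."""
        return max(classes, key=lambda c: multivariate_normal.logpdf(
            pixel, mean=classes[c][0], cov=classes[c][1]))

    print(ml_classify([0.13, 0.29]))   # -> 'moderate'
    ```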

  17. A computational framework for ultrastructural mapping of neural circuitry.

    Directory of Open Access Journals (Sweden)

    James R Anderson

    2009-03-01

    Full Text Available Circuitry mapping of metazoan neural systems is difficult because canonical neural regions (regions containing one or more copies of all components) are large, regional borders are uncertain, neuronal diversity is high, and potential network topologies so numerous that only anatomical ground truth can resolve them. Complete mapping of a specific network requires synaptic resolution, canonical region coverage, and robust neuronal classification. Though transmission electron microscopy (TEM) remains the optimal tool for network mapping, the process of building large serial section TEM (ssTEM) image volumes is rendered difficult by the need to precisely mosaic distorted image tiles and register distorted mosaics. Moreover, most molecular neuronal class markers are poorly compatible with optimal TEM imaging. Our objective was to build a complete framework for ultrastructural circuitry mapping. This framework combines strong TEM-compliant small molecule profiling with automated image tile mosaicking, automated slice-to-slice image registration, and gigabyte-scale image browsing for volume annotation. Specifically we show how ultrathin molecular profiling datasets and their resultant classification maps can be embedded into ssTEM datasets and how scripted acquisition tools (SerialEM), mosaicking and registration (ir-tools), and large slice viewers (MosaicBuilder, Viking) can be used to manage terabyte-scale volumes. These methods enable large-scale connectivity analyses of new and legacy data. In well-posed tasks (e.g., complete network mapping in retina), terabyte-scale image volumes that previously would require decades of assembly can now be completed in months. Perhaps more importantly, the fusion of molecular profiling, image acquisition by SerialEM, ir-tools volume assembly, and data viewers/annotators also allow ssTEM to be used as a prospective tool for discovery in nonneural systems and a practical screening methodology for neurogenetics. Finally

  18. Automating the SMAP Ground Data System to Support Lights-Out Operations

    Science.gov (United States)

    Sanders, Antonio

    2014-01-01

    The Soil Moisture Active Passive (SMAP) Mission is a first tier mission in NASA's Earth Science Decadal Survey. SMAP will provide a global mapping of soil moisture and its freeze/thaw states. This mapping will be used to enhance the understanding of processes that link the terrestrial water, energy, and carbon cycles, and to enhance weather and forecast capabilities. NASA's Jet Propulsion Laboratory has been selected as the lead center for the development and operation of SMAP. The Jet Propulsion Laboratory (JPL) has an extensive history of successful deep space exploration. JPL missions have typically been large scale Class A missions with significant budget and staffing. SMAP represents a new area of JPL focus towards low cost Earth science missions. Success in this new area requires changes to the way that JPL has traditionally provided the Mission Operations System (MOS)/Ground Data System (GDS) functions. The operation of SMAP requires more routine operations activities and support for higher data rates and data volumes than have been achieved in the past. These activities must be addressed by a reduced operations team and support staff. To meet this challenge, the SMAP ground data system provides automation that will perform unattended operations, including automated commanding of the SMAP spacecraft.

  19. National Seabed Mapping Programmes Collaborate to Advance Marine Geomorphological Mapping in Adjoining European Seas

    Science.gov (United States)

    Monteys, X.; Guinan, J.; Green, S.; Gafeira, J.; Dove, D.; Baeten, N. J.; Thorsnes, T.

    2017-12-01

    Marine geomorphological mapping is an effective means of characterising and understanding the seabed and its features with direct relevance to: offshore infrastructure placement, benthic habitat mapping, conservation & policy, marine spatial planning, fisheries management and pure research. Advancements in acoustic survey techniques and data processing methods, resulting in the availability of high-resolution marine datasets (e.g. multibeam echosounder bathymetry and shallow seismic), mean that geological interpretations can be greatly improved by combining them with geomorphological maps. Since December 2015, representatives from the national seabed mapping programmes of Norway (MAREANO), Ireland (INFOMAR) and the United Kingdom (MAREMAP) have collaborated and established the MIM geomorphology working group, with the common aim of advancing best practice for geological mapping in their adjoining sea areas in north-west Europe. A recently developed two-part classification system for seabed geomorphology ('Morphology' and 'Geomorphology') has been established as a result of an initiative led by the British Geological Survey (BGS) with contributions from the MIM group (Dove et al. 2016). To support the scheme, existing BGS GIS tools (SIGMA) have been adapted to apply this two-part classification system, and here we present the tool's effectiveness in mapping geomorphological features, along with progress in harmonising the classification and feature nomenclature. Recognising that manual mapping of seabed features can be time-consuming and subjective, semi-automated approaches for mapping seabed features and improving mapping efficiency are being developed using ArcGIS-based tools. These methods recognise, spatially delineate and morphologically describe seabed features such as pockmarks (Gafeira et al., 2012) and cold-water coral mounds. Such tools utilise multibeam echosounder data or any other bathymetric dataset (e.g. 3D seismic, Geldof et al., 2014) that can produce a

  20. Semi-automated landform classification for hazard mapping of soil liquefaction by earthquake

    Science.gov (United States)

    Nakano, Takayuki

    2018-05-01

    Soil liquefaction caused severe damage during huge earthquakes in Japan, and similar damage is a concern for future large earthquakes. On the other hand, the preparation of soil liquefaction risk maps (soil liquefaction hazard maps) is impeded by the difficulty of evaluating soil liquefaction risk. In general, relative soil liquefaction risk can be evaluated from landform classification data using empirical rules based on the relationship between the extent of soil liquefaction damage in past earthquakes and landform classification items. Therefore, I rearranged the relationship between landform classification items and soil liquefaction risk in an intelligible form, in order to enable the appropriate and efficient evaluation of soil liquefaction risk from landform classification data. I also developed a new method of generating landform classification data at a 50-m grid size from existing landform classification data at a 250-m grid size, using digital elevation model (DEM) data and multi-band satellite image data, in order to evaluate soil liquefaction risk in finer spatial detail. The products of this study are expected to contribute to the efficient production of soil liquefaction hazard maps by local governments.

  1. A fast, automated, semideterministic weight windows generator for MCNP

    International Nuclear Information System (INIS)

    Mickael, M.W.

    1995-01-01

    A fast automated method is developed to estimate particle importance in the Los Alamos Monte Carlo code MCNP. It provides an automated and efficient way of predicting and setting up an importance map for the weight windows technique. A short analog simulation is first performed to obtain effective group parameters based on the input description of the problem. A solution of the multigroup time-dependent adjoint diffusion equation is then used to estimate particle importance. At any point in space, time, and energy, the particle importance is determined, based on the calculated parameters, and used as the lower limit of the weight window. The method has been tested for neutron, photon, and coupled neutron-photon problems. Significant improvement in the simulation efficiency is obtained using this technique at no additional computer time and with no prior knowledge of the nature of the problem. Moreover, time and angular importance that are not yet available in MCNP are easily implemented in this method
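
    The final step, turning an importance map into weight-window bounds, can be sketched as below; setting the lower bound inversely proportional to importance (normalized to the source region) is the usual convention, though the paper's exact normalization is not reproduced here:

    ```python
    # Hedged sketch: weight-window lower bounds from an importance map,
    # normalized so the source region has weight ~1. Mesh values invented.
    import numpy as np

    def weight_window_lower_bounds(importance, source_importance):
        """w_low(cell) = source_importance / importance(cell)."""
        imp = np.asarray(importance, float)
        return np.where(imp > 0, source_importance / imp, 0.0)  # 0 disables the window

    importance = np.array([1.0, 3.0, 10.0, 40.0])   # adjoint-estimated, source->detector
    print(weight_window_lower_bounds(importance, source_importance=1.0))
    # deeper cells get lower bounds, so particles are split as they gain importance
    ```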

  2. a Model Study of Small-Scale World Map Generalization

    Science.gov (United States)

    Cheng, Y.; Yin, Y.; Li, C. M.; Wu, W.; Guo, P. P.; Ma, X. L.; Hu, F. M.

    2018-04-01

    With globalization and rapid development, every field is taking an increasing interest in physical geography and human economics. There is a surging demand worldwide for small-scale world maps in large formats. Further study of automated mapping technology, especially the realization of small-scale production of large-format global maps, is a key problem that the cartographic field needs to solve. In light of this, this paper adopts an improved model for map generalization in which the map and the data are separated, so that geographic data can be separated from cartographic data; its main components are a cross-platform symbol library and an automatic map-making knowledge engine. In the cross-platform symbol library, the symbols and the physical symbols in the geographic information are configured at all scale levels. The automatic map-making knowledge engine consists of 97 types, 1086 subtypes, 21845 basic algorithms and over 2500 relevant functional modules. In order to evaluate the accuracy and visual effect of our model for topographic and thematic maps, we take small-scale world map generalization as an example. After the generalization process, combining and simplifying the scattered islands makes the map more legible at the 1:2.1 billion scale, and the map features are more complete and accurate. The model not only enhances map generalization at various scales significantly, but also achieves integration among map-making at various scales, suggesting that it provides a reference for cartographic generalization across scales.

  3. Precise Positioning of Uavs - Dealing with Challenging Rtk-Gps Measurement Conditions during Automated Uav Flights

    Science.gov (United States)

    Zimmermann, F.; Eling, C.; Klingbeil, L.; Kuhlmann, H.

    2017-08-01

    For some years now, UAVs (unmanned aerial vehicles) are commonly used for different mobile mapping applications, such as in the fields of surveying, mining or archeology. To improve the efficiency of these applications, an automation of the flight as well as the processing of the collected data is currently aimed at. One precondition for an automated mapping with UAVs is that the georeferencing is performed directly with cm-accuracies or better. Usually, a cm-accurate direct positioning of UAVs is based on an onboard multi-sensor system, which consists of an RTK-capable (real-time kinematic) GPS (global positioning system) receiver and additional sensors (e.g. inertial sensors). In this case, the absolute positioning accuracy essentially depends on the local GPS measurement conditions. Especially during mobile mapping applications in urban areas, these conditions can be very challenging, due to satellite shadowing, non-line-of-sight receptions, signal diffraction or multipath effects. In this paper, two straightforward and easy to implement strategies will be described and analyzed, which improve the direct positioning accuracies for UAV-based mapping and surveying applications under challenging GPS measurement conditions. Based on a 3D model of the surrounding buildings and vegetation in the area of interest, a GPS geometry map is determined, which can be integrated in the flight planning process, to avoid GPS-challenging environments as far as possible. If these challenging environments cannot be avoided, the GPS positioning solution is improved by using obstruction-adaptive elevation masks, to mitigate systematic GPS errors in the RTK-GPS positioning. Simulations and results of field tests demonstrate the profit of both strategies.
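
    The second strategy can be sketched as computing, from the 3D city model, an obstruction-adaptive elevation mask per azimuth; satellites below the mask are then excluded. The geometry, function names and toy facade below are invented for illustration:

    ```python
    # Sketch of an obstruction-adaptive elevation mask: for each azimuth bin,
    # the mask is the highest elevation angle subtended by surrounding
    # building/vegetation points. All coordinates are invented.
    import math

    def elevation_mask(uav_pos, obstacle_points, n_azimuth_bins=360):
        """uav_pos, obstacle_points: (x, y, z) in a local metric frame."""
        mask = [0.0] * n_azimuth_bins          # default open-sky cut-off of 0 deg
        ux, uy, uz = uav_pos
        for ox, oy, oz in obstacle_points:
            dx, dy, dz = ox - ux, oy - uy, oz - uz
            dist = math.hypot(dx, dy)
            if dist < 1e-6:
                continue
            az = int(math.degrees(math.atan2(dy, dx)) % 360) * n_azimuth_bins // 360
            elev = math.degrees(math.atan2(dz, dist))
            mask[az] = max(mask[az], elev)
        return mask

    # One building facade 20 m east, 15 m tall, relative to a UAV at 5 m height:
    facade = [(20.0, y, z) for y in range(-5, 6) for z in range(0, 16)]
    mask = elevation_mask((0.0, 0.0, 5.0), facade)
    print(f"mask toward east: {mask[0]:.1f} deg")   # satellites below this are dropped
    ```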

  4. PRECISE ORTHO IMAGERY AS THE SOURCE FOR AUTHORITATIVE AIRPORT MAPPING

    Directory of Open Access Journals (Sweden)

    H. Howard

    2016-06-01

    Full Text Available As the aviation industry moves from paper maps and charts to the digital cockpit and electronic flight bag, producers of these products need current and accurate data to ensure flight safety. The FAA (Federal Aviation Administration) and ICAO (International Civil Aviation Organization) require certified suppliers to follow a defined protocol to produce authoritative map data for the aerodrome. Typical airport maps have been produced to meet 5 m accuracy requirements. The new digital aviation world is moving to 1 m accuracy maps to provide better situational awareness on the aerodrome. The commercial availability of 0.5 m satellite imagery combined with accurate ground control is enabling the production of avionics-certified 0.85 m orthophotos of airports around the globe. CompassData maintains an archive of over 400+ airports as source data to support producers of 1 m certified Aerodrome Mapping Database (AMDB) products critical to flight safety and automated situational awareness. CompassData is a DO200A certified supplier of authoritative orthoimagery and attendees will learn how to utilize current airport imagery to build digital aviation mapping products.

  5. Precise Ortho Imagery as the Source for Authoritative Airport Mapping

    Science.gov (United States)

    Howard, H.; Hummel, P.

    2016-06-01

    As the aviation industry moves from paper maps and charts to the digital cockpit and electronic flight bag, producers of these products need current and accurate data to ensure flight safety. The FAA (Federal Aviation Administration) and ICAO (International Civil Aviation Organization) require certified suppliers to follow a defined protocol to produce authoritative map data for the aerodrome. Typical airport maps have been produced to meet 5 m accuracy requirements. The new digital aviation world is moving to 1 m accuracy maps to provide better situational awareness on the aerodrome. The commercial availability of 0.5 m satellite imagery combined with accurate ground control is enabling the production of avionics-certified 0.85 m orthophotos of airports around the globe. CompassData maintains an archive of over 400+ airports as source data to support producers of 1 m certified Aerodrome Mapping Database (AMDB) products critical to flight safety and automated situational awareness. CompassData is a DO200A certified supplier of authoritative orthoimagery and attendees will learn how to utilize current airport imagery to build digital aviation mapping products.

  6. Estimated Depth Maps of the Northwestern Hawaiian Islands Derived from High Resolution IKONOS Satellite Imagery (Draft)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Estimated shallow-water, depth maps were produced using rule-based, semi-automated image analysis of high-resolution satellite imagery for nine locations in the...

  7. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  8. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  9. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    Idris, N H; Jackson, M J; Ishak, M H I

    2014-01-01

    The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to help realize the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is by relying on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed to be assessed in the conceptual model – metadata and data. The metadata component comprises the indicators of the hosting websites and the sources of data/information. The data component comprises the indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess the correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.
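
    The proposed metadata assessment relies on supervised text categorization; a minimal scikit-learn sketch (tiny invented training set and labels, not the paper's data) could look like this:

    ```python
    # Illustrative sketch of the metadata assessment step: a supervised text
    # categorizer labeling a hosting website's description as more or less
    # credible. Training texts and labels are toy inventions.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = [
        "national mapping agency open data portal with documented accuracy",
        "peer reviewed geospatial observatory, metadata and lineage published",
        "anonymous blog post, no sources, unverified user submissions",
        "click here free maps download now no registration",
    ]
    train_labels = [1, 1, 0, 0]   # 1 = credible hosting, 0 = not (toy labels)

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(train_texts, train_labels)
    print(model.predict(["volunteer mapping portal with published data lineage"]))
    ```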

  10. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  11. Datafication of Automated (Legal) Decisions - or how (not) to install a GPS when law is not precisely a map

    DEFF Research Database (Denmark)

    Schaumburg-Müller, Sten

    Even though I maintain that it is a misconception to state that states are “no longer” the only actors, since they never were, indeed it makes sense to “shed light on the impact of (…) new tendencies on legal regulatory mechanisms (…)”. One regulatory tendency is obviously the automation of (legal) decisions, which has implications for legal orders, legal actors and legal research, not to mention legal legitimacy as well as personal autonomy and democracy. On the one hand, automation may facilitate better, faster, more predictable and more coherent decisions and leave cumbersome and time consuming (…) situations to closed situations with a vast, but technically manageable amount of fixed data – driving cars may be a good example – yet it may be counterproductive to reduce all situations to categorizable and foreseeable ones. This automation skepticism hinges on various concepts such as ‘tychism’ (Peirce) (…)

  12. SU-F-J-93: Automated Segmentation of High-Resolution 3D WholeBrain Spectroscopic MRI for Glioblastoma Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Schreibmann, E; Shu, H [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, GA (United States); Cordova, J; Gurbani, S; Holder, C; Cooper, L; Shim, H [Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA (United States)

    2016-06-15

    Purpose: We report on an automated segmentation algorithm for defining radiation therapy target volumes using spectroscopic MR images (sMRI) acquired at a nominal voxel resolution of 100 microliters. Methods: Whole-brain sMRI combining 3D echo-planar spectroscopic imaging, generalized auto-calibrating partially-parallel acquisitions, and elliptical k-space encoding was conducted on a 3T MRI scanner with a 32-channel head coil array, creating whole-brain images. Metabolite maps generated include choline (Cho), creatine (Cr), and N-acetylaspartate (NAA), as well as Cho/NAA, Cho/Cr, and NAA/Cr ratio maps. Automated segmentation was achieved by concomitantly considering sMRI metabolite maps with standard contrast enhancing (CE) imaging in a pipeline that first uses the water signal for skull stripping. Subsequently, an initial blob of tumor region is identified by searching for regions of FLAIR abnormalities that also display reduced NAA activity, using a mean ratio correlation and morphological filters. These regions are used as the starting point for a geodesic level-set refinement that adapts the initial blob to the fine details specific to each metabolite. Results: Accuracy of the segmentation model was tested on a cohort of 12 patients that had sMRI datasets acquired pre-, mid- and post-treatment, providing a broad range of enhancement patterns. Compared to classical imaging, where heterogeneity in tumor appearance and shape across patients posed a greater challenge to the algorithm, regions of abnormal activity were easily detected in the sMRI metabolite maps by combining the detail available in the standard imaging with the local enhancement produced by the metabolites. Results can be imported into treatment planning, leading in general to an increase in the target volumes (GTV60) when using sMRI+CE MRI compared to the standard CE MRI alone. Conclusion: Integration of automated segmentation of sMRI metabolite maps into planning is feasible and will likely streamline acceptance of this

  13. Mapping query terms to data and schema using content based similarity search in clinical information systems.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2013-01-01

    This paper reports on the issues in mapping the terms of a query to the field names of the schema of an Entity Relationship (ER) model, or to the data part of the Entity Attribute Value (EAV) model, using a similarity-based Top-K algorithm in a clinical information system, together with an extension of EAV mapping for medication names. In addition, the details of the mapping algorithm and the required pre-processing, including NLP (Natural Language Processing) tasks to prepare resources for mapping, are explained. The experimental results on an example clinical information system demonstrate more than 84 per cent accuracy in mapping. The results will be integrated into our proposed Clinical Data Analytics Language (CliniDAL) to automate the mapping process in CliniDAL.
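
    A similarity-based Top-K mapping can be sketched as below; a character-level ratio stands in for the paper's content-based similarity measure, which is not reproduced here:

    ```python
    # Sketch of Top-K mapping from query terms to schema field names; the
    # schema fields and similarity choice are illustrative stand-ins.
    from difflib import SequenceMatcher

    def top_k_matches(term, field_names, k=3):
        scored = [(SequenceMatcher(None, term.lower(), f.lower()).ratio(), f)
                  for f in field_names]
        return sorted(scored, reverse=True)[:k]

    schema_fields = ["patient_dob", "admission_date", "discharge_summary",
                     "medication_name", "lab_result_value"]
    print(top_k_matches("date of admission", schema_fields))
    ```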

  14. PRECISE POSITIONING OF UAVS – DEALING WITH CHALLENGING RTK-GPS MEASUREMENT CONDITIONS DURING AUTOMATED UAV FLIGHTS

    Directory of Open Access Journals (Sweden)

    F. Zimmermann

    2017-08-01

    Full Text Available For some years now, UAVs (unmanned aerial vehicles) are commonly used for different mobile mapping applications, such as in the fields of surveying, mining or archeology. To improve the efficiency of these applications, an automation of the flight as well as the processing of the collected data is currently aimed at. One precondition for an automated mapping with UAVs is that the georeferencing is performed directly with cm-accuracies or better. Usually, a cm-accurate direct positioning of UAVs is based on an onboard multi-sensor system, which consists of an RTK-capable (real-time kinematic) GPS (global positioning system) receiver and additional sensors (e.g. inertial sensors). In this case, the absolute positioning accuracy essentially depends on the local GPS measurement conditions. Especially during mobile mapping applications in urban areas, these conditions can be very challenging, due to satellite shadowing, non-line-of-sight receptions, signal diffraction or multipath effects. In this paper, two straightforward and easy to implement strategies will be described and analyzed, which improve the direct positioning accuracies for UAV-based mapping and surveying applications under challenging GPS measurement conditions. Based on a 3D model of the surrounding buildings and vegetation in the area of interest, a GPS geometry map is determined, which can be integrated in the flight planning process, to avoid GPS-challenging environments as far as possible. If these challenging environments cannot be avoided, the GPS positioning solution is improved by using obstruction-adaptive elevation masks, to mitigate systematic GPS errors in the RTK-GPS positioning. Simulations and results of field tests demonstrate the profit of both strategies.

  15. Feasibility of a semi-automated method for cardiac conduction velocity analysis of high-resolution activation maps

    NARCIS (Netherlands)

    Doshi, Ashish N.; Walton, Richard D.; Krul, Sébastien P.; de Groot, Joris R.; Bernus, Olivier; Efimov, Igor R.; Boukens, Bastiaan J.; Coronel, Ruben

    2015-01-01

    Myocardial conduction velocity is important for the genesis of arrhythmias. In the normal heart, conduction is primarily dependent on fiber direction (anisotropy) and may be discontinuous at sites with tissue heterogeneities (trabeculated or fibrotic tissue). We present a semi-automated method for

  16. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  17. Automated delineation of stroke lesions using brain CT images

    Directory of Open Access Journals (Sweden)

    Céline R. Gillebert

    2014-01-01

    Full Text Available Computed tomographic (CT) images are widely used for the identification of abnormal brain tissue following infarct and hemorrhage in stroke. Manual lesion delineation is currently the standard approach, but is both time-consuming and operator-dependent. To address these issues, we present a method that can automatically delineate infarct and hemorrhage in stroke CT images. The key elements of this method are the accurate normalization of CT images from stroke patients into template space and the subsequent voxelwise comparison with a group of control CT images for defining areas with hypo- or hyper-intense signals. Our validation, using simulated and actual lesions, shows that our approach is effective in reconstructing lesions resulting from both infarct and hemorrhage and yields lesion maps spatially consistent with those produced manually by expert operators. A limitation is that, relative to manual delineation, there is reduced sensitivity of the automated method in regions close to the ventricles and the brain contours. However, the automated method presents a number of benefits in terms of offering significant time savings and the elimination of the inter-operator differences inherent to manual tracing approaches. These factors are relevant for the creation of large-scale lesion databases for neuropsychological research. The automated delineation of stroke lesions from CT scans may also enable longitudinal studies to quantify changes in damaged tissue in an objective and reproducible manner.
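
    The voxelwise comparison at the heart of the method can be sketched as a z-map against the control group; the volumes below are random stand-ins for spatially normalized CT data, and the threshold is illustrative:

    ```python
    # Minimal sketch of the voxelwise comparison step: each patient voxel is
    # compared with the control-group distribution; voxels beyond a z
    # threshold are flagged as hypo- (infarct-like) or hyper-intense
    # (hemorrhage-like). Data are synthetic.
    import numpy as np

    def delineate(patient, controls, z_thresh=3.0):
        mu = controls.mean(axis=0)
        sd = controls.std(axis=0) + 1e-6          # avoid division by zero
        z = (patient - mu) / sd
        return (z < -z_thresh), (z > z_thresh)     # hypo-, hyper-intense masks

    rng = np.random.default_rng(0)
    controls = rng.normal(40, 3, size=(20, 64, 64, 32))   # 20 control CTs (HU)
    patient = rng.normal(40, 3, size=(64, 64, 32))
    patient[20:30, 20:30, 10:15] = 20                     # simulated infarct
    hypo, hyper = delineate(patient, controls)
    print(f"hypo-intense voxels flagged: {hypo.sum()}")
    ```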

  18. Automating Hyperspectral Data for Rapid Response in Volcanic Emergencies

    Science.gov (United States)

    Davies, Ashley G.; Doubleday, Joshua R.; Chien, Steve A.

    2013-01-01

    In a volcanic emergency, time is of the essence. It is vital to quantify eruption parameters (thermal emission, effusion rate, location of activity) and distribute this information as quickly as possible to decision-makers in order to enable effective evaluation of eruption-related risk and hazard. The goal of this work was to automate and streamline processing of spacecraft hyperspectral data, automate product generation, and automate distribution of products. [Figure: Visible and short-wave infrared images of the volcanic eruption in Iceland in May 2010.] The software rapidly processes hyperspectral data, correcting for incident sunlight where necessary, and atmospheric transmission; detects thermally anomalous pixels; fits data with model black-body thermal emission spectra to determine radiant flux; calculates atmospheric convection thermal removal; and then calculates total heat loss. From these results, an estimation of effusion rate is made. Maps are generated of thermal emission and location (see figure). Products are posted online, and relevant parties notified. Effusion rate data are added to the historical record and plotted to identify spikes in activity for persistently active eruptions. The entire process from start to end is autonomous. Future spacecraft, especially those in deep space, can react to detection of transient processes without the need to communicate with Earth, thus increasing science return. Terrestrially, this removes the need for human intervention.
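
    The thermal data reduction can be sketched as fitting a Planck spectrum to anomalous pixel radiances and integrating with the Stefan-Boltzmann law; the wavelength grid, temperatures and two-parameter model below are synthetic assumptions, not the mission code:

    ```python
    # Hedged sketch: fit a black-body (Planck) spectrum to a thermally
    # anomalous pixel to recover an effective temperature and emitting-area
    # fraction, then integrate for radiant flux. All data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    H, C, KB, SIGMA = 6.626e-34, 2.998e8, 1.381e-23, 5.670e-8

    def planck(wl_m, temp_k, area_frac):
        """Spectral radiance of a black body times the emitting pixel fraction."""
        return area_frac * (2 * H * C**2 / wl_m**5) / (np.exp(H * C / (wl_m * KB * temp_k)) - 1)

    wl = np.linspace(1.0e-6, 2.5e-6, 30)               # SWIR band, metres
    truth = planck(wl, 1350.0, 0.02)                   # lava at 1350 K on 2% of pixel
    (temp, frac), _ = curve_fit(planck, wl, truth, p0=(1000.0, 0.1))
    flux_wm2 = frac * SIGMA * temp**4                  # pixel-averaged radiant flux
    print(f"T = {temp:.0f} K, area fraction = {frac:.3f}, flux = {flux_wm2:.0f} W/m^2")
    ```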

  19. Towards data integration automation for the French rare disease registry.

    Science.gov (United States)

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data of rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors with the existing systems to automatically retrieve data. Given the data heterogeneity and the large number of source systems, the automation of connector creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration processes. The generated mappings are formalized in exploitable mapping expressions. Following this methodology, a process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach used was enhanced and more good mappings were detected. Nonetheless, further improvements could be made to deal with semantic issues and to process other data types.

  20. Generation and communication of dynamic maps using light projection

    Science.gov (United States)

    Busch, Steffen; Schlichting, Alexander; Brenner, Claus

    2018-05-01

    Many accidents are caused by miscommunication between traffic participants. Much research is being conducted in the area of car-to-car and car-to-infrastructure communication in order to eliminate this cause of accidents. However, less attention is paid to the question how the behavior of a car can be communicated to pedestrians. Especially considering automated traffic, there is a lack of communication between cars and pedestrians. In this paper, we address the question how an autonomously driving car can inform pedestrians about its intentions. Especially in case of highly automated driving, making eye contact with a driver will give no clue about his or her intentions. We developed a prototype which continuously informs pedestrians about the intentions of the vehicle by projecting visual patterns onto the ground. Furthermore, the system communicates its interpretation of the observed situation to the pedestrians to warn them or to encourage them to perform a certain action. In order to communicate adaptively, the vehicle needs to develop an understanding of the dynamics of a city to know what to expect in certain situations and what speed is appropriate. To support this, we created a dynamic map, which estimates the number of pedestrians and cyclists in a certain area, and which is then used to determine how 'hazardous' the area is. This dynamic map is obtained from measurement data from many time instances, in contrast to the static car navigation maps which are prevalent today. Apart from being used for communication purposes, the dynamic map can also influence the speed of a car, be it manually or autonomously driven. Adapting the speed in hazardous areas will avoid accidents where a car drives too fast, so that neither a human nor a computer-operated system would be able to stop in time.
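
    A toy version of such a dynamic map might aggregate detections into grid cells and map relative density to an advisory speed; the cell size, hazard scores and speed rules below are invented placeholders:

    ```python
    # Sketch of the dynamic-map idea: pedestrian/cyclist detections from many
    # traversals are aggregated per grid cell, turned into a relative hazard
    # score, and mapped to an advisory speed. All numbers are illustrative.
    from collections import defaultdict

    CELL_M = 25.0

    def cell_of(x_m, y_m):
        return (int(x_m // CELL_M), int(y_m // CELL_M))

    counts = defaultdict(int)
    detections = [(12, 40), (14, 44), (80, 10), (13, 41), (15, 43)]  # (x, y) metres
    for x, y in detections:
        counts[cell_of(x, y)] += 1

    def advisory_speed_kmh(cell):
        hazard = counts[cell] / max(counts.values())   # 0..1 relative hazard
        return 50 if hazard < 0.25 else (30 if hazard < 0.75 else 20)

    print({c: advisory_speed_kmh(c) for c in counts})
    ```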

  1. Generation of Land Cover Maps Using High-Resolution Multispectral Aerial Cameras

    DEFF Research Database (Denmark)

    Höhle, Joachim

    2013-01-01

    … for classification of land cover. A high degree of automation can be achieved. The obtained results of a practical example are checked with reference values derived from ortho-images in natural colour and from colour images using stereo-vision. An error matrix is applied in the evaluation of the results. The classification had an overall accuracy of 79%. Suggestions for improvements in the applied methodology are made. The potential of land cover maps lies in the updating of topographic databases, quality control of maps, studies of town development, and other geo-spatial domain applications.

  2. Placement suitability criteria of composite tape for mould surface in automated tape placement

    Directory of Open Access Journals (Sweden)

    Zhang Peng

    2015-10-01

    Full Text Available Automated tape placement is an important automated process used for the fabrication of large composite structures in the aeronautical industry. The carbon fiber composite parts realized with this process tend to replace the aluminum parts produced by high-speed machining. It is difficult to determine the appropriate width of the composite tape in automated tape placement: wrinkling will appear in the tape if it does not suit the mould surface. Thus, this paper deals with establishing placement suitability criteria for the composite tape on the mould surface. Under the assumption of ideal mapping, and by applying principles and theorems of differential geometry, the centerline trajectory of the composite tape is identified to follow a geodesic. The placement suitability of the composite tape is examined on three different types of non-developable mould surfaces, and four criteria are derived. The developed criteria have been used to test the deposition process over several mould surfaces, and the appropriate width for each mould surface is obtained by referring to these criteria.

  3. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  4. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors, thereby enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to the side effects of automation, referred to as Out-of-the-Loop (OOTL) problems, and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation introduction that assures the best human operator performance, a quantitative method of optimizing the automation is proposed in this paper. In order to propose the optimization method for determining appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration was conducted to derive the shortest working time, based on the concept of situation awareness recovery (SAR), under which the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  5. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit from this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.

  6. Intraoperative Subcortical Electrical Mapping of the Optic Tract in Awake Surgery Using a Virtual Reality Headset.

    Science.gov (United States)

    Mazerand, Edouard; Le Renard, Marc; Hue, Sophie; Lemée, Jean-Michel; Klinger, Evelyne; Menei, Philippe

    2017-01-01

    Brain mapping during awake craniotomy is a well-known technique to preserve neurological functions, especially language. It is still challenging to map the optic radiations, due to the difficulty of testing the visual field intraoperatively. To assess the visual field during awake craniotomy, we developed the Functions' Explorer based on a virtual reality headset (FEX-VRH). The impaired visual field of 10 patients was tested with automated perimetry (the gold standard examination) and the FEX-VRH. The proof-of-concept test was done during surgery performed on a patient who was blind in his right eye and presented with a left parietotemporal glioblastoma. The FEX-VRH was used intraoperatively, simultaneously with direct subcortical electrostimulation, allowing identification and preservation of the optic radiations. The FEX-VRH detected 9 of the 10 visual field defects found by automated perimetry. The patient who underwent an awake craniotomy with intraoperative mapping of the optic tract using the FEX-VRH had no permanent postoperative visual field defect. Intraoperative visual field assessment with the FEX-VRH during direct subcortical electrostimulation is a promising approach to mapping the optic radiations and preventing a permanent visual field defect during awake surgery for epilepsy or tumor. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Rapid Land Cover Map Updates Using Change Detection and Robust Random Forest Classifiers

    Directory of Open Access Journals (Sweden)

    Konrad J. Wessels

    2016-10-01

    Full Text Available The paper evaluated the Landsat Automated Land Cover Update Mapping (LALCUM) system designed to rapidly update a land cover map to a desired nominal year using a pre-existing reference land cover map. The system uses the Iteratively Reweighted Multivariate Alteration Detection (IRMAD) to identify areas of change and no change. The system then automatically generates large amounts of training samples (n > 1 million) in the no-change areas as input to an optimized Random Forest classifier. Experiments were conducted in the KwaZulu-Natal Province of South Africa using a reference land cover map from 2008, a change mask between 2008 and 2011 and Landsat ETM+ data for 2011. The entire system took 9.5 h to process. We expected that the use of the change mask would improve classification accuracy by reducing the number of mislabeled training data caused by land cover change between 2008 and 2011. However, this was not the case, due to the exceptional robustness of the Random Forest classifier to mislabeled training samples. The system achieved an overall accuracy of 65%–67% using 22 detailed classes and 72%–74% using 12 aggregated national classes. “Water”, “Plantations”, “Plantations—clearfelled”, “Orchards—trees”, “Sugarcane”, “Built-up/dense settlement”, “Cultivation—Irrigated” and “Forest (indigenous)” had user’s accuracies above 70%. Other detailed classes (e.g., “Low density settlements”, “Mines and Quarries”, and “Cultivation, subsistence, drylands”), which are required for operational, provincial-scale land use planning and are usually mapped using manual image interpretation, could not be mapped using Landsat spectral data alone. However, the system was able to map the 12 national classes at a sufficiently high level of accuracy for national-scale land cover monitoring. This update approach and the highly automated, scalable LALCUM system can improve the efficiency and update rate of regional land
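
    The core update step, training a Random Forest on automatically generated samples from the no-change areas, can be sketched with scikit-learn as below (tiny synthetic arrays stand in for Landsat bands and the IRMAD mask):

    ```python
    # Sketch of the LALCUM training step: pixels flagged "no change" keep
    # their 2008 labels and train a Random Forest that then classifies the
    # 2011 image. Arrays are synthetic stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    bands_2011 = rng.random((10_000, 6))            # 6 ETM+ bands per pixel
    labels_2008 = rng.integers(0, 12, 10_000)       # reference land cover classes
    no_change = rng.random(10_000) > 0.2            # IRMAD-style change mask

    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    clf.fit(bands_2011[no_change], labels_2008[no_change])   # auto-generated training
    updated_map = clf.predict(bands_2011)                    # 2011 land cover update
    print(updated_map[:10])
    ```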

  8. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  9. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  10. Forest Cover Mapping in Iskandar Malaysia Using Satellite Data

    Science.gov (United States)

    Kanniah, K. D.; Mohd Najib, N. E.; Vu, T. T.

    2016-09-01

    Malaysia ranks third among the countries of the world for forest cover loss. Therefore, timely information on forest cover is required to help the government ensure that the remaining forest resources are managed in a sustainable manner. This study aims to map and detect changes of forest cover (deforestation and disturbance) in the Iskandar Malaysia region in the south of Peninsular Malaysia between the years 1990 and 2010 using Landsat satellite images. The Carnegie Landsat Analysis System-Lite (CLASlite) programme was used to classify forest cover using Landsat images. This software is able to mask out clouds, cloud shadows, terrain shadows and water bodies, and to atmospherically correct the images using the 6S radiative transfer model. The Automated Monte Carlo Unmixing technique embedded in CLASlite was used to unmix each Landsat pixel into fractions of photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and soil surface (S). Forest and non-forest areas were produced from the fractional cover images using appropriate threshold values of PV, NPV and S. CLASlite was found to be able to classify forest cover in Iskandar Malaysia with only a difference of between 14% (1990) and 5% (2010) compared to the forest land use map produced by the Department of Agriculture, Malaysia. Nevertheless, the automated CLASlite workflow used in this study was found not to exclude other vegetation types, especially rubber and oil palm, which have reflectance similar to forest. In this study, rubber and oil palm were discriminated from forest manually using land use maps. Therefore, the CLASlite algorithm needs further adjustment to exclude these vegetation types and classify only forest cover.
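
    The thresholding of unmixed fractions into forest / non-forest can be sketched as below; the PV, NPV and soil cut-offs are illustrative assumptions, not the values used in the study:

    ```python
    # Sketch of the final classification step: per-pixel unmixed fractions of
    # photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and
    # bare soil (S) are thresholded into a forest mask. Thresholds invented.
    import numpy as np

    def forest_mask(pv, npv, soil):
        return (pv > 0.80) & (soil < 0.10) & (npv < 0.20)

    pv   = np.array([[0.90, 0.40], [0.85, 0.70]])
    npv  = np.array([[0.05, 0.30], [0.10, 0.20]])
    soil = np.array([[0.05, 0.30], [0.05, 0.10]])
    print(forest_mask(pv, npv, soil))    # True where pixels look like forest
    ```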

  11. Low-cost automated system for phase-shifting and phase retrieval based on the tunability of a laser diode

    Science.gov (United States)

    Rivera-Ortega, Uriel; Dirckx, Joris

    2016-09-01

    A low-cost and fully automated process for phase-shifting interferometry, based on continuously changing and switching on and off the input voltage of a laser diode in an unbalanced Twyman-Green interferometer setup, is presented. The input signal of the laser diode is controlled by a Data Acquisition (NI-DAQ) device, which makes it possible to change its wavelength according to its tunability features. The automation and data analysis are performed using LabVIEW in combination with MATLAB. The phase map is obtained using the Carré algorithm. Measurements of visibility and phase shift that verify the PSI requirements are also shown.
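
    The Carré algorithm recovers the wrapped phase from four interferograms acquired with a constant but unknown phase step, which suits a setup where the shift is induced by tuning the diode voltage. A minimal sketch of one common form of the formula (the array names are placeholders for the four recorded frames):

        import numpy as np

        def carre_phase(i1, i2, i3, i4):
            """Wrapped phase from four frames with a constant but unknown
            phase step (Carre, 1966); inputs are 2-D intensity arrays."""
            diff = (i1 - i4) + (i2 - i3)
            num = np.sqrt(np.abs(diff * (3.0 * (i2 - i3) - (i1 - i4))))
            den = (i2 + i3) - (i1 + i4)
            # The square root loses the sign; recover it from (i2 - i3).
            return np.arctan2(np.sign(i2 - i3) * num, den)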

  12. Combining ambiguous chemical shift mapping with structure-based backbone and NOE assignment from 15N-NOESY

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    to these difficulties, the mapping is typically done manually or semi-automatically. However, automated methods are necessary for high-throughput drug screening. We present PeakWalker, a novel peak walking algorithm for fast-exchange systems that models the errors

  13. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  14. Can automation in radiotherapy reduce costs?

    Science.gov (United States)

    Massaccesi, Mariangela; Corti, Michele; Azario, Luigi; Balducci, Mario; Ferro, Milena; Mantini, Giovanna; Mattiucci, Gian Carlo; Valentini, Vincenzo

    2015-01-01

    Computerized automation is likely to play an increasingly important role in radiotherapy. The objective of this study was to report the results of the first part of a program to implement a model for economic evaluation based on the micro-costing method. To test the efficacy of the model, the financial impact of the introduction of an automation tool was estimated. A single- and multi-center validation of the model by prospective collection of data is planned as the second step of the program. The model was implemented using an interactive spreadsheet (Microsoft Excel, 2010). The variables to be included were identified across three components: productivity, staff, and equipment. To calculate staff requirements, the workflow of the Gemelli ART center was mapped out and relevant workload measures were defined. Profit and loss, productivity and staffing were identified as significant outcomes. Results were presented in terms of earnings before interest and taxes (EBIT). Three different scenarios were hypothesized: the baseline situation at Gemelli ART (scenario 1); reduction by 2 minutes of the average duration of treatment fractions (scenario 2); and increased incidence of advanced treatment modalities (scenario 3). Using the model, predicted EBIT values for each scenario were calculated across a period of eight years (from 2015 to 2022). For both scenarios 2 and 3, costs are expected to increase slightly compared to the baseline situation, mainly owing to a small increase in clinical personnel costs. However, in both cases EBIT values are more favorable than in the baseline situation (EBIT values: scenario 1, 27%; scenario 2, 30%; scenario 3, 28% of revenues). A model based on the micro-costing method was able to estimate the financial consequences of the introduction of an automation tool in our radiotherapy department. A prospective collection of data at Gemelli ART and in a consortium of centers is currently under way to prospectively validate the model.

  15. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computerized control of the sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed for this purpose in cooperation with Roche Diagnostics (Barcelona, Spain). This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7% to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  16. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase in the software development life cycle. In this project an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  17. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    An algorithm for the automated building of polypeptide backbones in macromolecular models is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands, and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition

  18. A semi-automated algorithm for hypothalamus volumetry in 3 Tesla magnetic resonance images.

    Science.gov (United States)

    Wolff, Julia; Schindler, Stephanie; Lucas, Christian; Binninger, Anne-Sophie; Weinrich, Luise; Schreiber, Jan; Hegerl, Ulrich; Möller, Harald E; Leitzke, Marco; Geyer, Stefan; Schönknecht, Peter

    2018-07-30

    The hypothalamus, a small diencephalic gray matter structure, is part of the limbic system. Volumetric changes of this structure occur in psychiatric diseases, so there is increasing interest in precise volumetry. Based on our detailed volumetry algorithm for 7 Tesla magnetic resonance imaging (MRI), we developed a method for 3 Tesla MRI, adopting anatomical landmarks and working in triplanar view. We overlaid T1-weighted MR images with gray matter tissue probability maps to combine anatomical information with tissue class segmentation. We then outlined regions of interest (ROIs) that covered potential hypothalamus voxels. Within these ROIs, a seed-growing technique helped define the hypothalamic volume using gray matter probabilities from the tissue probability maps. This yielded a semi-automated method with short processing times of 20-40 min per hypothalamus. In the MRIs of ten subjects, reliabilities were determined as intraclass correlations (ICC) and volume overlaps in percent. Three raters achieved very good intra-rater reliabilities (ICC 0.82-0.97) and good inter-rater reliabilities (ICC 0.78 and 0.82). Overlaps of intra- and inter-rater runs were very good (≥ 89.7%). We present a fast, semi-automated method for in vivo hypothalamus volumetry in 3 Tesla MRI.
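
    The seed-growing step can be sketched as a breadth-first flood fill over the tissue probability map. The sketch below is a simplified stand-in for the published method: the probability threshold, the 6-connected neighbourhood and the variable names are assumptions, and the ROI restriction is omitted for brevity:

        import numpy as np
        from collections import deque

        def grow_region(prob_map, seed, threshold=0.5):
            """Breadth-first seed growing: collect 6-connected voxels whose
            gray matter probability is at least `threshold`. `prob_map` is a
            3-D probability array, `seed` an (x, y, z) index triple."""
            shape = prob_map.shape
            mask = np.zeros(shape, dtype=bool)
            mask[seed] = True
            queue = deque([seed])
            steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                     (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                x, y, z = queue.popleft()
                for dx, dy, dz in steps:
                    nb = (x + dx, y + dy, z + dz)
                    if (all(0 <= nb[i] < shape[i] for i in range(3))
                            and not mask[nb] and prob_map[nb] >= threshold):
                        mask[nb] = True
                        queue.append(nb)
            return mask
        # volume_mm3 = grow_region(gm_prob, seed_voxel).sum() * voxel_volume_mm3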

  19. High Throughput T Epitope Mapping and Vaccine Development

    Directory of Open Access Journals (Sweden)

    Giuseppina Li Pira

    2010-01-01

    Full Text Available Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost.

  20. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, an ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine-readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, requesting user input only when unresolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self-learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts-of-speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user-assisted, semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural

  1. Automated image segmentation using information theory

    International Nuclear Information System (INIS)

    Hibbard, L.S.

    2001-01-01

    Full text: Our development of automated contouring of CT images for RT planning is based on maximum a posteriori (MAP) analyses of region textures, edges, and prior shapes, and assumes stationary Gaussian distributions for voxel textures and contour shapes. Since models may not accurately represent image data, it would be advantageous to compute inferences without relying on models. The relative entropy (RE) from information theory can generate inferences based solely on the similarity of probability distributions. The entropy of the distribution of a random variable X is defined as −Σ_x p(x) log_2 p(x), where the sum runs over all values x that X may assume. The RE (Kullback-Leibler divergence) of two distributions p(X), q(X) over X is Σ_x p(x) log_2 [p(x)/q(x)]. The RE is a kind of 'distance' between p and q, equaling zero when p = q and increasing as p and q become more different. Minimum-error MAP and likelihood-ratio decision rules have RE equivalents: minimum-error decisions obtain with functions of the differences between the REs of the compared distributions. One applied result is that the contour ideally separating two regions is the one that maximizes the relative entropy of the two regions' intensity distributions. A program was developed that automatically contours the outlines of patients in stereotactic headframes, a situation most often requiring manual drawing. The relative entropy of the intensities inside the contour (patient) versus outside (background) was maximized by conjugate gradient descent over the space of parameters of a deformable contour. (A figure in the original shows the computed segmentation of a patient from headframe backgrounds.) This program is particularly useful for preparing images for multimodal image fusion. Relative entropy and allied measures of distribution similarity provide automated contouring criteria that do not depend on statistical models of image data. This approach should have wide utility in medical image segmentation applications.
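
    As a minimal illustration of the decision criterion, the relative entropy between the intensity histograms inside and outside a candidate contour can be computed directly (the histograms below are synthetic stand-ins for patient and background intensities):

        import numpy as np

        def relative_entropy(p, q, eps=1e-12):
            """Kullback-Leibler divergence (in bits) between two histograms."""
            p = p / p.sum()
            q = q / q.sum()
            return float(np.sum(p * np.log2((p + eps) / (q + eps))))

        rng = np.random.default_rng(0)
        inside = rng.normal(120, 15, 10_000)    # stand-in patient intensities
        outside = rng.normal(60, 25, 10_000)    # stand-in background intensities
        p, _ = np.histogram(inside, bins=64, range=(0, 255))
        q, _ = np.histogram(outside, bins=64, range=(0, 255))
        score = relative_entropy(p, q)   # larger = better region separation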

  2. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  3. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  4. Investigating Semi-Automated Cadastral Boundaries Extraction from Airborne Laser Scanned Data

    Directory of Open Access Journals (Sweden)

    Xianghuan Luo

    2017-09-01

    Full Text Available Many developing countries have witnessed the urgent need of accelerating cadastral surveying processes. Previous studies found that large portions of cadastral boundaries coincide with visible physical objects, namely roads, fences, and building walls. This research explores the application of airborne laser scanning (ALS) techniques to cadastral surveys. A semi-automated workflow is developed to extract cadastral boundaries from ALS point clouds. Firstly, a two-phased workflow was developed that focused on extracting digital representations of physical objects. In the automated extraction phase, after classifying points into semantic components, the outlines of planar objects such as building roofs and road surfaces were generated by an α-shape algorithm, whilst a centerline delineation approach was applied to the linear object type, fences. Afterwards, the extracted vector lines were edited and refined during the post-refinement phase. Secondly, we quantitatively evaluated the workflow performance by comparing the results against an existing cadastral map as reference. It was found that the workflow achieved promising results: around 80% completeness and 60% correctness on average, although the spatial accuracy is still modest. It is argued that the semi-automated extraction workflow could effectively speed up cadastral surveying, with both human resource and equipment costs being reduced.
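
    The α-shape step for outlining planar objects can be sketched with a standard Delaunay-based construction: keep triangles whose circumradius is below 1/α, then take the edges used by exactly one kept triangle as the outline. This is a generic sketch, not the paper's implementation, and the α value must be tuned to the point density:

        import numpy as np
        from scipy.spatial import Delaunay

        def alpha_shape_edges(points, alpha):
            """Boundary edges of the alpha-shape of 2-D `points`: keep
            Delaunay triangles with circumradius < 1/alpha, then return the
            edges used by exactly one kept triangle."""
            tri = Delaunay(points)
            edge_count = {}
            for simplex in tri.simplices:
                a, b, c = points[simplex]
                la = np.linalg.norm(b - c)
                lb = np.linalg.norm(a - c)
                lc = np.linalg.norm(a - b)
                s = (la + lb + lc) / 2.0
                area = max(s * (s - la) * (s - lb) * (s - lc), 1e-12) ** 0.5
                if la * lb * lc / (4.0 * area) < 1.0 / alpha:   # circumradius test
                    for i, j in ((0, 1), (1, 2), (0, 2)):
                        edge = tuple(sorted((simplex[i], simplex[j])))
                        edge_count[edge] = edge_count.get(edge, 0) + 1
            return [e for e, n in edge_count.items() if n == 1]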

  5. Automated oil spill detection with multispectral imagery

    Science.gov (United States)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
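
    A minimal sketch of such a pipeline, assuming surface oil appears as dark, contiguous regions in contrast-stretched RGB data (the percentile stretch, the threshold and the minimum region size are illustrative, and real slicks can instead appear bright under sun glint):

        import numpy as np
        from scipy import ndimage

        def oil_mask(rgb, low=2, high=98, thresh=0.35, min_pixels=500):
            """Isolate dark, contiguous surface-oil candidates in an RGB
            array (H x W x 3, float). Percentile contrast stretch per band,
            threshold the band mean, keep only large connected regions."""
            stretched = np.empty_like(rgb, dtype=float)
            for b in range(3):
                lo, hi = np.percentile(rgb[..., b], (low, high))
                stretched[..., b] = np.clip((rgb[..., b] - lo) / (hi - lo + 1e-9), 0, 1)
            candidate = stretched.mean(axis=2) < thresh          # dark pixels
            labels, n = ndimage.label(candidate)                 # contiguous regions
            sizes = ndimage.sum(candidate, labels, range(1, n + 1))
            keep = 1 + np.flatnonzero(sizes >= min_pixels)
            return np.isin(labels, keep)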

  6. Automated search method for AFM and profilers

    Science.gov (United States)

    Ray, Michael; Martin, Yves C.

    2001-08-01

    A new automation software package creates a search model as an initial setup and searches for a user-defined target in atomic force microscopes or stylus profilometers used in semiconductor manufacturing. The need for such automation has become critical in manufacturing lines. The new method starts with a survey map of a small area of a chip obtained from a chip-design database or an image of the area. The user interface requires the user to point to and define a precise location to be measured, and to select a macro function for an application such as line width or contact hole. The search algorithm automatically constructs a range of possible scan sequences within the survey, and provides increased speed and functionality compared to the methods used in instruments to date. Each sequence consists of a starting point relative to the target, a scan direction, and a scan length. The search algorithm stops when the location of a target is found and the criteria for certainty in positioning are met. With today's capability in high-speed processing and signal control, the tool can simultaneously scan and search for a target in a robotic and continuous manner. Examples are given that illustrate the key concepts.

  7. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing, and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives.

  9. Automated Orthorectification of VHR Satellite Images by SIFT-Based RPC Refinement

    Directory of Open Access Journals (Sweden)

    Hakan Kartal

    2018-06-01

    Full Text Available Raw remotely sensed images contain geometric distortions and cannot be used directly for map-based applications, accurate locational information extraction or geospatial data integration. A geometric correction process must be conducted to minimize the errors related to distortions and achieve the desired location accuracy before further analysis. A considerable number of images might be needed when working over large areas or in temporal domains, in which case manual geometric correction requires more labor and time. To overcome these problems, new algorithms have been developed to make the geometric correction process autonomous. The Scale Invariant Feature Transform (SIFT) algorithm is an image matching algorithm used in remote sensing applications that has received attention in recent years. In this study, the effects of the incidence angle, surface topography and land cover (LC) characteristics on SIFT-based automated orthorectification were investigated at three different study sites with different topographic conditions and LC characteristics using Pleiades very high resolution (VHR) images acquired at different incidence angles. The results showed that the location accuracy of the orthorectified images increased with lower incidence angle images. More importantly, the topographic characteristics had no observable impacts on the location accuracy of SIFT-based automated orthorectification, and the results showed that Ground Control Points (GCPs) are mainly concentrated in the “Forest” and “Semi Natural Area” LC classes. A multi-thread code was designed to reduce the automated processing time, and the results showed that the process performed 7 to 16 times faster using the automated approach. Analyses performed on various spectral modes of multispectral data showed that the arithmetic data derived from pan-sharpened multispectral images can be used in automated SIFT-based RPC orthorectification.
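
    A generic sketch of the SIFT-based tie-point collection that underlies such a workflow, using OpenCV (version 4.4 or later, where SIFT is in the main package). The file names are placeholders, and the actual study refines the sensor's RPC model rather than fitting the homography shown here:

        import cv2
        import numpy as np

        # Placeholders for a raw band and a georeferenced reference image.
        raw = cv2.imread("raw_scene.tif", cv2.IMREAD_GRAYSCALE)
        ref = cv2.imread("reference_ortho.tif", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        kp_raw, des_raw = sift.detectAndCompute(raw, None)
        kp_ref, des_ref = sift.detectAndCompute(ref, None)

        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_raw, des_ref, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test

        src = np.float32([kp_raw[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        # RANSAC rejects outlier matches; surviving inliers are GCP candidates.
        H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)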

  10. Fluorescence In Situ Hybridization (FISH Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Full Text Available Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, a stack of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently detected visually by an observer. The Kappa coefficients for agreement between the CAD scheme and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in the four testing samples. The study demonstrated the feasibility of automated FISH signal analysis applying a CAD scheme to automatically generated 2-D projection images.
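
    The projection and spot-detection steps can be sketched in a few lines with SciPy. This is a simplified stand-in for the published CAD scheme: the maximum-intensity projection, the top-hat footprint size and the mean-plus-k-sigma threshold are all assumptions:

        import numpy as np
        from scipy import ndimage

        def count_fish_spots(stack, cell_mask, spot_size=5, k=3.0):
            """Collapse a 3-D FISH stack (z, y, x) to a 2-D maximum-intensity
            projection, enhance small bright signals with a white top-hat
            transform, and count spots inside one segmented cell."""
            projection = stack.max(axis=0)
            tophat = ndimage.white_tophat(projection,
                                          footprint=np.ones((spot_size, spot_size)))
            spots = (tophat > tophat.mean() + k * tophat.std()) & cell_mask
            _, n_spots = ndimage.label(spots)
            return n_spots   # e.g. 2 for a cell with two probed chromosomes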

  11. Automated real-time testing (ARTT) for embedded control systems (ECS)

    International Nuclear Information System (INIS)

    Hawkins, J; Howard, R; Nguyen, H.

    2001-01-01

    Many of today's automated real-time testing systems for embedded systems were developed using expensive custom hardware and software. In this article they describe how to use commercially available off-the-shelf hardware and software to design and develop an automated real-time test system for embedded Programmable Logic Controller (PLC) based control systems. The system development began with the implementation of the VALI/TEST Pro testing methodology as a means of structuring the testing. Using this methodology, they were able to decompose system requirement documents for a Personnel Safety System (PSS) into its high-, intermediate- and detail-level requirements. Next, the validation procedures for the PSS system were decomposed into testing units called builds, test runs and test cases. To measure the PSS system's test coverage, three levels of system requirements were mapped to their respective unit level of test using a specially constructed validation matrix designed to handle over 150 test cases and requirements. All of the above work led to the development of an Automated Real-Time Test System (ARTTS) that is capable of performing complete black-box testing in real time for embedded PLC-based control systems. Note that the PSS system under test and mentioned in this paper is located at the Advanced Photon Source (APS) at Argonne National Laboratory, Basic Energy Science Facility, in Argonne, Illinois

  12. LARGE-SCALE INDICATIVE MAPPING OF SOIL RUNOFF

    Directory of Open Access Journals (Sweden)

    E. Panidi

    2017-11-01

    Full Text Available In our study we estimate relationships between quantitative parameters of relief, the soil runoff regime, and the spatial distribution of radioactive pollutants in the soil. The study is conducted on a test arable area located in the basin of the upper Oka River (Orel region, Russia). Previously we collected a large number of soil samples, which makes it possible to investigate the redistribution of Chernobyl-origin cesium-137 in soil material and, as a consequence, the soil runoff magnitude at the sampling points. Here we describe and discuss the technique applied to large-scale mapping of the soil runoff. The technique is based upon measurement of cesium-137 radioactivity in different relief structures. Key stages are the allocation of the soil sampling points (we used very high resolution space imagery as supporting data); soil sample collection and analysis; calibration of the mathematical model (using the estimated background value of cesium-137 radioactivity); and automated compilation of a predictive map of the studied territory (a digital elevation model is used for this purpose, and cesium-137 radioactivity can be predicted from quantitative parameters of the relief). The maps can be used as supporting data for precision agriculture and for recultivation or melioration purposes.

  13. Automated Temperature and Emission Measure Analysis of Coronal Loops and Active Regions Observed with the Atmospheric Imaging Assembly on the Solar Dynamics Observatory (SDO/AIA)

    Science.gov (United States)

    Aschwanden, Markus J.; Boerner, Paul; Schrijver, Carolus J.; Malanushenko, Anna

    2013-03-01

    We developed numerical codes designed for automated analysis of SDO/AIA image datasets in the six coronal filters, including: i) a coalignment test between different wavelengths with measurements of the altitude of the EUV-absorbing chromosphere, ii) self-calibration by empirical correction of instrumental response functions, iii) automated generation of differential emission measure [DEM] distributions with peak-temperature maps [T_p(x,y)] and emission measure maps [EM_p(x,y)] of the full Sun or active region areas, iv) composite DEM distributions [dEM(T)/dT] of active regions or subareas, v) automated detection of coronal loops, and vi) automated background subtraction and thermal analysis of coronal loops, which yields statistics of loop temperatures [T_e], temperature widths [σ_T], emission measures [EM], electron densities [n_e], and loop widths [w]. The combination of these numerical codes allows for automated and objective processing of numerous coronal loops. As an example, we present the results of an application to the active region NOAA 11158, observed on 15 February 2011, shortly before it produced the largest (X2.2) flare of the current solar cycle. We detect 570 loop segments at temperatures in the entire range of log(T_e) = 5.7-7.0 K and corroborate previous TRACE and AIA results on their near-isothermality and the validity of the Rosner-Tucker-Vaiana (RTV) law at soft X-ray temperatures (T ≳ 2 MK) and its failure at lower EUV temperatures.

  14. 5th International Conference on Automation, Robotics and Applications (ICARA 2011)

    CERN Document Server

    Bailey, Donald; Demidenko, Serge; Carnegie, Dale; Recent Advances in Robotics and Automation

    2013-01-01

    There isn’t a facet of human life that has not been touched and influenced by robots and automation. What makes robots and machines versatile is their computational intelligence. While modern intelligent sensors and powerful hardware capabilities have given a huge fillip to the growth of intelligent machines, the progress in the development of algorithms for smart interaction, collaboration and pro-activeness will result in the next quantum jump. This book deals with the recent advancements in design methodologies, algorithms and implementation techniques to incorporate intelligence in robots and automation systems. Several articles deal with navigation, localization and mapping of mobile robots, a problem that engineers and researchers are grappling with all the time. Fuzzy logic, neural networks and neuro-fuzzy based techniques for real world applications have been detailed in a few articles. This edited volume is targeted to present the latest state-of-the-art computational intelligence techniques in Rob...

  15. Improved predictive mapping of indoor radon concentrations using ensemble regression trees based on automatic clustering of geological units

    International Nuclear Information System (INIS)

    Kropat, Georg; Bochud, Francois; Jaboyedoff, Michel; Laedermann, Jean-Pascal; Murith, Christophe; Palacios, Martha; Baechler, Sébastien

    2015-01-01

    Purpose: According to estimations, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics and to develop mapping and predictive tools in order to improve local radon prediction. Method: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pair-wise Kolmogorov distances between the IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). Results: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences in metamorphic rocks like gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, variable importance evaluated by random forests shows that building characteristics are less important predictors of IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. Conclusion: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables
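
    The clustering step can be sketched with a plain PAM-style k-medoids over a matrix of pairwise two-sample Kolmogorov-Smirnov statistics. This is a simplified stand-in for the paper's method; the number of clusters k and the use of the raw KS statistic as the distance are assumptions:

        import numpy as np
        from scipy.stats import ks_2samp

        def ks_distance_matrix(samples):
            """Pairwise two-sample Kolmogorov-Smirnov statistics between IRC
            samples (one 1-D array of measurements per lithological unit)."""
            n = len(samples)
            d = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    d[i, j] = d[j, i] = ks_2samp(samples[i], samples[j]).statistic
            return d

        def k_medoids(d, k, iters=100, seed=0):
            """Plain PAM-style alternation on a precomputed distance matrix."""
            rng = np.random.default_rng(seed)
            medoids = rng.choice(len(d), size=k, replace=False)
            for _ in range(iters):
                labels = np.argmin(d[:, medoids], axis=1)
                new = medoids.copy()
                for c in range(k):
                    members = np.flatnonzero(labels == c)
                    if members.size:
                        new[c] = members[d[np.ix_(members, members)].sum(axis=0).argmin()]
                if np.array_equal(new, medoids):
                    break
                medoids = new
            return labels, medoids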

  16. Dynamic mapping of EDDL device descriptions to OPC UA

    Science.gov (United States)

    Atta Nsiah, Kofi; Schappacher, Manuel; Sikora, Axel

    2017-07-01

    OPC UA (Open Platform Communications Unified Architecture) is already a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server, allowing OPC UA clients to access device-specific information via a standardized information model. One of the requirements for the OPC UA server to represent field device data using its information model is advance knowledge of the properties of the field devices in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe an approach to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
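
    A minimal sketch of the target side of such a mapping, using the python-opcua package: a parsed device description (here just a hypothetical dict of parameter names and initial values; real EDDL parsing is far richer) is exposed as variables under a device object in the server's address space:

        from opcua import Server  # python-opcua package

        # Hypothetical output of an EDDL parser: parameter name -> initial value.
        edd_parameters = {"ProcessValue": 0.0, "UpperRangeLimit": 100.0}

        server = Server()
        server.set_endpoint("opc.tcp://0.0.0.0:4840/eddl-gateway/")
        idx = server.register_namespace("urn:example:eddl")

        device = server.get_objects_node().add_object(idx, "FieldDevice")
        nodes = {name: device.add_variable(idx, name, value)
                 for name, value in edd_parameters.items()}

        server.start()   # clients can now browse the mapped device description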

  17. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    Science.gov (United States)

    Tom, C. H.; Miller, L. D.

    1984-01-01

    The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.
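
    The comparison can be mimicked with scikit-learn on synthetic data: a Gaussian maximum-likelihood classifier with equal priors stands in for GLIKE, the same classifier with a priori class probabilities for CLASSIFY, and LinearDiscriminantAnalysis for the data-based formulation. All names and data here are illustrative, not the Landsat Mapping System itself:

        import numpy as np
        from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                                   QuadraticDiscriminantAnalysis)

        # Synthetic stand-ins for pixel feature vectors (bands + ancillary
        # geodata) and land use/land cover class labels.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(m, 1.0, (200, 4)) for m in (0.0, 1.5, 3.0)])
        y = np.repeat([0, 1, 2], 200)

        # Gaussian maximum likelihood, equal priors (GLIKE-like).
        glike = QuadraticDiscriminantAnalysis(priors=[1/3, 1/3, 1/3]).fit(X, y)
        # Same classifier with a priori class probabilities (CLASSIFY-like).
        classify = QuadraticDiscriminantAnalysis(priors=[0.6, 0.3, 0.1]).fit(X, y)
        # Data-based linear discriminant analysis.
        lda = LinearDiscriminantAnalysis().fit(X, y)

        for name, model in (("GLIKE", glike), ("CLASSIFY", classify), ("LDA", lda)):
            print(name, round(model.score(X, y), 3))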

  18. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    Full Text Available We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  19. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  20. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root-cause analysis. These delays waste valuable resources that could be spent working on other, more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support 'lean manufacturing' initiatives for wafer fabs.

  1. Automated whole-genome multiple alignment of rat, mouse, and human

    Energy Technology Data Exchange (ETDEWEB)

    Brudno, Michael; Poliakov, Alexander; Salamov, Asaf; Cooper, Gregory M.; Sidow, Arend; Rubin, Edward M.; Solovyev, Victor; Batzoglou, Serafim; Dubchak, Inna

    2004-07-04

    We have built a whole genome multiple alignment of the three currently available mammalian genomes using a fully automated pipeline which combines the local/global approach of the Berkeley Genome Pipeline and the LAGAN program. The strategy is based on progressive alignment, and consists of two main steps: (1) alignment of the mouse and rat genomes; and (2) alignment of human to either the mouse-rat alignments from step 1, or the remaining unaligned mouse and rat sequences. The resulting alignments demonstrate high sensitivity, with 87% of all human gene-coding areas aligned in both mouse and rat. The specificity is also high: <7% of the rat contigs are aligned to multiple places in human and 97% of all alignments with human sequence > 100kb agree with a three-way synteny map built independently using predicted exons in the three genomes. At the nucleotide level <1% of the rat nucleotides are mapped to multiple places in the human sequence in the alignment; and 96.5% of human nucleotides within all alignments agree with the synteny map. The alignments are publicly available online, with visualization through the novel Multi-VISTA browser that we also present.

  2. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. The suggested measures express how much the automation supports human operators, but they cannot express the change in the human operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad might be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task loads index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and much work which was previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system can affect the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load

  3. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving-with-automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2 s; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4 s; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  4. Cross-terminology mapping challenges: A demonstration using medication terminological systems

    Science.gov (United States)

    Saitwal, Himali; Qing, David; Jones, Stephen; Bernstam, Elmer; Chute, Christopher G.; Johnson, Todd R.

    2015-01-01

    Standardized terminological systems for biomedical information have provided considerable benefits to biomedical applications and research. However, practical use of this information often requires mapping across terminological systems—a complex and time-consuming process. This paper demonstrates the complexity and challenges of mapping across terminological systems in the context of medication information. It provides a review of medication terminological systems and their linkages, then describes a case study in which we mapped proprietary medication codes from an electronic health record to SNOMED-CT and the UMLS Metathesaurus. The goal was to create a polyhierarchical classification system for querying an i2b2 clinical data warehouse. We found that three methods were required to accurately map the majority of actively prescribed medications. Only 62.5% of source medication codes could be mapped automatically. The remaining codes were mapped using a combination of semi-automated string comparison with expert selection, and a completely manual approach. Compound drugs were especially difficult to map: only 7.5% could be mapped using the automatic method. General challenges to mapping across terminological systems include (1) the availability of up-to-date information to assess the suitability of a given terminological system for a particular use case, and to assess the quality and completeness of cross-terminology links; (2) the difficulty of correctly using complex, rapidly evolving, modern terminologies; (3) the time and effort required to complete and evaluate the mapping; (4) the need to address differences in granularity between the source and target terminologies; and (5) the need to continuously update the mapping as terminological systems evolve. PMID:22750536
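
    The semi-automated string-comparison step can be sketched with the standard library's difflib; candidates above a similarity cutoff are presented for expert selection, and the rest fall through to manual mapping. The codes and terms below are hypothetical stand-ins, not actual formulary or SNOMED-CT/UMLS content:

        import difflib

        # Hypothetical local formulary entries and target terminology labels.
        local_codes = {"RX0001": "acetaminophen 325 mg oral tablet",
                       "RX0002": "amoxicillin/clavulanate 875-125 mg tablet"}
        target_terms = ["Acetaminophen 325 MG Oral Tablet",
                        "Amoxicillin 875 MG / Clavulanate 125 MG Oral Tablet",
                        "Aspirin 81 MG Oral Tablet"]

        def suggest_mappings(codes, targets, cutoff=0.6):
            """Rank candidate target terms per local code; codes that score
            below `cutoff` are flagged for manual review."""
            lowered = {t.lower(): t for t in targets}
            out = {}
            for code, description in codes.items():
                hits = difflib.get_close_matches(description.lower(),
                                                 list(lowered), n=3, cutoff=cutoff)
                out[code] = [lowered[h] for h in hits] or None   # None -> manual
            return out

        print(suggest_mappings(local_codes, target_terms))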

  5. Automated and model-based assembly of an anamorphic telescope

    Science.gov (United States)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    Since the first use of optical glasses there has been an increasing demand for optical systems that are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It first identifies the sequence for an optimized assembly and then generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
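
    The first and second moments of intensity mentioned as key performance indicators are straightforward to compute from a camera frame; a minimal sketch, assuming the image is a background-subtracted 2-D array:

        import numpy as np

        def intensity_moments(image):
            """First and second moments of a 2-D intensity distribution:
            the centroid (spot position) and the RMS widths."""
            total = image.sum()
            ys, xs = np.indices(image.shape)
            cx = (xs * image).sum() / total
            cy = (ys * image).sum() / total
            wx = np.sqrt(((xs - cx) ** 2 * image).sum() / total)
            wy = np.sqrt(((ys - cy) ** 2 * image).sum() / total)
            return (cx, cy), (wx, wy)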

  6. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, something which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel being in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Also other factors such as teamwork and operator tendencies were of importance. Several design implications were drawn

  7. Automated Recognition of Vegetation and Water Bodies on the Territory of Megacities in Satellite Images of Visible and IR Bands

    Science.gov (United States)

    Mozgovoy, Dmitry K.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.

    2018-04-01

    Vegetation and water bodies are a fundamental element of urban ecosystems, and water mapping is critical for urban and landscape planning and management. A methodology for the automated recognition of vegetation and water bodies on the territory of megacities in satellite images of sub-meter spatial resolution in the visible and IR bands is proposed. By processing multispectral images from the SuperView-1A satellite, vector layers of recognized vegetation and water objects were obtained. Analysis of the image processing results showed a sufficiently high accuracy of delineation of the boundaries of recognized objects and a good separation of classes. The developed methodology provides a significant increase in the efficiency and reliability of updating maps of large cities while reducing financial costs. Due to the high degree of automation, the proposed methodology can be implemented in the form of a geo-information web service functioning in the interests of a wide range of public services and commercial institutions.
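
    The record does not give the decision rules, but a common baseline for separating vegetation and open water from visible and IR bands is spectral-index thresholding; a minimal sketch (the thresholds are scene-dependent assumptions):

        import numpy as np

        def vegetation_water_masks(green, red, nir,
                                   veg_thresh=0.3, water_thresh=0.2):
            """Index-based masks: NDVI for vegetation, NDWI (McFeeters) for
            open water. Inputs are reflectance arrays of equal shape."""
            eps = 1e-9
            ndvi = (nir - red) / (nir + red + eps)
            ndwi = (green - nir) / (green + nir + eps)
            return ndvi > veg_thresh, ndwi > water_thresh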

  8. Mapping Urban Ecosystem Services Using High Resolution Aerial Photography

    Science.gov (United States)

    Pilant, A. N.; Neale, A.; Wilhelm, D.

    2010-12-01

    Ecosystem services (ES) are the many life-sustaining benefits we receive from nature: e.g., clean air and water, food and fiber, cultural-aesthetic-recreational benefits, pollination and flood control. The ES concept is emerging as a means of integrating complex environmental and economic information to support informed environmental decision making. The US EPA is developing a web-based National Atlas of Ecosystem Services, with a component for urban ecosystems. Currently, the only wall-to-wall, national-scale land cover data suitable for this analysis is the National Land Cover Data (NLCD) at 30 m spatial resolution with 5- and 10-year updates. However, aerial photography is acquired at higher spatial resolution (0.5-3 m) and more frequently (typically 1-5 years) for most urban areas. Land cover was mapped in Raleigh, NC using freely available USDA National Agricultural Imagery Program (NAIP) imagery with 1 m ground sample distance to test the suitability of aerial photography for urban ES analysis. Automated feature extraction techniques were used to extract five land cover classes, and an accuracy assessment was performed using standard techniques. Results will be presented that demonstrate applications to mapping ES in urban environments: greenways, corridors, fragmentation, habitat, impervious surfaces, dark and light pavement (urban heat island). [Figures in the original show automated feature extraction results mapped over a NAIP color aerial photograph, in which land cover can be examined at the 2-10 m scale and small features such as individual trees and sidewalks are visible and mappable, and a classified aerial photo of downtown Raleigh, NC (red: impervious surface; dark green: trees; light green: grass; tan: soil).]

  9. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. The suggested measures express how much the automation supports human operators, but they cannot express the change in the operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously performed by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.
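
    The record states only that the index equals 1 when automation leaves the operators' task load unchanged; the exact formulation is not given. A minimal sketch consistent with that property, assuming (purely for illustration) that the index is the ratio of task load with automation to the manual baseline:

```python
def task_load_index(baseline_load, automated_load):
    """Ratio of operator task load with automation to the manual baseline.

    Equals 1.0 when automation changes nothing, < 1 when it relieves the
    operator, and > 1 when it adds load (e.g., extra monitoring demands).
    This ratio form is an assumption for illustration; the paper's exact
    definition is not given in the record.
    """
    if baseline_load <= 0:
        raise ValueError("baseline task load must be positive")
    return automated_load / baseline_load

print(task_load_index(10.0, 10.0))  # 1.0 -> no automation / no change
print(task_load_index(10.0, 6.5))   # 0.65 -> automation reduces task load
```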

  10. Detailed Maps Depicting the Shallow-Water Benthic Habitats of the Northwestern Hawaiian Islands Derived from High Resolution IKONOS Satellite Imagery

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Detailed, shallow-water coral reef ecosystem maps were generated by rule-based, semi-automated image analysis of high-resolution satellite imagery for nine locations...

  11. A Visual Framework for Digital Reconstruction of Topographic Maps

    KAUST Repository

    Thabet, Ali Kassem

    2014-09-30

    We present a framework for reconstructing Digital Elevation Maps (DEM) from scanned topographic maps. We first rectify the images to ensure that maps fit together without distortion. To segment iso-contours, we have developed a novel semi-automated method based on mean-shifts that requires only minimal user interaction. Contour labels are automatically read using an OCR module. To reconstruct the output DEM from scattered data, we generalize natural neighbor interpolation to handle the transfinite case (contours and points). To this end, we use parallel vector propagation to compute a discrete Voronoi diagram of the constraints, and a modified floodfill to compute virtual Voronoi tiles. Our framework is able to handle tens of thousands of contours and points and can generate DEMs comprising more than 100 million samples. We provide quantitative comparison to commercial software and show the benefits of our approach. We furthermore show the robustness of our method on a massive set of old maps predating satellite acquisition. Compared to other methods, our framework is able to accurately and efficiently generate a final DEM despite inconsistencies, sparse or missing contours even for highly complex and cluttered maps. Therefore, this method has broad applicability for digitization and reconstruction of the world's old topographic maps that are often the only record of past landscapes and cultural heritage before their destruction under modern development.
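
    The discrete Voronoi diagram of the elevation constraints, which the paper computes with parallel vector propagation, can be reproduced on a small grid with an index-returning Euclidean distance transform. The sketch below uses SciPy for that step and then fills each Voronoi tile with its site's elevation, a crude nearest-neighbour stand-in for the natural neighbor interpolation; all coordinates and elevations are made up.

```python
import numpy as np
from scipy import ndimage

# Binary grid: 0 at an elevation constraint (contour/point sample), 1 elsewhere.
grid = np.ones((8, 8), dtype=np.uint8)
sites = {(1, 1): 100.0, (6, 2): 140.0, (3, 6): 120.0}  # made-up elevations
for r, c in sites:
    grid[r, c] = 0

# Exact Euclidean distance transform with index return: every cell is assigned
# the coordinates of its nearest constraint, i.e. a discrete Voronoi diagram
# (the paper obtains this with parallel vector propagation instead).
dist, (nearest_r, nearest_c) = ndimage.distance_transform_edt(
    grid, return_indices=True)

# Fill each Voronoi tile with its site's elevation (nearest-neighbour stand-in
# for the transfinite natural neighbor interpolation used in the paper).
dem = np.vectorize(lambda r, c: sites[(r, c)])(nearest_r, nearest_c)
print(dem)
```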

  12. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems for measuring Australia antigen by radioimmunoassay under development were discussed. Samples were processed as follows: blood serum being dispensed by an automated sampler into test tubes, and then incubated under controlled time and temperature; first counting being omitted; labelled antibody being dispensed to the serum after washing; samples being incubated and then centrifuged; radioactivities in the precipitate being counted by an auto-well counter; measurements being tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  13. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  14. Superpixel-based and boundary-sensitive convolutional neural network for automated liver segmentation

    Science.gov (United States)

    Qin, Wenjian; Wu, Jia; Han, Fei; Yuan, Yixuan; Zhao, Wei; Ibragimov, Bulat; Gu, Jia; Xing, Lei

    2018-05-01

    Segmentation of the liver in abdominal computed tomography (CT) is an important step for radiation therapy planning of hepatocellular carcinoma. In practice, fully automatic segmentation of the liver remains challenging because of the low soft-tissue contrast between the liver and its surrounding organs, and its highly deformable shape. The purpose of this work is to develop a novel superpixel-based and boundary-sensitive convolutional neural network (SBBS-CNN) pipeline for automated liver segmentation. The CT images were first partitioned into superpixel regions, in which nearby pixels with similar CT numbers were aggregated. Secondly, we converted the conventional binary segmentation into a multinomial classification by labeling the superpixels into three classes: interior liver, liver boundary, and non-liver background. By doing this, the boundary region of the liver was explicitly identified and highlighted for the subsequent classification. Thirdly, we computed an entropy-based saliency map for each CT volume, and leveraged this map to guide the sampling of image patches over the superpixels. In this way, more patches were extracted from informative regions (e.g. the liver boundary with irregular changes) and fewer patches were extracted from homogeneous regions. Finally, a deep CNN pipeline was built and trained to predict the probability map of the liver boundary. We tested the proposed algorithm in a cohort of 100 patients. With 10-fold cross validation, the SBBS-CNN achieved mean Dice similarity coefficients of 97.31  ±  0.36% and average symmetric surface distance of 1.77  ±  0.49 mm. Moreover, it showed superior performance in comparison with state-of-the-art methods, including U-Net, pixel-based CNN, active contour, level-sets and graph-cut algorithms. SBBS-CNN provides an accurate and effective tool for automated liver segmentation. It is also envisioned that the proposed framework is directly applicable in other medical image segmentation scenarios.
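
    The entropy-based saliency map that guides patch sampling can be sketched compactly: compute a local-entropy image and draw patch centres with probability proportional to it, so textured (informative) regions are sampled more densely than homogeneous ones. The synthetic image, neighbourhood radius and patch count below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from skimage.filters.rank import entropy
from skimage.morphology import disk

rng = np.random.default_rng(0)

# Stand-in for a CT slice: a homogeneous half and a textured ("boundary") half.
img = np.full((64, 64), 120, dtype=np.uint8)
img[:, 32:] = rng.integers(0, 256, (64, 32), dtype=np.uint8)

# Local-entropy saliency map over a small disk neighbourhood.
saliency = entropy(img, disk(5)).ravel()

# Draw patch centres with probability proportional to local entropy, so
# informative regions contribute more training patches than flat ones.
prob = saliency / saliency.sum()
flat_idx = rng.choice(saliency.size, size=10, replace=False, p=prob)
centers = np.column_stack(np.unravel_index(flat_idx, (64, 64)))
print(centers)  # centres fall almost entirely in the textured half
```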

  15. Semi-Automated Approach for Mapping Urban Trees from Integrated Aerial LiDAR Point Cloud and Digital Imagery Datasets

    Science.gov (United States)

    Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints, such as labour-intensive field work and high financial cost, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to the predominant studies on tree extraction, conducted mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper shows that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.

  16. Automated kidney morphology measurements from ultrasound images using texture and edge analysis

    Science.gov (United States)

    Ravishankar, Hariharan; Annangi, Pavan; Washburn, Michael; Lanning, Justin

    2016-04-01

    In a typical ultrasound scan, a sonographer measures kidney morphology to assess renal abnormalities. Kidney morphology can also help to discriminate between chronic and acute kidney failure. The caliper placements and volume measurements are often time consuming, and an automated solution would help to improve accuracy, repeatability and throughput. In this work, we developed an automated kidney morphology measurement solution from long-axis ultrasound scans. Automated kidney segmentation is challenging due to wide variability in kidney shape and size, weak contrast of the kidney boundaries, and the presence of strong edges such as the diaphragm and fat layers. To address these challenges and accurately localize and detect kidney regions, we present a two-step algorithm that makes use of edge and texture information in combination with anatomical cues. First, we use an edge analysis technique to localize the kidney region by matching the edge map with predefined templates. To accurately estimate the kidney morphology, we use textural information in a machine learning framework using Haar features and a gradient boosting classifier. We have tested the algorithm on 45 unseen cases, and the performance against ground truth is measured by computing Dice overlap and the percentage error in the major and minor axes of the kidney. The algorithm shows successful performance on 80% of cases.
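
    The texture-classification step can be sketched with scikit-learn's gradient boosting classifier. The two difference-of-sums features below are only a minimal stand-in for a real Haar feature bank, and the synthetic patches simply have a discriminative structure injected so the example runs end to end.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def haar_like_features(patch):
    """Two toy Haar-like responses: left-right and top-bottom sum differences."""
    h, w = patch.shape
    return np.array([
        patch[:, : w // 2].sum() - patch[:, w // 2:].sum(),
        patch[: h // 2, :].sum() - patch[h // 2:, :].sum(),
    ])

# Synthetic 16x16 patches; "kidney" patches get an injected horizontal gradient.
patches = rng.random((200, 16, 16))
labels = rng.integers(0, 2, 200)
patches[labels == 1, :, :8] += 0.5

X = np.array([haar_like_features(p) for p in patches])
clf = GradientBoostingClassifier(n_estimators=50).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```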

  17. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. An automated A-value measurement tool for accurate cochlear duct length estimation.

    Science.gov (United States)

    Iyaniwura, John E; Elfarnawany, Mai; Ladak, Hanif M; Agrawal, Sumit K

    2018-01-22

    There has been renewed interest in the cochlear duct length (CDL) for preoperative cochlear implant electrode selection and postoperative generation of patient-specific frequency maps. The CDL can be estimated by measuring the A-value, which is defined as the length between the round window and the furthest point on the basal turn. Unfortunately, there is significant intra- and inter-observer variability when these measurements are made clinically. The objective of this study was to develop an automated A-value measurement algorithm to improve accuracy and eliminate observer variability. Clinical and micro-CT images of 20 cadaveric cochleae specimens were acquired. The micro-CT of one sample was chosen as the atlas, and A-value fiducials were placed onto that image. Image registration (rigid affine and non-rigid B-spline) was applied between the atlas and the 19 remaining clinical CT images. The registration transform was applied to the A-value fiducials, and the A-value was then automatically calculated for each specimen. High resolution micro-CT images of the same 19 specimens were used to measure the gold standard A-values for comparison against the manual and automated methods. The registration algorithm had excellent qualitative overlap between the atlas and target images. The automated method eliminated the observer variability and the systematic underestimation by experts. Manual measurement of the A-value on clinical CT had a mean error of 9.5 ± 4.3% compared to micro-CT, and this improved to an error of 2.7 ± 2.1% using the automated algorithm. Both the automated and manual methods correlated significantly with the gold standard micro-CT A-values (r = 0.70). An automated A-value measurement tool using atlas-based registration methods was successfully developed and validated. The automated method eliminated the observer variability and improved accuracy as compared to manual measurements by experts. This open-source tool has the potential to benefit
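
    Once the atlas fiducials have been mapped into a patient scan, the A-value itself reduces to a Euclidean distance in physical units. A minimal sketch of that final step (the registration itself, rigid affine plus B-spline, would be supplied by a toolkit such as SimpleITK and is omitted; the coordinates and voxel spacing below are hypothetical):

```python
import numpy as np

def a_value(round_window, basal_turn_far_point, voxel_spacing=(0.3, 0.3, 0.3)):
    """A-value in mm: distance from the round window fiducial to the furthest
    point on the basal turn, both given as voxel indices after registration."""
    rw = np.asarray(round_window, dtype=float) * voxel_spacing
    bt = np.asarray(basal_turn_far_point, dtype=float) * voxel_spacing
    return float(np.linalg.norm(rw - bt))

# Hypothetical fiducial voxel coordinates with 0.3 mm isotropic spacing.
print(f"A-value: {a_value((120, 85, 40), (148, 96, 42)):.2f} mm")
```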

  19. Automated selection of brain regions for real-time fMRI brain-computer interfaces

    Science.gov (United States)

    Lührs, Michael; Sorger, Bettina; Goebel, Rainer; Esposito, Fabrizio

    2017-02-01

    Objective. Brain-computer interfaces (BCIs) implemented with real-time functional magnetic resonance imaging (rt-fMRI) use fMRI time-courses from predefined regions of interest (ROIs). To reach the best performance, localizer experiments and on-site expert supervision are required for ROI definition. To automate this step, we developed two unsupervised computational techniques based on the general linear model (GLM) and independent component analysis (ICA) of rt-fMRI data, and compared their performances on a communication BCI. Approach. 3 T fMRI data of six volunteers were re-analyzed in simulated real-time. During a localizer run, participants performed three mental tasks following visual cues. During two communication runs, a letter-spelling display guided the subjects to freely encode letters by performing one of the mental tasks with a specific timing. GLM- and ICA-based procedures were used to decode each letter, respectively using compact ROIs and whole-brain distributed spatio-temporal patterns of fMRI activity, automatically defined from subject-specific or group-level maps. Main results. Letter-decoding performances were comparable to supervised methods. In combination with a similarity-based criterion, GLM- and ICA-based approaches successfully decoded more than 80% (average) of the letters. Subject-specific maps yielded optimal performances. Significance. Automated solutions for ROI selection may help accelerate the translation of rt-fMRI BCIs from research to clinical applications.

  20. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A big-picture, or continuum, view is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  1. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should be considered together when determining the appropriate level of automation to introduce. However, the conventional automation rate concept is limited in that it does not consider the effects of automation on human operators. Thus, in this paper, a new estimation method for the automation rate is suggested to overcome this problem

  2. MAPPING ERODED AREAS ON MOUNTAIN GRASSLAND WITH TERRESTRIAL PHOTOGRAMMETRY AND OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. Mayr

    2016-06-01

    Full Text Available In the Alps, as well as in other mountain regions, steep grassland is frequently affected by shallow erosion. Often small landslides or snow movements displace the vegetation together with soil and/or unconsolidated material. This results in bare earth surface patches within the grass covered slope. Close-range and remote sensing techniques are promising for both mapping and monitoring these eroded areas. This is essential for a better geomorphological process understanding, to assess past and recent developments, and to plan mitigation measures. Recent developments in image matching techniques make it feasible to produce high resolution orthophotos and digital elevation models from terrestrial oblique images. In this paper we propose to delineate the boundary of eroded areas for selected scenes of a study area, using close-range photogrammetric data. Striving for an efficient, objective and reproducible workflow for this task, we developed an approach for automated classification of the scenes into the classes grass and eroded. We propose an object-based image analysis (OBIA) workflow which consists of image segmentation and automated threshold selection for classification using the Excess Green Vegetation Index (ExG). The automated workflow is tested with ten different scenes. Compared to a manual classification, grass and eroded areas are classified with an overall accuracy between 90.7% and 95.5%, depending on the scene. The methods proved to be insensitive to differences in illumination of the scenes and greenness of the grass. The proposed workflow reduces user interaction and is transferable to other study areas. We conclude that close-range photogrammetry is a valuable low-cost tool for mapping this type of eroded areas in the field with a high level of detail and quality. In the future, the output will be used as ground truth for an area-wide mapping of eroded areas in coarser resolution aerial orthophotos acquired at the same time.
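
    The core of the classification step, the ExG index with an automatically selected threshold, is easy to sketch. The version below works per pixel rather than on image objects, and uses Otsu's method as the automated threshold selector; the paper's actual threshold-selection procedure and per-object statistics are not reproduced here.

```python
import numpy as np
from skimage.filters import threshold_otsu

def classify_grass_eroded(rgb):
    """True = grass, False = eroded/bare earth, via ExG and an Otsu threshold."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))  # chromatic coordinates
    exg = 2 * g - r - b                                # Excess Green index
    return exg > threshold_otsu(exg)                   # automated threshold

# Synthetic orthophoto: a greener upper half stands in for grass cover.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (32, 32, 3))
img[:16, :, 1] += 120
print("fraction classified as grass:", classify_grass_eroded(img).mean())
```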

  3. Local search for optimal global map generation using mid-decadal landsat images

    Science.gov (United States)

    Khatib, L.; Gasch, J.; Morris, Robert; Covington, S.

    2007-01-01

    NASA and the US Geological Survey (USGS) are seeking to generate a map of the entire globe using Landsat 5 Thematic Mapper (TM) and Landsat 7 Enhanced Thematic Mapper Plus (ETM+) sensor data from the "mid-decadal" period of 2004 through 2006. The global map is comprised of thousands of scene locations and, for each location, tens of different images of varying quality to choose from. Furthermore, it is desirable for images of adjacent scenes to be close together in time of acquisition, to avoid obvious discontinuities due to seasonal changes. These characteristics make it desirable to formulate an automated solution to the problem of generating the complete map. This paper formulates the Global Map Generator problem as a Constraint Optimization Problem (GMG-COP) and describes an approach to solving it using local search. Preliminary results of running the algorithm on image data sets are summarized. The results suggest a significant improvement in map quality using constraint-based solutions. Copyright © 2007, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
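
    A toy version of the local search can make the formulation concrete: each scene location chooses one candidate image, and the objective trades off image quality (cloud fraction here) against acquisition-time gaps between adjacent scenes. The cost weights, neighbourhood structure (a simple chain of scenes) and data are all invented for illustration; the actual GMG-COP formulation is richer.

```python
import random

random.seed(0)

# Each scene location has candidate images: (acquisition_day, cloud_fraction).
candidates = [[(random.randrange(0, 900), random.random()) for _ in range(8)]
              for _ in range(20)]

def cost(sel):
    """Penalize cloudy picks and acquisition-time gaps between adjacent scenes."""
    cloud = sum(candidates[i][k][1] for i, k in enumerate(sel))
    gaps = sum(abs(candidates[i][sel[i]][0] - candidates[i + 1][sel[i + 1]][0])
               for i in range(len(sel) - 1))
    return cloud + 0.01 * gaps

# Local search: accept strictly improving single-scene swaps until none remain.
selection = [0] * len(candidates)
improved = True
while improved:
    improved = False
    for i in range(len(candidates)):
        for k in range(len(candidates[i])):
            trial = selection[:i] + [k] + selection[i + 1:]
            if cost(trial) < cost(selection):
                selection, improved = trial, True
print("final cost:", round(cost(selection), 2))
```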

  4. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  5. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.

  6. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  7. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first presents the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology that can be used in the development of automation components. We have attempted to adhere to open standards and technology for the development of automation components at the various layers. The paper also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)

  8. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    Full Text Available We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.

  9. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥106) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for

  10. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  11. A new method for automated high-dimensional lesion segmentation evaluated in vascular injury and applied to the human occipital lobe.

    Science.gov (United States)

    Mah, Yee-Haur; Jager, Rolf; Kennard, Christopher; Husain, Masud; Nachev, Parashkev

    2014-07-01

    Making robust inferences about the functional neuroanatomy of the brain is critically dependent on experimental techniques that examine the consequences of focal loss of brain function. Unfortunately, the use of the most comprehensive such technique, lesion-function mapping, is complicated by the need for time-consuming and subjective manual delineation of the lesions, greatly limiting the practicability of the approach. Here we exploit a recently-described general measure of statistical anomaly, zeta, to devise a fully-automated, high-dimensional algorithm for identifying the parameters of lesions within a brain image given a reference set of normal brain images. We proceed to evaluate such an algorithm in the context of diffusion-weighted imaging of the commonest type of lesion used in neuroanatomical research: ischaemic damage. Summary performance metrics exceed those previously published for diffusion-weighted imaging and approach the current gold standard, manual segmentation, sufficiently closely for fully-automated lesion-mapping studies to become a possibility. We apply the new method to 435 unselected images of patients with ischaemic stroke to derive a probabilistic map of the pattern of damage in lesions involving the occipital lobe, demonstrating the variation of anatomical resolvability of occipital areas so as to guide future lesion-function studies of the region. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Detailed Maps Depicting the Shallow-Water Benthic Habitats of the Northwestern Hawaiian Islands Derived from High Resolution IKONOS Satellite Imagery (Draft)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Detailed, shallow-water coral reef ecosystem maps were generated by rule-based, semi-automated image analysis of high-resolution satellite imagery for nine locations...

  13. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  14. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  15. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  16. Selection of Filtration Methods in the Analysis of Motion of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Dobrzańska Magdalena

    2016-08-01

    Full Text Available In this article, issues related to mapping the route and correcting errors in automated guided vehicle (AGV) movement are discussed. The nature and size of the disturbances were determined using runs registered in experimental studies. On the basis of this analysis, a number of numerical runs were generated which reproduce the runs obtainable during real vehicle movement. The resulting data set was used for further research. The aim of this paper was to test selected methods of digital filtering on the same data set and determine their effectiveness. The results of the simulation studies are presented in the article. The effectiveness of the various methods was determined, and conclusions were drawn on this basis.
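
    As an illustration of the kind of comparison described, the sketch below applies two standard digital filters, a moving average and a median filter, to a synthetic run contaminated with noise and outlier spikes, and scores each by RMSE against the reference path. The disturbance model and filter settings are assumptions; the article's own filter set is not specified in the record.

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(0)

# Synthetic AGV run: a smooth reference path plus noise and outlier spikes.
t = np.linspace(0, 10, 500)
truth = np.sin(0.5 * t)
noisy = truth + rng.normal(0.0, 0.05, t.size)
noisy[rng.choice(t.size, 10, replace=False)] += 1.0  # spike disturbances

def moving_average(x, window=15):
    return np.convolve(x, np.ones(window) / window, mode="same")

for name, filtered in [("moving average", moving_average(noisy)),
                       ("median, k=15  ", medfilt(noisy, kernel_size=15))]:
    rmse = np.sqrt(np.mean((filtered - truth) ** 2))
    print(f"{name}: RMSE = {rmse:.4f}")  # the median copes better with spikes
```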

  17. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  18. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In... Customs Automation Program (NCAP) test relating to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  19. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  20. What is missing? An operational inundation mapping framework by SAR data

    Science.gov (United States)

    Shen, X.; Anagnostou, E. N.; Zeng, Z.; Kettner, A.; Hong, Y.

    2017-12-01

    Compared to optical sensors, synthetic aperture radar (SAR) works day and night in all weather conditions. In addition, its spatial resolution does not decrease with the height of the platform, and it is thus applicable to a range of important studies. However, existing studies have not addressed the operational demands of real-time inundation mapping. The direct proof is that no water-body product exists for any SAR-based satellite. What, then, is missing between science and products? Automation and quality. What makes it so difficult to develop an operational inundation mapping technique based on SAR data? Spectrum-wise, unlike optical water indices such as MNDWI, AWEI, etc., where a relatively constant threshold may apply across image acquisitions, regions and sensors, the threshold separating water from non-water pixels has to be chosen individually for each SAR image. The optimization of this threshold is the first obstacle to the automation of SAR data algorithms. Morphologically, the quality and reliability of the results are compromised by over-detection caused by smooth surfaces and shadowed areas, by noise-like speckle, and by under-detection caused by strong-scatterer disturbance. In this study, we propose a three-step framework that addresses all the aforementioned issues of operational inundation mapping with SAR data. The framework consists of 1) optimization of Wishart distribution parameters for single/dual/fully-polarized SAR data, 2) morphological removal of over-detection, and 3) machine-learning-based removal of under-detection. The framework utilizes not only the SAR data but also the synergy of a digital elevation model (DEM) and fine-resolution optical-sensor-based products, including a water probability map, a land cover classification map (optional), and river width. The framework has been validated in multiple areas in different parts of the world using different satellite SAR data and globally available ancillary data products. Therefore, it has the potential
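
    The first two steps of the framework can be caricatured in a few lines: a per-image threshold on backscatter followed by morphological removal of small over-detections. Plain Otsu thresholding stands in for the record's Wishart-parameter optimization, and the synthetic scene and minimum object size are illustrative assumptions.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects

rng = np.random.default_rng(0)

# Synthetic backscatter (dB): bright land, a dark river band, dark speckle.
img = rng.normal(-8.0, 1.5, (64, 64))
img[28:36, :] = rng.normal(-18.0, 1.5, (8, 64))  # smooth water scatters little
img[rng.random((64, 64)) < 0.02] = -18.0         # isolated dark outliers

# Step 1 (simplified): per-image threshold; Otsu stands in for the Wishart fit.
water = img < threshold_otsu(img)

# Step 2: morphological clean-up removes small over-detected patches.
water_clean = remove_small_objects(water, min_size=20)
print("water pixels before/after clean-up:", water.sum(), water_clean.sum())
```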

  1. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    Science.gov (United States)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, such as is the case of rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been increasing rapidly. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, Geoeye and Worldview, are increasingly being considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events in VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km²-wide study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (true positive rate above 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular on the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.
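
    The object-based classification step can be sketched with scikit-learn: each segmented image object is described by a feature vector (e.g., mean band values, NDVI, slope from the pre-event DEM) and an SVM is trained to label it as background, landslide source, or transport area. The feature set, class structure and synthetic data below are assumptions for illustration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-object features: mean red, mean NIR, NDVI, mean DEM slope.
# Labels: 0 = background, 1 = landslide source, 2 = landslide transport.
X = rng.random((300, 4))
y = rng.integers(0, 3, 300)
X[y == 1, 3] += 0.8  # inject structure: source areas on steeper slopes
X[y == 2, 2] -= 0.5  # transport areas with lower NDVI

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```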

  2. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect where firms that have specialized in product types for which...... the Chinese exports to the world market has risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster...... productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation....

  3. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  4. From Trust in Automation to Decision Neuroscience: Applying Cognitive Neuroscience Methods to Understand and Improve Interaction Decisions Involved in Human Automation Interaction

    Science.gov (United States)

    Drnec, Kim; Marathe, Amar R.; Lukos, Jamie R.; Metcalfe, Jason S.

    2016-01-01

    Human automation interaction (HAI) systems have thus far failed to live up to expectations mainly because human users do not always interact with the automation appropriately. Trust in automation (TiA) has been considered a central influence on the way a human user interacts with an automation; if TiA is too high there will be overuse, if TiA is too low there will be disuse. However, even though extensive research into TiA has identified specific HAI behaviors, or trust outcomes, a unique mapping between trust states and trust outcomes has yet to be clearly identified. Interaction behaviors have been intensely studied in the domain of HAI and TiA and this has led to a reframing of the issues of problems with HAI in terms of reliance and compliance. We find the behaviorally defined terms reliance and compliance to be useful in their functionality for application in real-world situations. However, we note that once an inappropriate interaction behavior has occurred it is too late to mitigate it. We therefore take a step back and look at the interaction decision that precedes the behavior. We note that the decision neuroscience community has revealed that decisions are fairly stereotyped processes accompanied by measurable psychophysiological correlates. Two literatures were therefore reviewed. TiA literature was extensively reviewed in order to understand the relationship between TiA and trust outcomes, as well as to identify gaps in current knowledge. We note that an interaction decision precedes an interaction behavior and believe that we can leverage knowledge of the psychophysiological correlates of decisions to improve joint system performance. As we believe that understanding the interaction decision will be critical to the eventual mitigation of inappropriate interaction behavior, we reviewed the decision making literature and provide a synopsis of the state of the art understanding of the decision process from a decision neuroscience perspective. We forward

  5. From Trust in Automation to Decision Neuroscience: Applying Cognitive Neuroscience Methods to Understand and Improve Interaction Decisions Involved in Human Automation Interaction.

    Science.gov (United States)

    Drnec, Kim; Marathe, Amar R; Lukos, Jamie R; Metcalfe, Jason S

    2016-01-01

    Human automation interaction (HAI) systems have thus far failed to live up to expectations mainly because human users do not always interact with the automation appropriately. Trust in automation (TiA) has been considered a central influence on the way a human user interacts with an automation; if TiA is too high there will be overuse, if TiA is too low there will be disuse. However, even though extensive research into TiA has identified specific HAI behaviors, or trust outcomes, a unique mapping between trust states and trust outcomes has yet to be clearly identified. Interaction behaviors have been intensely studied in the domain of HAI and TiA and this has led to a reframing of the issues of problems with HAI in terms of reliance and compliance. We find the behaviorally defined terms reliance and compliance to be useful in their functionality for application in real-world situations. However, we note that once an inappropriate interaction behavior has occurred it is too late to mitigate it. We therefore take a step back and look at the interaction decision that precedes the behavior. We note that the decision neuroscience community has revealed that decisions are fairly stereotyped processes accompanied by measurable psychophysiological correlates. Two literatures were therefore reviewed. TiA literature was extensively reviewed in order to understand the relationship between TiA and trust outcomes, as well as to identify gaps in current knowledge. We note that an interaction decision precedes an interaction behavior and believe that we can leverage knowledge of the psychophysiological correlates of decisions to improve joint system performance. As we believe that understanding the interaction decision will be critical to the eventual mitigation of inappropriate interaction behavior, we reviewed the decision making literature and provide a synopsis of the state of the art understanding of the decision process from a decision neuroscience perspective. We forward

  6. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method, and its variant run on a hydrology-based relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.

  7. Aspects of the specification and implementation of the presentation layer of the MAP standard using the EPOS environment

    OpenAIRE

    Paulo Cesar Minoru Inazumi

    1990-01-01

    Abstract: This work presents the implementation of the presentation protocol of the OSI/ISO model for the SISDI-MAP project (MAP Didactic System). The EPOS ("Engineering Project Oriented System") environment was used as the development support, mainly for the formal specification of the implementation and for automatic code generation. The concepts related to the presentation layer, the implementation model, some aspects of the implementation and of the automatic code generation are presented, and ...

  8. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    Science.gov (United States)

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between
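
    The heart of the updating method, reusing archived hydraulic-model output as a stage-discharge relation, can be sketched in a few lines. The rating-curve values and the recalculated discharge below are hypothetical, and linear interpolation is an assumption (the study does not specify the scheme):

```python
import numpy as np

# Archived hydraulic-model output for one cross section (hypothetical values):
# discharge in cfs against water-surface stage in feet.
discharge = np.array([1000.0, 5000.0, 10000.0, 20000.0, 40000.0])
stage = np.array([10.2, 14.8, 17.5, 20.9, 24.6])

def updated_stage(new_discharge):
    """Interpolate a recalculated flood discharge on the stage-discharge
    relation built from archived model runs."""
    return float(np.interp(new_discharge, discharge, stage))

# A recalculated 100-year discharge maps to an updated flood stage, which a
# GIS then intersects with high-accuracy elevation data to delineate the flood.
print(f"updated stage: {updated_stage(26000.0):.1f} ft")
```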

  9. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training, and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  10. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  11. A Study on the Cost-Effectiveness of a SemiAutomated Cutting Process at a Garment Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Castro, Mark Daniel

    2017-11-01

    Full Text Available The subject of the study, Company X, has been experiencing variations in the quantity reports from the cutting department and the transmittal reports. The management found that these processes are heavily affected by manual labor. To reduce the system's proneness to human error, the management decided to explore the possibility of adopting a semi-automated spreading and cutting process. This research aims to evaluate the pre-sewing processes of Company X and to determine whether introducing automation can be beneficial to the company and the garments industry. The researchers used process mapping tools, descriptive research, and process flowcharts to assess the current and proposed systems, and engineering economics to evaluate the costs and benefits of implementing the semi-automated system. The results showed that with the implementation of the semi-automated system, the company will realize 66.61% more savings per year than with the current system. In terms of cycle time, the semi-automated system eliminates the relaxation of fabric before the cutting process, thereby greatly reducing cycle time. In addition, the researchers found that as long as the company produces more than 4,140 pieces per day, the system will be economically feasible. Unquantifiable benefits of introducing the semi-automated system were also identified. The company can have a cleaner work environment that will lead to more productivity and greater quality of goods. This will lead to a better company image that will encourage more customers to place job orders.
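    The break-even arithmetic in such a study can be sketched in a few lines of Python. The cost figures below are hypothetical, chosen only so the computation reproduces the reported threshold of 4,140 pieces per day; the study's actual inputs are not given in the abstract:

        # All figures hypothetical; only the resulting 4,140 pieces/day
        # break-even volume is taken from the study.
        daily_machine_cost = 2070.0    # amortized daily cost of the semi-automated system
        manual_cost_per_piece = 1.25   # labor + error cost per piece, manual process
        auto_cost_per_piece = 0.75     # cost per piece, semi-automated process

        # The system pays off once per-piece savings cover the daily machine cost.
        break_even = daily_machine_cost / (manual_cost_per_piece - auto_cost_per_piece)
        print(f"break-even volume: {break_even:.0f} pieces/day")  # -> 4140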

  12. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part of the reload safety analysis calculations is handled by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit, large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies
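    The orchestration problem CycleKit addresses, namely running large sets of calculations with complicated interdependencies, can be illustrated with a minimal sketch. This is not CycleKit's notation or API, just a dependency-ordered run of placeholder steps using Python's standard library:

        from graphlib import TopologicalSorter

        # Hypothetical calculation graph: each step lists its prerequisites.
        calcs = {
            "library_prep": set(),                      # lattice-code run (QUADRIGA's role)
            "cycle_depletion": {"library_prep"},
            "power_peaking": {"cycle_depletion"},
            "shutdown_margin": {"cycle_depletion"},
            "limit_check": {"power_peaking", "shutdown_margin"},
        }

        # Run every calculation only after its prerequisites have completed.
        for step in TopologicalSorter(calcs).static_order():
            print("running", step)   # placeholder for launching the physics code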

  13. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  14. The automated reference toolset: A soil-geomorphic ecological potential matching algorithm

    Science.gov (United States)

    Nauman, Travis; Duniway, Michael C.

    2016-01-01

    Ecological inventory and monitoring data need referential context for interpretation. Identification of appropriate reference areas of similar ecological potential for site comparison is demonstrated using a newly developed automated reference toolset (ART). Foundational to identification of reference areas was a soil map of particle size in the control section (PSCS), a theme in US Soil Taxonomy. A 30-m resolution PSCS map of the Colorado Plateau (366,000 km2) was created by interpolating ∼5000 field soil observations using a random forest model and a suite of raster environmental spatial layers representing topography, climate, general ecological community, and satellite imagery ratios. The PSCS map had an overall out-of-bag accuracy of 61.8% (Kappa of 0.54, p < 0.0001), and an independent validation accuracy of 93.2% at a set of 356 field plots along the southern edge of Canyonlands National Park, Utah. The ART process was also tested at these plots, and matched plots with the same ecological sites (ESs) 67% of the time where sites fell within 2-km buffers of each other. These results show that the PSCS and ART have strong application for ecological monitoring and sampling design, as well as for assessing impacts of disturbance and land management action using an ecological potential framework. Results also demonstrate that PSCS could be a key mapping layer for the USDA-NRCS provisional ES development initiative.
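    A minimal sketch of the PSCS mapping step, assuming the ~5000 field observations have already been sampled onto a table of raster covariates; the data here are random stand-ins, not the study's:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 12))    # stand-in covariates: topography, climate, imagery ratios
        y = rng.integers(0, 6, size=5000)  # stand-in PSCS classes at the field observations

        # Out-of-bag scoring mirrors the accuracy assessment described above.
        rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
        rf.fit(X, y)
        print(f"out-of-bag accuracy: {rf.oob_score_:.3f}")  # the study reports 61.8%

        # The fitted model is then applied to every 30-m raster cell to produce the map.
        pscs_map = rf.predict(rng.normal(size=(4, 12)))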

  15. Haven't a Cue? Mapping the CUE Space as an Aid to HRA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; Ronald L Boring; Jacques Hugo; William Phoenix

    2012-06-01

    Advances in automation present a new modeling environment for the human reliability analysis (HRA) practitioner. Many, if not most, current-day HRA methods have their origin in characterizing and quantifying human performance in analog environments where mode awareness and system status indications are potentially less comprehensive, but simpler to comprehend at a glance, when compared to advanced presentation systems. The introduction of highly complex automation has the potential to lead to decreased levels of situation awareness caused by the need for increased monitoring; confusion regarding the often non-obvious causes of automation failures; and emergent system dependencies that formerly may have been uncharacterized. Understanding the relation of incoming cues available to operators during plant upset conditions, in conjunction with operating procedures, yields insight into the nature of the expected operator response in this control room environment. Static systems methods such as fault trees do not contain the appropriate temporal information or necessarily specify the relationship among cues leading to operator response. In this paper, we do not attempt to replace the standard performance shaping factors commonly used in HRA, nor do we offer a new HRA method; existing methods may suffice. Instead, we strive to enhance the current understanding of the basis for operator response through a technique that can be used during the qualitative portion of the HRA analysis process. The CUE map is a means to visualize the relationships among salient cues in the control room that influence operator response and to show how the cognitive map of the operator changes as information is gained or lost; it is applicable to existing plants as well as advanced hybrid plants and small modular reactor designs. A brief application involving loss of condensate is presented, and advantages and limitations of the modeling approach and use of the CUE map are discussed.

  16. Publishing Platform for Aerial Orthophoto Maps, the Complete Stack

    Science.gov (United States)

    Čepický, J.; Čapek, L.

    2016-06-01

    When creating sets of orthophoto maps from mosaic compositions acquired with airborne systems, such as popular drones, we need to publish the results of the work to users. Several steps need to be performed in order to get large-scale raster data published. As a first step, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service). For some applications, OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application, so that they can be used and evaluated by non-technical users. In this talk, we present an automated pipeline of those steps, where the user puts in an orthophoto image and, as a result, OGC Open Web Services are published along with a web mapping application containing the data. The web mapping application can be used as a standard presentation platform for this type of big raster data for the generic user. The publishing platform - the Geosense online map information system - can also be used to combine data from various resources, to create unique map compositions, and as input for better interpretation of the photographed phenomena. The whole process has been successfully tested with the eBee drone, with raster data resolutions of 1.5-4 cm/px, on many areas, and the result is also used to create derived datasets, usually suited for property management - records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.
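    On the consumer side of such a pipeline, a published OGC WMS view service can be queried programmatically. A sketch using the OWSLib library, with a placeholder service URL and layer name (not the Geosense endpoint):

        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
        img = wms.getmap(
            layers=["drone_orthophoto"],        # hypothetical published layer
            srs="EPSG:4326",
            bbox=(14.40, 50.05, 14.44, 50.08),  # lon/lat extent of the mosaic
            size=(512, 512),
            format="image/png",
        )
        # Save a rendered map image returned by the view service.
        with open("orthophoto_preview.png", "wb") as f:
            f.write(img.read())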

  17. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    Science.gov (United States)

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate functional significance of a majority of genome sequences and their long-range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve the reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools propose an expert mode in which the user can decide on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long-insert-size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected on the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes, compared to the previous 70%. Unknown sites (N) were reduced from 17.3% to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in
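    The N50 statistic quoted above can be computed with a short function: N50 is the largest length L such that scaffolds of length at least L cover half of the assembly. A sketch with illustrative scaffold lengths:

        def n50(lengths):
            """Return the N50 of a list of scaffold lengths."""
            total = sum(lengths)
            running = 0
            for length in sorted(lengths, reverse=True):
                running += length
                if running * 2 >= total:   # half of the assembly is covered
                    return length

        print(n50([3_000_000, 2_000_000, 1_000_000, 500_000]))  # -> 2000000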

  18. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and experimental sides of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation usually is defined in terms of what it is not. This is partly due to a lack of adequate models of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed NPP crews from the NPP at Loviisa, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted automation least and rated crew performance lowest in situations where crew performance was efficient, and vice versa. The results from procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operators' judgement of crew performance efficiency

  19. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  20. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  1. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  2. SEMI-AUTOMATED APPROACH FOR MAPPING URBAN TREES FROM INTEGRATED AERIAL LiDAR POINT CLOUD AND DIGITAL IMAGERY DATASETS

    Directory of Open Access Journals (Sweden)

    M. A. Dogon-Yaro

    2016-09-01

    Full Text Available Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints, such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to most studies on tree extraction, which focus on purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.
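    A minimal sketch of the fusion idea, assuming co-registered rasters: a normalized digital surface model (nDSM) derived from the LiDAR point cloud and NIR/red bands from the digital imagery. The thresholds and data are illustrative, not those of the study:

        import numpy as np

        ndsm = np.random.uniform(0, 20, (100, 100))  # height above ground, metres
        nir = np.random.uniform(0, 1, (100, 100))    # near-infrared reflectance
        red = np.random.uniform(0, 1, (100, 100))    # red reflectance

        ndvi = (nir - red) / (nir + red + 1e-9)      # vegetation index from the imagery

        # Candidate tree pixels: tall enough to exclude grass and roads,
        # green enough to exclude buildings of similar height.
        tree_mask = (ndsm > 2.0) & (ndvi > 0.3)
        print(f"canopy cover: {tree_mask.mean():.1%}")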

  3. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.
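    As a concrete example of the kind of functional test that lends itself to automation, the sketch below exercises a stand-in Cart class through its public interface with pytest; a real suite would import the application under test instead:

        import pytest

        class Cart:
            """Stand-in system under test; a real suite would import the app."""
            def __init__(self):
                self.items = []
            def add(self, sku, price, qty):
                self.items.append((sku, price, qty))
            def total(self):
                return sum(price * qty for _, price, qty in self.items)
            def checkout(self):
                if not self.items:
                    raise ValueError("empty cart")

        def test_total_updates_when_item_added():
            cart = Cart()
            cart.add(sku="A-100", price=9.99, qty=2)
            assert cart.total() == pytest.approx(19.98)

        def test_empty_cart_cannot_checkout():
            with pytest.raises(ValueError):
                Cart().checkout()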

  4. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  5. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or diluting of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  6. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: either an interpretation of the physical model of human interactions, or behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
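    The first, binary stage of such a classifier can be sketched as follows; the behavioural features (e.g., queries per minute, click-through rate, inter-query interval regularity) and the labels here are random stand-ins for the paper's query-log data:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.normal(size=(2000, 4))     # per-session behavioural features (stand-ins)
        y = rng.integers(0, 2, size=2000)  # 1 = automated session, 0 = human session

        # Train a binary human-vs-bot classifier and check held-out accuracy.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = LogisticRegression().fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")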

  7. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  8. Retrieval Algorithms for Road Surface Modelling Using Laser-Based Mobile Mapping

    Directory of Open Access Journals (Sweden)

    Antero Kukko

    2008-09-01

    Full Text Available Automated processing of the data provided by a laser-based mobile mapping system will be a necessity due to the huge amount of data produced. In the future, vehicle-based laser scanning, here called mobile mapping, should see considerable use for road environment modelling. Since the geometry of the scanning and the point density are different from airborne laser scanning, new algorithms are needed for information extraction. In this paper, we propose automatic methods for classifying the road marking and kerbstone points and modelling the road surface as a triangulated irregular network. On the basis of experimental tests, the mean classification accuracies obtained using the automatic methods for lines, zebra crossings and kerbstones were 80.6%, 92.3% and 79.7%, respectively.
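    The road-surface modelling step, building a triangulated irregular network over classified ground points, can be sketched with SciPy; the points below are random stand-ins for mobile mapping returns:

        import numpy as np
        from scipy.spatial import Delaunay

        # Stand-in road-surface points (x, y, z): a 50 m stretch, 8 m wide.
        pts = np.random.rand(500, 3) * [50.0, 8.0, 0.2]

        # Triangulate in the horizontal plane to form the TIN.
        tin = Delaunay(pts[:, :2])
        print(f"{len(tin.simplices)} triangles")

        # pts[tin.simplices] gives the 3-D vertices of every facet of the model.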

  9. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less-than-optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  10. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  11. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. The objective here is to outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process and the important considerations in clinical chemistry automation, followed by decisions and implementation, and conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and the narrative and tabular suggestions provided.

  12. Mapping land cover through time with the Rapid Land Cover Mapper—Documentation and user manual

    Science.gov (United States)

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2017-02-15

    The Rapid Land Cover Mapper is an Esri ArcGIS® Desktop add-in, which was created as an alternative to automated or semiautomated mapping methods. Based on a manual photo interpretation technique, the tool facilitates mapping over large areas and through time, and produces time-series raster maps and associated statistics that characterize the changing landscapes. The Rapid Land Cover Mapper add-in can be used with any imagery source to map various themes (for instance, land cover, soils, or forest) at any chosen mapping resolution. The user manual contains all essential information for the user to make full use of the Rapid Land Cover Mapper add-in. This manual includes a description of the add-in functions and capabilities, and step-by-step procedures for using the add-in. The Rapid Land Cover Mapper add-in was successfully used by the U.S. Geological Survey West Africa Land Use Dynamics team to accurately map land use and land cover in 17 West African countries through time (1975, 2000, and 2013).

  13. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  14. Work Planing Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation, installation possibilities and future outlook at a mechanical subdivision. The aims are to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  15. TU-D-209-03: Alignment of the Patient Graphic Model Using Fluoroscopic Images for Skin Dose Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Oines, A; Oines, A; Kilian-Meneghin, J; Karthikeyan, B; Rudin, S; Bednarek, D [University at Buffalo (SUNY) School of Med., Buffalo, NY (United States)

    2016-06-15

    Purpose: The Dose Tracking System (DTS) was developed to provide real-time feedback of skin dose and dose rate during interventional fluoroscopic procedures. A color map on a 3D graphic of the patient represents the cumulative dose distribution on the skin. Automated image correlation algorithms are described which use the fluoroscopic procedure images to align and scale the patient graphic for more accurate dose mapping. Methods: Currently, the DTS employs manual patient graphic selection and alignment. To improve the accuracy of dose mapping and automate the software, various methods are explored to extract information about the beam location and patient morphology from the procedure images. To match patient anatomy with a reference projection image, preprocessing is first used, including edge enhancement, edge detection, and contour detection. Template matching algorithms from OpenCV are then employed to find the location of the beam. Once a match is found, the reference graphic is scaled and rotated to fit the patient, using image registration correlation functions in Matlab. The algorithm runs correlation functions for all points and maps all correlation confidences to a surface map. The highest point of correlation is used for alignment and scaling. The transformation data is saved for later model scaling. Results: Anatomic recognition is used to find matching features between the model and the image, and image registration correlation provides for alignment and scaling at any rotation angle with less than one-second runtime, and at noise levels in excess of 150% of those found in normal procedures. Conclusion: The algorithm provides the necessary scaling and alignment tools to improve the accuracy of dose distribution mapping on the patient graphic with the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
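    The OpenCV template-matching step described in the Methods can be sketched as follows, with random arrays standing in for the preprocessed fluoroscopic frame and the reference projection of the patient graphic:

        import cv2
        import numpy as np

        frame = np.random.randint(0, 255, (512, 512), np.uint8)    # stand-in fluoro image
        template = np.random.randint(0, 255, (64, 64), np.uint8)   # stand-in reference patch

        # Slide the template over the frame; the result is a correlation surface.
        scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, confidence, _, top_left = cv2.minMaxLoc(scores)  # peak = best match

        print(f"best match at {top_left}, confidence {confidence:.2f}")
        # The peak location then drives the scaling/rotation of the patient graphic.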

  16. TU-D-209-03: Alignment of the Patient Graphic Model Using Fluoroscopic Images for Skin Dose Mapping

    International Nuclear Information System (INIS)

    Oines, A; Oines, A; Kilian-Meneghin, J; Karthikeyan, B; Rudin, S; Bednarek, D

    2016-01-01

    Purpose: The Dose Tracking System (DTS) was developed to provide real-time feedback of skin dose and dose rate during interventional fluoroscopic procedures. A color map on a 3D graphic of the patient represents the cumulative dose distribution on the skin. Automated image correlation algorithms are described which use the fluoroscopic procedure images to align and scale the patient graphic for more accurate dose mapping. Methods: Currently, the DTS employs manual patient graphic selection and alignment. To improve the accuracy of dose mapping and automate the software, various methods are explored to extract information about the beam location and patient morphology from the procedure images. To match patient anatomy with a reference projection image, preprocessing is first used, including edge enhancement, edge detection, and contour detection. Template matching algorithms from OpenCV are then employed to find the location of the beam. Once a match is found, the reference graphic is scaled and rotated to fit the patient, using image registration correlation functions in Matlab. The algorithm runs correlation functions for all points and maps all correlation confidences to a surface map. The highest point of correlation is used for alignment and scaling. The transformation data is saved for later model scaling. Results: Anatomic recognition is used to find matching features between the model and the image, and image registration correlation provides for alignment and scaling at any rotation angle with less than one-second runtime, and at noise levels in excess of 150% of those found in normal procedures. Conclusion: The algorithm provides the necessary scaling and alignment tools to improve the accuracy of dose distribution mapping on the patient graphic with the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  17. NASA Goddard Space Flight Center Robotic Processing System Program Automation Systems, volume 2

    Science.gov (United States)

    Dobbs, M. E.

    1991-01-01

    Topics related to robot-operated materials processing in space (RoMPS) are presented in view graph form. Some of the areas covered include: (1) mission requirements; (2) automation management system; (3) Space Transportation System (STS) Hitchhiker Payload; (4) Spacecraft Command Language (SCL) scripts; (5) SCL software components; (6) RoMPS EasyLab Command & Variable summary for rack stations and annealer module; (7) support electronics assembly; (8) SCL uplink packet definition; (9) SC-4 EasyLab System Memory Map; (10) Servo Axis Control Logic Suppliers; and (11) annealing oven control subsystem.

  18. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, concerns have been voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, in which a single automation failure occurred. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.

  19. Diagnosis of regional cerebral blood flow abnormalities using SPECT: agreement between individualized statistical parametric maps and visual inspection by nuclear medicine physicians with different levels of expertise in nuclear neurology

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Euclides Timoteo da, E-mail: euclidestimoteo@uol.com.b [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer. Dept. de Medicina Nuclear; Buchpiguel, Carlos Alberto [Hospital do Coracao, Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Nitrini, Ricardo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Dept. de Neurologia; Tazima, Sergio [Hospital Alemao Oswaldo Cruz (HAOC), Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Peres, Stela Verzinhase [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer; Busatto Filho, Geraldo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Div. de Medicina Nuclear

    2009-07-01

    Introduction: visual analysis is widely used to interpret regional cerebral blood flow (rCBF) SPECT images in clinical practice despite its limitations. Automated methods are employed to investigate between-group rCBF differences in research studies but have rarely been explored in individual analyses. Objectives: to compare visual inspection by nuclear physicians with the automated statistical parametric mapping program using a SPECT dataset of patients with neurological disorders and normal control images. Methods: using statistical parametric mapping, 14 SPECT images from patients with various neurological disorders were compared individually with a databank of 32 normal images using a statistical threshold of p<0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). Statistical parametric mapping results were compared with visual analyses by a nuclear physician highly experienced in neurology (A) as well as a nuclear physician with a general background of experience (B), who independently classified images as normal or altered, and determined the location of changes and the severity. Results: of the 32 images of the normal databank, 4 generated maps showing rCBF abnormalities (p<0.05, corrected). Among the 14 images from patients with neurological disorders, 13 showed rCBF alterations. Statistical parametric mapping and physician A completely agreed on 84.37% and 64.28% of cases from the normal databank and neurological disorders, respectively. The agreement between statistical parametric mapping and the ratings of physician B was lower (71.18% and 35.71%, respectively). Conclusion: statistical parametric mapping replicated the findings described by the more experienced nuclear physician. This finding suggests that automated methods for individually analyzing rCBF SPECT images may be a valuable resource to complement visual inspection in clinical practice. (author)
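    The statistical idea behind such individualized maps, comparing one patient image voxel-wise against a normal databank, can be sketched as a z-score map; statistical parametric mapping itself adds spatial preprocessing and corrected inference on top of this:

        import numpy as np

        # 32 normal rCBF images and one patient image (random stand-ins).
        normals = np.random.normal(50.0, 5.0, size=(32, 32, 32, 32))
        patient = np.random.normal(50.0, 5.0, size=(32, 32, 32))

        mu = normals.mean(axis=0)
        sd = normals.std(axis=0, ddof=1)
        z = (patient - mu) / sd                 # voxel-wise deviation map

        # An (uncorrected) threshold flags candidate rCBF alterations; SPM
        # additionally corrects for multiple comparisons at voxel/cluster level.
        print(f"{(np.abs(z) > 3.0).sum()} suprathreshold voxels")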

  20. QuickRNASeq lifts large-scale RNA-seq data analyses to the next level of automation and interactive visualization.

    Science.gov (United States)

    Zhao, Shanrong; Xi, Li; Quan, Jie; Xi, Hualin; Zhang, Ying; von Schack, David; Vincent, Michael; Zhang, Baohong

    2016-01-08

    RNA sequencing (RNA-seq), a next-generation sequencing technique for transcriptome profiling, is being increasingly used, in part driven by the decreasing cost of sequencing. Nevertheless, the analysis of the massive amounts of data generated by large-scale RNA-seq remains a challenge. Multiple algorithms pertinent to basic analyses have been developed, and there is an increasing need to automate the use of these tools so as to obtain results in an efficient and user-friendly manner. Increased automation and improved visualization of the results will help make the results and findings of the analyses readily available to experimental scientists. By combining the best open source tools developed for RNA-seq data analyses and the most advanced web 2.0 technologies, we have implemented QuickRNASeq, a pipeline for large-scale RNA-seq data analyses and visualization. The QuickRNASeq workflow consists of three main steps. In Step #1, each individual sample is processed, including mapping RNA-seq reads to a reference genome, counting the numbers of mapped reads, quality control of the aligned reads, and SNP (single nucleotide polymorphism) calling. Step #1 is computationally intensive, and can be processed in parallel. In Step #2, the results from individual samples are merged, and an integrated and interactive project report is generated. All analysis results in the report are accessible via a single HTML entry webpage. Step #3 is the data interpretation and presentation step. The rich visualization features implemented here allow end users to interactively explore the results of RNA-seq data analyses, and to gain more insights into RNA-seq datasets. In addition, we used a real-world dataset to demonstrate the simplicity and efficiency of QuickRNASeq in RNA-seq data analyses and interactive visualizations. The seamless integration of automated capabilities with interactive visualizations in QuickRNASeq is not available in other published RNA-seq pipelines. The high degree
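    Step #1's per-sample parallelism can be sketched with the standard library; process_sample below is a placeholder for the real alignment, counting, QC and SNP-calling chain, not QuickRNASeq's own code:

        from concurrent.futures import ProcessPoolExecutor

        def process_sample(sample_id: str) -> dict:
            # Placeholder for: map reads -> count mapped reads -> QC -> call SNPs.
            return {"sample": sample_id, "mapped_reads": 42}

        if __name__ == "__main__":
            samples = ["S1", "S2", "S3", "S4"]
            # Each sample is independent, so Step #1 parallelizes cleanly.
            with ProcessPoolExecutor() as pool:
                per_sample = list(pool.map(process_sample, samples))
            # Step #2 then merges per-sample outputs into one interactive report.
            print({r["sample"]: r["mapped_reads"] for r in per_sample})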

  1. Modeling of prepregs during automated draping sequences

    Science.gov (United States)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, and at assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold, which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool, and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.
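    The kinematic mapping step can be illustrated for the simplest case: mapping a flat fabric grid onto a single-curved (cylindrical) mold while preserving arc length. A double-curved mold requires the full algorithm; this sketch only conveys the idea of generating target points for the sequence planner:

        import numpy as np

        R = 0.5                         # mold radius in metres (illustrative)
        u = np.linspace(0.0, 0.8, 9)    # arc-length coordinate across the mold
        v = np.linspace(0.0, 0.6, 7)    # coordinate along the cylinder axis
        U, V = np.meshgrid(u, v)

        # Wrap the flat grid onto the cylinder; arc length along U is preserved
        # exactly, so fabric yarns are neither stretched nor compressed.
        X = R * np.sin(U / R)
        Y = V
        Z = R * (1.0 - np.cos(U / R))

        targets = np.stack([X, Y, Z], axis=-1)  # 7 x 9 grid of mold target points
        print(targets.shape)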

  2. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchase system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  3. A reliable spatially normalized template of the human spinal cord--Applications to automated white matter/gray matter segmentation and tensor-based morphometry (TBM) mapping of gray matter alterations occurring with age.

    Science.gov (United States)

    Taso, Manuel; Le Troter, Arnaud; Sdika, Michaël; Cohen-Adad, Julien; Arnoux, Pierre-Jean; Guye, Maxime; Ranjeva, Jean-Philippe; Callot, Virginie

    2015-08-15

    Recently, a T2*-weighted template and probabilistic atlas of the white and gray matter (WM, GM) of the spinal cord (SC) have been reported. Such a template can be used as tissue priors for automated WM/GM segmentation, but can also provide a common reference and normalized space for group studies. Here, a new template has been created (AMU40), and the accuracy of automatic template-based WM/GM segmentation was quantified. The feasibility of tensor-based morphometry (TBM) for studying voxel-wise morphological differences of the SC between young and elderly healthy volunteers was also investigated. Sixty-five healthy subjects were divided into young (n = 40, age < 50 years old) and elderly (age > 50 years old, mean age 57±5 years old) groups and scanned at 3T using an axial high-resolution T2*-weighted sequence. Inhomogeneity correction and affine intensity normalization of the SC and cerebrospinal fluid (CSF) signal intensities across slices were performed prior to both construction of the AMU40 template and WM/GM template-based segmentation. The segmentation was achieved using non-linear spatial normalization of T2*-w MR images to the AMU40 template. Validation of the WM/GM segmentations was performed with a leave-one-out procedure by calculating DICE similarity coefficients between manual and automated WM/GM masks. SC morphological differences between young and elderly healthy volunteers were assessed using the same non-linear spatial normalization of the subjects' MRI to a common template, derivation of the Jacobian determinant maps from the warping fields, and a TBM analysis. Results demonstrated robust automated WM/GM segmentation, with mean DICE values greater than 0.8. Concerning the TBM analysis, an anterior GM atrophy was highlighted in elderly volunteers, demonstrating thereby, for the first time, the feasibility of studying local structural alterations in the SC using tensor-based morphometry. This holds great promise for studies of morphological impairment occurring in several central nervous system
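    The DICE similarity coefficient used for validation is straightforward to compute from binary masks; a sketch with toy masks:

        import numpy as np

        def dice(a, b):
            """Dice similarity: 2|A∩B| / (|A| + |B|) for binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        manual = np.zeros((64, 64), bool); manual[20:40, 20:40] = True
        automated = np.zeros((64, 64), bool); automated[22:42, 20:40] = True
        print(f"DICE = {dice(manual, automated):.2f}")  # values > 0.8 reported above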

  4. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, in which the stimulus size is modulated during testing based on stimulus intensity, with that of conventional standard automated perimetry with the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated. The visual-field defect size was smaller on size modulation standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry, and the test duration differed between the two tests (p = 0.02). Global indices and the point-wise threshold values of the two testing modalities correlated well. However, a large stimulus presented at an area of decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  5. Flow Mapping in a Gas-Solid Riser via Computer Automated Radioactive Particle Tracking (CARPT)

    Energy Technology Data Exchange (ETDEWEB)

    Muthanna Al-Dahhan; Milorad P. Dudukovic; Satish Bhusarapu; Timothy J. O' hern; Steven Trujillo; Michael R. Prairie

    2005-06-04

    Statement of the Problem: Developing and disseminating a general and experimentally validated model for turbulent multiphase fluid dynamics, suitable for engineering design purposes in industrial-scale applications of riser reactors and pneumatic conveying, requires collecting reliable data on solids trajectories, velocities (averaged and instantaneous), solids holdup distribution, and solids fluxes in the riser as a function of operating conditions. Such data are currently not available on the same system. The Multiphase Fluid Dynamics Research Consortium (MFDRC) was established to address these issues on a chosen example of a circulating fluidized bed (CFB) reactor, which is widely used in the petroleum and chemical industries, including coal combustion. This project addresses the problem of the lack of reliable data to advance CFB technology. Project Objectives: The objective of this project is to advance the understanding of the solids flow pattern and mixing in a well-developed flow region of a gas-solid riser, operated at different gas flow rates and solids loadings, using state-of-the-art non-intrusive measurements. This work creates insight and a reliable database for local solids fluid-dynamic quantities in a pilot-plant-scale CFB, which can then be used to validate and develop phenomenological models for the riser. This study also attempts to provide benchmark data for validation of Computational Fluid Dynamics (CFD) codes and their current closures. Technical Approach: The non-invasive Computer Automated Radioactive Particle Tracking (CARPT) technique provides the complete Eulerian solids flow field (time-averaged velocity map and various turbulence parameters such as the Reynolds stresses, turbulent kinetic energy, and eddy diffusivities). It also gives directly the Lagrangian information of solids flow and yields the true solids residence time distribution (RTD). Another radiation-based technique, Computed Tomography (CT), yields detailed time-averaged local holdup profiles at
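    Turning CARPT's Lagrangian tracer trajectories into an Eulerian time-averaged velocity map amounts to binning instantaneous velocities on a spatial grid; a one-dimensional sketch with a synthetic trajectory:

        import numpy as np

        rng = np.random.default_rng(2)
        t = np.arange(0.0, 60.0, 0.02)                        # tracer sampled at 50 Hz
        z = np.abs(np.cumsum(rng.normal(0.0, 5e-3, t.size)))  # synthetic axial position, m
        vz = np.gradient(z, t)                                # instantaneous axial velocity

        # Bin instantaneous velocities by axial position to get the
        # time-averaged (Eulerian) velocity profile along the riser.
        edges = np.linspace(0.0, z.max() + 1e-9, 21)
        idx = np.digitize(z, edges)
        mean_vz = [vz[idx == i].mean() for i in range(1, edges.size) if (idx == i).any()]
        print(len(mean_vz), "axial bins with data")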

  6. FESetup: Automating Setup for Alchemical Free Energy Simulations.

    Science.gov (United States)

    Loeffler, Hannes H; Michel, Julien; Woods, Christopher

    2015-12-28

    FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.
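    The maximum common substructure search at the heart of the atom-mapping step can be sketched with RDKit (FESetup's own implementation may differ); the ligand pair here is hypothetical:

        from rdkit import Chem
        from rdkit.Chem import rdFMCS

        lig_a = Chem.MolFromSmiles("c1ccccc1CCO")  # hypothetical ligand pair for an
        lig_b = Chem.MolFromSmiles("c1ccccc1CCN")  # alchemical transformation

        # Find the maximum common substructure shared by the two ligands.
        mcs = rdFMCS.FindMCS([lig_a, lig_b])
        core = Chem.MolFromSmarts(mcs.smartsString)

        # Matched atom indices in each ligand define the single-topology mapping;
        # the unmatched atoms are the ones perturbed in the transformation.
        print(lig_a.GetSubstructMatch(core), lig_b.GetSubstructMatch(core))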

  7. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user

  8. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  9. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 inch film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, already applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image-content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system. (author)

  10. Capability Assessment and Performance Metrics for the Titan Multispectral Mapping Lidar

    Directory of Open Access Journals (Sweden)

    Juan Carlos Fernandez-Diaz

    2016-11-01

In this paper we present a description of a new multispectral airborne mapping light detection and ranging (lidar) system along with performance results obtained from two years of data collection and test campaigns. The Titan multiwave lidar is manufactured by Teledyne Optech Inc. (Toronto, ON, Canada) and emits laser pulses in the 1550, 1064 and 532 nm wavelengths simultaneously through a single oscillating mirror scanner at pulse repetition frequencies (PRF) that range from 50 to 300 kHz per wavelength (max combined PRF of 900 kHz). The Titan system can perform simultaneous mapping in terrestrial and very shallow water environments and its multispectral capability enables new applications, such as the production of false color active imagery derived from the lidar return intensities and the automated classification of target and land covers. Field tests and mapping projects performed over the past two years demonstrate capabilities to classify five land covers in urban environments with an accuracy of 90%, map bathymetry under more than 15 m of water, and map thick vegetation canopies at sub-meter vertical resolutions. In addition to its multispectral and performance characteristics, the Titan system is designed with several redundancies and diversity schemes that have proven to be beneficial for both operations and the improvement of data quality.

  11. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  12. Automated main-chain model building by template matching and iterative fragment extension.

    Science.gov (United States)

    Terwilliger, Thomas C

    2003-01-01

An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and beta-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and beta-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more C(alpha) positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
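As a hedged illustration of the FFT-based template matching step (here in 2D and with random data rather than an electron-density map), the cross-correlation of a map with a template can be evaluated at every offset with three FFTs; peaks mark candidate placements.

```python
import numpy as np

def fft_cross_correlation(density, template):
    """Correlate a small template against a map at every offset via FFT."""
    pad = np.zeros(density.shape)
    pad[:template.shape[0], :template.shape[1]] = template
    corr = np.fft.ifftn(np.fft.fftn(density) * np.conj(np.fft.fftn(pad))).real
    return corr  # peaks mark candidate template placements

rng = np.random.default_rng(0)
dens = rng.normal(size=(64, 64))
tmpl = dens[10:18, 20:28].copy()            # plant a known motif
peak = np.unravel_index(np.argmax(fft_cross_correlation(dens, tmpl)), dens.shape)
print(peak)                                  # recovers the offset (10, 20)
```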

  13. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

Development of an automated PCB inspection system as per the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention on the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  14. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  15. Forelimb training drives transient map reorganization in ipsilateral motor cortex.

    Science.gov (United States)

    Pruitt, David T; Schmid, Ariel N; Danaphongse, Tanya T; Flanagan, Kate E; Morrison, Robert A; Kilgard, Michael P; Rennaker, Robert L; Hays, Seth A

    2016-10-15

    Skilled motor training results in reorganization of contralateral motor cortex movement representations. The ipsilateral motor cortex is believed to play a role in skilled motor control, but little is known about how training influences reorganization of ipsilateral motor representations of the trained limb. To determine whether training results in reorganization of ipsilateral motor cortex maps, rats were trained to perform the isometric pull task, an automated motor task that requires skilled forelimb use. After either 3 or 6 months of training, intracortical microstimulation (ICMS) mapping was performed to document motor representations of the trained forelimb in the hemisphere ipsilateral to that limb. Motor training for 3 months resulted in a robust expansion of right forelimb representation in the right motor cortex, demonstrating that skilled motor training drives map plasticity ipsilateral to the trained limb. After 6 months of training, the right forelimb representation in the right motor cortex was significantly smaller than the representation observed in rats trained for 3 months and similar to untrained controls, consistent with a normalization of motor cortex maps. Forelimb map area was not correlated with performance on the trained task, suggesting that task performance is maintained despite normalization of cortical maps. This study provides new insights into how the ipsilateral cortex changes in response to skilled learning and may inform rehabilitative strategies to enhance cortical plasticity to support recovery after brain injury. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Creating soil moisture maps based on radar satellite imagery

    Science.gov (United States)

    Hnatushenko, Volodymyr; Garkusha, Igor; Vasyliev, Volodymyr

    2017-10-01

The presented work relates to a study of mapping soil moisture based on radar data from Sentinel-1 and a test of the adequacy of models constructed on the basis of data obtained from alternative sources. Radar signals are reflected from the ground differently, depending on its properties. In radar images obtained, for example, in the C band of the electromagnetic spectrum, soils saturated with moisture usually appear in dark tones. Although, at first glance, the problem of constructing moisture maps based on radar data seems intuitively clear, an implementation based on Sentinel-1 data on an industrial scale and in the public domain is not yet available. In the process of mapping, for verification of the results, measurements of soil moisture obtained from logs of the NOAA US Climate Reference Network (USCRN) were used. This network covers almost the entire territory of the United States. Data from the passive microwave radiometers of the Aqua and SMAP satellites were used for comparison. In addition, other supplementary cartographic materials were used, such as maps of soil types and existing moisture maps. The paper presents a comparison of the effect of certain methods of coarsening the radar data on the resulting moisture maps. Regression models were constructed showing the dependence of backscatter coefficient values (Sigma0) for calibrated radar data of different spatial resolutions, obtained at different times, on soil moisture values. The resulting soil moisture maps of the studied territories, as well as conceptual solutions for automating the construction of such digital maps, are presented. A comparative assessment of the time required for processing a given set of radar scenes with the developed tools and with the ESA SNAP product was carried out.
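A minimal sketch of the kind of regression model described, relating calibrated backscatter (Sigma0, in dB) to station-measured soil moisture; the numbers below are synthetic, and a real workflow would fit separate models per resolution and acquisition date.

```python
import numpy as np

sigma0_db = np.array([-14.2, -13.1, -12.5, -11.8, -10.9, -10.1])  # example dB
moisture = np.array([0.08, 0.12, 0.16, 0.19, 0.24, 0.28])         # m3/m3

slope, intercept = np.polyfit(sigma0_db, moisture, 1)   # linear fit
pred = slope * sigma0_db + intercept
rmse = np.sqrt(np.mean((pred - moisture) ** 2))
print(f"moisture ~ {slope:.3f}*sigma0 + {intercept:.3f}, RMSE = {rmse:.3f}")
```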

  17. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation
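The following is an illustrative sketch, not the authors' code: scikit-learn's linear discriminant stands in for the final feature-merging step, producing a score per nodule candidate that can then be evaluated by ROC analysis. Features and labels are random placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Placeholder morphological/gray-level features of detected nodule candidates.
features = rng.normal(size=(470, 6))
malignant = rng.integers(0, 2, size=470)        # placeholder labels

lda = LinearDiscriminantAnalysis().fit(features, malignant)
scores = lda.decision_function(features)        # merged discriminant score
print(f"ROC AUC on toy data: {roc_auc_score(malignant, scores):.2f}")
```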

  18. Automation of Electrical Cable Harnesses Testing

    Directory of Open Access Journals (Sweden)

    Zhuming Bi

    2017-12-01

    Full Text Available Traditional automated systems, such as industrial robots, are applied in well-structured environments, and many automated systems have a limited adaptability to deal with complexity and uncertainty; therefore, the applications of industrial robots in small- and medium-sized enterprises (SMEs are very limited. The majority of manual operations in SMEs are too complicated for automation. The rapidly developed information technologies (IT has brought new opportunities for the automation of manufacturing and assembly processes in the ill-structured environments. Note that an automation solution should be designed to meet the given requirements of the specified application, and it differs from one application to another. In this paper, we look into the feasibility of automated testing for electric cable harnesses, and our focus is on some of the generic strategies for the improvement of the adaptability of automation solutions. Especially, the concept of modularization is adopted in developing hardware and software to maximize system adaptability in testing a wide scope of products. A proposed system has been implemented, and the system performances have been evaluated by executing tests on actual products. The testing experiments have shown that the automated system outperformed manual operations greatly in terms of cost-saving, productivity and reliability. Due to the potential of increasing system adaptability and cost reduction, the presented work has its theoretical and practical significance for an extension for other automation solutions in SMEs.

  19. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  20. An automated 3D reconstruction method of UAV images

    Science.gov (United States)

    Liu, Jun; Wang, He; Liu, Xiaoyang; Li, Feng; Sun, Guangtong; Song, Ping

    2015-10-01

In this paper a novel, fully automated 3D reconstruction approach based on low-altitude unmanned aerial vehicle (UAV) images is presented, which does not require previous camera calibration or any other external prior knowledge. Dense 3D point clouds are generated by integrating orderly feature extraction, image matching, structure from motion (SfM) and multi-view stereo (MVS) algorithms, overcoming many of the cost and time limitations of rigorous photogrammetry techniques. An image topology analysis strategy is introduced to speed up large scene reconstruction by taking advantage of the flight-control data acquired by the UAV. The image topology map can significantly reduce the running time of feature matching by limiting the combinations of images. A high-resolution digital surface model of the study area is produced based on the UAV point clouds by constructing a triangular irregular network. Experimental results show that the proposed approach is robust and feasible for automatic 3D reconstruction of low-altitude UAV images, and has great potential for the acquisition of spatial information for large-scale mapping, especially suitable for rapid response and precise modelling in disaster emergencies.
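A hedged sketch of the image-topology idea: flight-control camera positions prune the quadratic set of image pairs down to physically overlapping ones before feature matching. The 60 m baseline threshold and the helper names are assumptions for illustration.

```python
import numpy as np
from itertools import combinations

def candidate_pairs(cam_xy, max_baseline=60.0):
    """Keep only image pairs whose camera centers are close enough to overlap."""
    pairs = []
    for i, j in combinations(range(len(cam_xy)), 2):
        if np.linalg.norm(cam_xy[i] - cam_xy[j]) < max_baseline:
            pairs.append((i, j))
    return pairs

cams = np.array([[0, 0], [40, 0], [80, 0], [500, 0]], float)
print(candidate_pairs(cams))   # [(0, 1), (1, 2)] -- the distant image is skipped
```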

  1. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

In advanced main control rooms (MCRs), various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities.

  2. Automation process for morphometric analysis of volumetric CT data from pulmonary vasculature in rats.

    Science.gov (United States)

    Shingrani, Rahul; Krenz, Gary; Molthen, Robert

    2010-01-01

    With advances in medical imaging scanners, it has become commonplace to generate large multidimensional datasets. These datasets require tools for a rapid, thorough analysis. To address this need, we have developed an automated algorithm for morphometric analysis incorporating A Visualization Workshop computational and image processing libraries for three-dimensional segmentation, vascular tree generation and structural hierarchical ordering with a two-stage numeric optimization procedure for estimating vessel diameters. We combine this new technique with our mathematical models of pulmonary vascular morphology to quantify structural and functional attributes of lung arterial trees. Our physiological studies require repeated measurements of vascular structure to determine differences in vessel biomechanical properties between animal models of pulmonary disease. Automation provides many advantages including significantly improved speed and minimized operator interaction and biasing. The results are validated by comparison with previously published rat pulmonary arterial micro-CT data analysis techniques, in which vessels were manually mapped and measured using intense operator intervention. Published by Elsevier Ireland Ltd.

  3. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  4. Mapping Cortical Laminar Structure in the 3D BigBrain.

    Science.gov (United States)

    Wagstyl, Konrad; Lepage, Claude; Bludau, Sebastian; Zilles, Karl; Fletcher, Paul C; Amunts, Katrin; Evans, Alan C

    2018-07-01

    Histological sections offer high spatial resolution to examine laminar architecture of the human cerebral cortex; however, they are restricted by being 2D, hence only regions with sufficiently optimal cutting planes can be analyzed. Conversely, noninvasive neuroimaging approaches are whole brain but have relatively low resolution. Consequently, correct 3D cross-cortical patterns of laminar architecture have never been mapped in histological sections. We developed an automated technique to identify and analyze laminar structure within the high-resolution 3D histological BigBrain. We extracted white matter and pial surfaces, from which we derived histologically verified surfaces at the layer I/II boundary and within layer IV. Layer IV depth was strongly predicted by cortical curvature but varied between areas. This fully automated 3D laminar analysis is an important requirement for bridging high-resolution 2D cytoarchitecture and in vivo 3D neuroimaging. It lays the foundation for in-depth, whole-brain analyses of cortical layering.

  5. Ask the experts: automation: part I.

    Science.gov (United States)

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  6. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  7. Failure detection in high-performance clusters and computers using chaotic map computations

    Science.gov (United States)

    Rao, Nageswara S.

    2015-09-01

A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
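A minimal sketch of the underlying principle, assuming a simple logistic map: because chaotic trajectories exponentially amplify arithmetic discrepancies, running the same iteration on two computing elements and comparing the results exposes even a tiny hardware fault within a few dozen steps.

```python
import numpy as np

def logistic_trajectory(x0, n, r=3.9999, fault=0.0):
    """Iterate the logistic map; 'fault' injects a tiny arithmetic error."""
    x, out = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x) + fault   # 'fault' models a faulty ALU/memory bit
        out[i] = x
    return out

healthy = logistic_trajectory(0.3, 60)
faulty = logistic_trajectory(0.3, 60, fault=1e-12)
first = int(np.argmax(np.abs(healthy - faulty) > 1e-3))
print(f"trajectories visibly diverge after ~{first} iterations")
```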

  8. Demands on digital automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

In chapter 12 of the anthology about building control, the demands on digital automation are presented. The following aspects are discussed: the variety of company philosophies, the demands of customers/investors, the demands of building/room use, the operators, and the point of view of the manufacturers of technical plants. (BWI)

  9. Generating Topographic Map Data from Classification Results

    Directory of Open Access Journals (Sweden)

    Joachim Höhle

    2017-03-01

The use of classification results as topographic map data requires cartographic enhancement and checking of the geometric accuracy. Urban areas are of special interest. The conversion of the classification result into topographic map data of high thematic and geometric quality is the subject of this contribution. After reviewing the existing literature on this topic, a methodology is presented. The extraction of point clouds belonging to line segments is solved by the Hough transform. The mathematics for deriving polygons of orthogonal, parallel and general line segments by least squares adjustment is presented. A unique solution for polylines, where the Hough parameters are optimized, is also given. By means of two data sets, land cover maps of six classes were produced and then enhanced by the proposed method. The classification used the decision tree method, applying a variety of attributes including object heights derived from imagery. The cartographic enhancement is carried out at two different levels of quality. The user’s accuracies for the classes “impervious surface” and “building” were above 85% in the “Level 1” map of Example 1. The geometric accuracy of building corners in the “Level 2” maps is assessed by means of reference data derived from ortho-images. The obtained root mean square errors (RMSE) of the generated coordinates (x, y) were RMSEx = 1.2 m and RMSEy = 0.7 m (Example 1) and RMSEx = 0.8 m and RMSEy = 1.0 m (Example 2), using 31 and 62 check points, respectively. All processing for Level 1 (raster data) could be carried out with a high degree of automation. Level 2 maps (vector data) were compiled for the classes “building” and “road and parking lot”. For urban areas with numerous classes and of large size, universal algorithms are necessary to produce vector data fully automatically. The recent progress in sensors and machine learning methods will support the generation of topographic map data of high
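As a rough illustration of the Hough-based line-extraction step (scikit-image standing in for the authors' implementation), a probabilistic Hough transform recovers straight edge segments from a binary class mask; the mask below is synthetic.

```python
import numpy as np
from skimage.transform import probabilistic_hough_line

mask = np.zeros((100, 100), dtype=bool)
mask[20, 10:90] = True                      # synthetic straight building edge
mask[20:80, 89] = True                      # orthogonal edge

segments = probabilistic_hough_line(mask, threshold=10,
                                    line_length=30, line_gap=3)
for (x0, y0), (x1, y1) in segments:
    print(f"segment from ({x0},{y0}) to ({x1},{y1})")
```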

  10. Disassembly automation: automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  11. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    Science.gov (United States)

    Schmitt, Kara

    2012-01-01

Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed in preventing these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  12. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  13. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  14. Hybrid geomorphological maps as the basis for assessing geoconservation potential in Lech, Vorarlberg (Austria)

    Science.gov (United States)

    Seijmonsbergen, Harry; de Jong, Mat; Anders, Niels; de Graaff, Leo; Cammeraat, Erik

    2013-04-01

Geoconservation potential is, in our approach, closely linked to the spatial distribution of geomorphological sites and thus to geomorphological inventories. Detailed geomorphological maps are translated, using a standardized workflow, into polygonal maps showing the potential geoconservation value of landforms. A new development is to semi-automatically extract geomorphological information in a GIS from high-resolution topographic data, such as LiDAR, and to combine this with conventional data types (e.g. airphotos, geological maps) into geomorphological maps. Such hybrid digital geomorphological maps are also easily translated into digital information layers which show the geoconservation potential of an area. We present a protocol for digital geomorphological mapping, illustrated with an example for the municipality of Lech in Vorarlberg (Austria). The protocol consists of 5 steps: 1. data preparation, 2. generating training and validation samples, 3. parameterization, 4. feature extraction, and 5. assessing classification accuracy. The resulting semi-automated digital geomorphological map is then further validated in two ways. Firstly, the map is manually checked with the help of a series of digital datasets (e.g. airphotos) in a digital 3D environment, such as ArcScene. The second validation is a field visit, which preferably occurs in parallel with the digital evaluation, so that updates are quickly achieved. The final digital and coded geomorphological information layer is converted into a potential geoconservation map by weighting and ranking the landforms based on four criteria: scientific relevance, frequency of occurrence, disturbance, and environmental vulnerability. The criteria with predefined scores for the various landform types are stored in a separate GIS attribute table, which is joined to the attribute table of the hybrid geomorphological information layer in an automated procedure. The results of the assessment can be displayed as the potential
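A hedged sketch of the final scoring join described above: landform polygons are merged with a criteria table and a weighted geoconservation score is computed per polygon. The landform names, scores, and weights are invented for illustration.

```python
import pandas as pd

landforms = pd.DataFrame({
    "polygon_id": [1, 2, 3],
    "landform": ["esker", "alluvial_fan", "talus_slope"],
})
criteria = pd.DataFrame({
    "landform": ["esker", "alluvial_fan", "talus_slope"],
    "scientific_relevance": [5, 3, 2],
    "rarity": [4, 2, 1],          # frequency of occurrence, inverted
    "disturbance": [3, 2, 4],
    "vulnerability": [4, 3, 2],
})
weights = {"scientific_relevance": 0.4, "rarity": 0.3,
           "disturbance": 0.15, "vulnerability": 0.15}

# Join criteria to polygons, then combine the ranked criteria into one score.
joined = landforms.merge(criteria, on="landform")
joined["geoconservation_score"] = sum(w * joined[c] for c, w in weights.items())
print(joined[["polygon_id", "landform", "geoconservation_score"]])
```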

  15. Shape indexes for semi-automated detection of windbreaks in thematic tree cover maps from the central United States

    Science.gov (United States)

    Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman

    2017-01-01

Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi- or fully-automated methods for distinguishing...

  16. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes, often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.)

  17. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  18. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes, often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.)

  19. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

The volume includes a set of selected papers, extended and revised, from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered include wireless communications, advances in wireless video, wireless sensor networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic systems and applications, reconfigurable automation systems, and machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find them stimulating.

  20. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  1. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    Science.gov (United States)

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds of participants of the Trust promoted group to overrule the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group compared to no participant of the Trust lowered group collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and if a critical take-over situation can be

  2. Mapping Urban Land Use at Street Block Level Using OpenStreetMap, Remote Sensing Data, and Spatial Metrics

    Directory of Open Access Journals (Sweden)

    Taïs Grippa

    2018-06-01

Up-to-date and reliable land-use information is essential for a variety of applications such as planning or monitoring of the urban environment. This research presents a workflow for mapping urban land use at the street block level, with a focus on residential use, using very-high resolution satellite imagery and derived land-cover maps as input. We develop a processing chain for the automated creation of street block polygons from OpenStreetMap and ancillary data. Spatial metrics and other street block features are computed, followed by feature selection that reduces the initial datasets by more than 80%, providing a parsimonious, discriminative, and redundancy-free set of features. A random forest (RF) classifier is used for the classification of street blocks, which results in accuracies of 84% and 79% for five and six land-use classes, respectively. We exploit the probabilistic output of RF to identify and relabel blocks that have a high degree of uncertainty. Finally, the thematic precision of the residential blocks is refined according to the proportion of the built-up area. The output data and processing chains are made freely available. The proposed framework is able to process large datasets, given that the cities in the case studies, Dakar and Ouagadougou, cover more than 1000 km2 in total, with a spatial resolution of 0.5 m.
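The sketch below illustrates, with synthetic features, the classification-plus-uncertainty step described above: a random forest labels street blocks, and its class probabilities flag low-confidence blocks for relabeling. The 0.5 confidence threshold is an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 12))              # spatial metrics per street block
y = rng.integers(0, 5, size=500)            # five land-use classes (toy labels)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
proba = rf.predict_proba(X)                 # probabilistic output per block
uncertain = proba.max(axis=1) < 0.5         # assumed confidence threshold
print(f"{uncertain.sum()} blocks flagged for post-classification relabeling")
```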

  3. Three main paradigms of simultaneous localization and mapping (SLAM) problem

    Science.gov (United States)

    Imani, Vandad; Haataja, Keijo; Toivanen, Pekka

    2018-04-01

Simultaneous Localization and Mapping (SLAM) is one of the most challenging research areas within computer and machine vision for automated scene commentary and explanation. The SLAM technique has been a developing research area in the robotics context during recent years. By utilizing SLAM, a robot can estimate its position at distinct points in time, which indicates the trajectory of the robot, while simultaneously generating a map of the environment. SLAM's unique traits are the joint estimation of the robot's location and the building of a map in various types of environments, such as indoor, outdoor, aerial, underwater, underground and space environments. Several approaches have been investigated to apply the SLAM technique in distinct environments. The purpose of this paper is to provide a careful review of the case history of SLAM based on laser/ultrasonic sensors and cameras as perception input data. In addition, we mainly focus on three paradigms of the SLAM problem, with their pros and cons. In the future, intelligent methods and new ideas will be applied to visual SLAM to estimate the motion of an intelligent underwater robot and to build a feature map of the marine environment.

  4. USGS "Did You Feel It?" internet-based macroseismic intensity maps

    Science.gov (United States)

    Wald, D.J.; Quitoriano, V.; Worden, B.; Hopper, M.; Dewey, J.W.

    2011-01-01

The U.S. Geological Survey (USGS) "Did You Feel It?" (DYFI) system is an automated approach for rapidly collecting macroseismic intensity data from Internet users' shaking and damage reports and generating intensity maps immediately following earthquakes; it has been operating for over a decade (1999-2011). DYFI-based intensity maps made rapidly available through the DYFI system fundamentally depart from more traditional maps made available in the past. The maps are made more quickly, provide more complete coverage and higher resolution, provide for citizen input and interaction, and allow data collection at rates and quantities never before considered. These aspects of Internet data collection, in turn, allow for data analyses, graphics, and ways to communicate with the public, opportunities not possible with traditional data-collection approaches. Yet web-based contributions also pose considerable challenges, as discussed herein. After a decade of operational experience with the DYFI system and users, we document refinements to the processing and algorithmic procedures since DYFI was first conceived. We also describe a number of automatic post-processing tools, operations, applications, and research directions, all of which utilize the extensive DYFI intensity datasets now gathered in near-real time. DYFI can be found online at the website http://earthquake.usgs.gov/dyfi/. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia.

  5. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  6. GUI test automation for Qt application

    OpenAIRE

    Wang, Lei

    2015-01-01

    GUI test automation is a popular and interesting subject in the testing industry. Many companies plan to start test automation projects in order to implement efficient, less expensive software testing. However, there are challenges for the testing team who lack experience performing GUI tests automation. Many GUI test automation projects have ended in failure due to mistakes made during the early stages of the project. The major work of this thesis is to find a solution to the challenges of e...

  7. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development is also considered.

  8. 76 FR 69755 - National Customs Automation Program Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation... announces U.S. Customs and Border Protection's (CBP's) plan to conduct a National Customs Automation Program... conveyance transporting the cargo to the United States. This data will fulfill merchandise entry requirements...

  9. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    Science.gov (United States)

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  10. Multimodal Examination of Atrial Fibrillation Substrate: Correlation of Left Atrial Bipolar Voltage Using Multi-Electrode Fast Automated Mapping, Point-by-Point Mapping, and Magnetic Resonance Image Intensity Ratio.

    Science.gov (United States)

    Zghaib, Tarek; Keramati, Ali; Chrispin, Jonathan; Huang, Dong; Balouch, Muhammad A; Ciuffo, Luisa; Berger, Ronald D; Marine, Joseph E; Ashikaga, Hiroshi; Calkins, Hugh; Nazarian, Saman; Spragg, David D

    2018-01-01

Bipolar voltage mapping, as part of atrial fibrillation (AF) ablation, is traditionally performed in a point-by-point (PBP) approach using single-tip ablation catheters. Alternative techniques for fibrosis delineation include fast anatomical mapping (FAM) with multi-electrode circular catheters, and late gadolinium-enhanced magnetic resonance imaging (LGE-MRI). The correlation between PBP, FAM, and LGE-MRI fibrosis assessment is unknown. In this study, we examined AF substrate using different modalities (PBP, FAM, and LGE-MRI mapping) in patients presenting for AF ablation. LGE-MRI was performed pre-ablation in 26 patients (73% males, age 63±8 years). The local image intensity ratio (IIR) was used to normalize myocardial intensities. PBP and FAM voltage maps were acquired, in sinus rhythm, prior to ablation and co-registered to LGE-MRI. The mean bipolar voltage for all 19,087 FAM voltage points was 0.88±1.27 mV and the average IIR was 1.08±0.18. In an adjusted mixed-effects model, each unit increase in local IIR was associated with a 57% decrease in bipolar voltage, and an IIR above 0.74 corresponded to low bipolar voltage. Log-FAM bipolar voltage was significantly associated with log-PBP bipolar voltage (β = 0.36). At low voltages, the FAM-mapping distribution was shifted to the left compared to PBP-mapping; at intermediate voltages, FAM and PBP voltages overlapped; and at high voltages, FAM exceeded PBP voltages. LGE-MRI, FAM and PBP mapping show good correlation in delineating the electro-anatomical AF substrate. Each approach has fundamental technical characteristics, awareness of which allows proper assessment of atrial fibrosis.
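A minimal sketch of the IIR normalization referred to above, assuming the usual definition of IIR as local myocardial LGE intensity divided by the mean blood-pool intensity; all voxel values below are invented.

```python
import numpy as np

def image_intensity_ratio(myocardium_voxels, blood_pool_voxels):
    """Normalize wall intensities by the mean left-atrial blood-pool signal."""
    return np.asarray(myocardium_voxels) / np.mean(blood_pool_voxels)

wall = [310.0, 420.0, 560.0]       # example LGE signal along the atrial wall
blood = [350.0, 360.0, 340.0]      # example blood-pool samples
print(image_intensity_ratio(wall, blood))  # per the abstract, IIR > 0.74 marks low voltage
```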

  11. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    Science.gov (United States)

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  12. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  13. Evaluation of an Automated Keywording System.

    Science.gov (United States)

    Malone, Linda C.; And Others

    1990-01-01

    Discussion of automated indexing techniques focuses on ways to statistically document improvements in the development of an automated keywording system over time. The system developed by the Joint Chiefs of Staff to automate the storage, categorization, and retrieval of information from military exercises is explained, and performance measures are…

  14. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 2, includes the best papers from the 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. Based on the classification of the manuscripts considered, this volume is divided into six sessions: Mathematical Modeling; Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; and Automation and Mechatronics.

  15. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor’s thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to help investigate and create a test automation solution for stability testing of the LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. The work included creating the automation for the test cases and putting them into daily test automation jobs. The key factor...

  16. Prototype semantic infrastructure for automated small molecule classification and annotation in lipidomics.

    Science.gov (United States)

    Chepelev, Leonid L; Riazanov, Alexandre; Kouznetsov, Alexandre; Low, Hong Sang; Dumontier, Michel; Baker, Christopher J O

    2011-07-26

    The development of high-throughput experimentation has led to astronomical growth in biologically relevant lipids and lipid derivatives identified, screened, and deposited in numerous online databases. Unfortunately, efforts to annotate, classify, and analyze these chemical entities have largely remained in the hands of human curators using manual or semi-automated protocols, leaving many novel entities unclassified. Since chemical function is often closely linked to structure, accurate structure-based classification and annotation of chemical entities is imperative to understanding their functionality. As part of an exploratory study, we have investigated the utility of semantic web technologies in automated chemical classification and annotation of lipids. Our prototype framework consists of two components: an ontology and a set of federated web services that operate upon it. The formal lipid ontology we use here extends a part of the LiPrO ontology and draws on the lipid hierarchy in the LIPID MAPS database, as well as literature-derived knowledge. The federated semantic web services that operate upon this ontology are deployed within the Semantic Annotation, Discovery, and Integration (SADI) framework. Structure-based lipid classification is enacted by two core services. Firstly, a structural annotation service detects and enumerates relevant functional groups for a specified chemical structure. A second service reasons over lipid ontology class descriptions using the attributes obtained from the annotation service and identifies the appropriate lipid classification. We extend the utility of these core services by combining them with additional SADI services that retrieve associations between lipids and proteins and identify publications related to specified lipid types. We analyze the performance of SADI-enabled eicosanoid classification relative to the LIPID MAPS classification and reflect on the contribution of our integrative methodology in the context of…

  17. Prototype semantic infrastructure for automated small molecule classification and annotation in lipidomics

    Directory of Open Access Journals (Sweden)

    Dumontier Michel

    2011-07-01

    Background: The development of high-throughput experimentation has led to astronomical growth in biologically relevant lipids and lipid derivatives identified, screened, and deposited in numerous online databases. Unfortunately, efforts to annotate, classify, and analyze these chemical entities have largely remained in the hands of human curators using manual or semi-automated protocols, leaving many novel entities unclassified. Since chemical function is often closely linked to structure, accurate structure-based classification and annotation of chemical entities is imperative to understanding their functionality. Results: As part of an exploratory study, we have investigated the utility of semantic web technologies in automated chemical classification and annotation of lipids. Our prototype framework consists of two components: an ontology and a set of federated web services that operate upon it. The formal lipid ontology we use here extends a part of the LiPrO ontology and draws on the lipid hierarchy in the LIPID MAPS database, as well as literature-derived knowledge. The federated semantic web services that operate upon this ontology are deployed within the Semantic Annotation, Discovery, and Integration (SADI) framework. Structure-based lipid classification is enacted by two core services. Firstly, a structural annotation service detects and enumerates relevant functional groups for a specified chemical structure. A second service reasons over lipid ontology class descriptions using the attributes obtained from the annotation service and identifies the appropriate lipid classification. We extend the utility of these core services by combining them with additional SADI services that retrieve associations between lipids and proteins and identify publications related to specified lipid types. We analyze the performance of SADI-enabled eicosanoid classification relative to the LIPID MAPS classification and reflect on the contribution of…

  18. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen…

  19. ShakeCast: Automating and Improving the Use of ShakeMap for Post-Earthquake Decision- Making and Response

    Science.gov (United States)

    Lin, K.; Wald, D. J.

    2007-12-01

    ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for emergency managers and responders. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, provides overall information regarding the affected areas. When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. To this end, ShakeCast estimates the potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps showing the structures or facilities most likely impacted. All ShakeMap and ShakeCast files and products are non-proprietary to simplify interfacing with existing users' response tools and to encourage user-made enhancements to the software. ShakeCast uses standard RSS and HTTP requests to communicate with the USGS Web servers that host ShakeMaps, which are widely distributed and heavily mirrored. The RSS approach allows ShakeCast users to initiate and receive selected ShakeMap products and information on software updates. To assess facility damage estimates, ShakeCast users can combine measured or estimated ground motion parameters with damage relationships that can be pre-computed, use one of these ground motion parameters as input, and produce a multi-state discrete output of damage likelihood. Presently three common approaches are being used to provide users with an…
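
    The multi-state damage output described above can be pictured as a simple threshold comparison; the sketch below is a minimal stand-in, not ShakeCast's implementation, and every threshold, facility name, and ground-motion value is hypothetical.

      # Assign a discrete damage-likelihood state by comparing a ShakeMap
      # ground-motion value at a facility against pre-computed thresholds.
      def damage_state(pga_g, thresholds=((0.05, "green"),
                                          (0.15, "yellow"),
                                          (0.30, "orange"))):
          state = "red"                      # exceeds every threshold
          for limit, label in reversed(thresholds):
              if pga_g <= limit:
                  state = label
          return state

      facilities = {"Substation A": 0.04, "Pump Station B": 0.22}  # PGA in g
      for name, pga in facilities.items():
          print(name, "->", damage_state(pga))   # green / orange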

  20. Automatic 3D City Modeling Using a Digital Map and Panoramic Images from a Mobile Mapping System

    Directory of Open Access Journals (Sweden)

    Hyungki Kim

    2014-01-01

    Three-dimensional city models are becoming a valuable resource because of their close geospatial, geometrical, and visual relationship with the physical world. However, ground-oriented applications in virtual reality, 3D navigation, and civil engineering require a novel modeling approach, because the existing large-scale 3D city modeling methods do not provide rich visual information at ground level. This paper proposes a new framework for generating 3D city models that satisfy both the visual and the physical requirements for ground-oriented virtual reality applications. To ensure its usability, the framework must be cost-effective and allow for automated creation. To achieve these goals, we leverage a mobile mapping system that automatically gathers high-resolution images and supplements sensor information such as the position and direction of the captured images. To resolve problems stemming from sensor noise and occlusions, we develop a fusion technique to incorporate digital map data. This paper describes the major processes of the overall framework and the proposed techniques for each step and presents experimental results from a comparison with an existing 3D city model.

  1. Sink detection on tilted terrain for automated identification of glacial cirques

    Science.gov (United States)

    Prasicek, Günther; Robl, Jörg; Lang, Andreas

    2016-04-01

    Glacial cirques are morphologically distinct but complex landforms and represent a vital part of high mountain topography. Their distribution, elevation and relief are expected to hold information on (1) the extent of glacial occupation, (2) the mechanism of glacial cirque erosion, and (3) how glacial processes, in concert with periglacial processes, can limit peak altitude and mountain range height. While easily detectable to the expert's eye both in nature and on various representations of topography, their complicated nature makes them a nemesis for computer algorithms. Consequently, manual mapping of glacial cirques is commonplace in many mountain landscapes worldwide, but consistent datasets of cirque distribution and objectively mapped cirques and their morphometric attributes are lacking. Among the biggest problems for algorithm development are the complexity in shape and the great variability of cirque size. For example, glacial cirques can be rather circular or longitudinal in extent, exist as individual and composite landforms, show prominent topographic depressions or be entirely filled with water or sediment. For these reasons, attributes like circularity, size, drainage area and topology of landform elements (e.g. a flat floor surrounded by steep walls) have only a limited potential for automated cirque detection. Here we present a novel geomorphometric method for automated identification of glacial cirques on digital elevation models that exploits their genetic bowl-like shape. First, we differentiate between glacial and fluvial terrain employing an algorithm based on a moving-window approach and multi-scale curvature, which is also capable of fitting the analysis window to valley width. We then fit a plane to the valley stretch clipped by the analysis window and rotate the terrain around the center cell until the plane is level. Doing so, we produce sinks of considerable size if the clipped terrain represents a cirque, while no or only very small sinks…
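
    A minimal sketch of the level-and-detect step is given below; it approximates the described rotation by subtracting a least-squares plane and measures sinks by morphological filling. The window array and the interpretation of the result are assumptions, not the authors' code.

      import numpy as np
      from skimage.morphology import reconstruction

      def sink_volume(dem_window):
          """Level a clipped DEM window, then sum the volume of its sinks."""
          ny, nx = dem_window.shape
          x, y = np.meshgrid(np.arange(nx), np.arange(ny))
          A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
          coeffs, *_ = np.linalg.lstsq(A, dem_window.ravel(), rcond=None)
          leveled = dem_window - (A @ coeffs).reshape(ny, nx)  # detrended terrain

          # Fill depressions by morphological reconstruction; sinks are the
          # difference between the filled surface and the leveled terrain.
          seed = leveled.copy()
          seed[1:-1, 1:-1] = leveled.max()
          filled = reconstruction(seed, leveled, method="erosion")
          return (filled - leveled).sum()   # large values suggest a cirque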

  2. AEDs at your fingertips: automated external defibrillators on college campuses and a novel approach for increasing accessibility.

    Science.gov (United States)

    Berger, Ryan J; O'Shea, Jesse G

    2014-01-01

    The use of automated external defibrillators (AEDs) increases survival in cardiac arrest events. Building on the success of previous efforts and on free, readily available mobile mapping software, this discussion emphasizes the importance of AEDs in preventing sudden cardiac arrest-related deaths on college campuses and beyond, while suggesting a novel approach to improving access and awareness. A user-friendly mobile application (a low-cost iOS map) was developed at Florida State University to decrease AED retrieval distance and time. The development of mobile AED maps is feasible for a variety of universities and other entities, with the potential to save lives. Just having AEDs installed is not enough--they need to be easily locatable. Society increasingly relies on phones to provide information, and there are opportunities to use mobile technology to locate and share information about relevant emergency devices; these should be incorporated into the chain of survival.

  3. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades automation systems have become more common and are used today in everything from big industrial solutions to the homes of private customers. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  4. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master’s thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers. Sarlin focuses on small and minor automation projects on the domestic market. The thesis presents issues related to project execution, starting from project theory and proceeding to kick-off and termination. Site work is one importan...

  5. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Science.gov (United States)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  6. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize towards more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on it. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second-best alternative is to use various computerized systems capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company should run, or (2) selective system linking on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting the system. 2 figs

  7. You're a What? Automation Technician

    Science.gov (United States)

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  8. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  9. Fully Automated Deep Learning System for Bone Age Assessment.

    Science.gov (United States)

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.
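
    The fine-tuning recipe the pipeline relies on can be outlined in a few lines; the sketch below uses a ResNet-18 stand-in rather than the authors' network, and the class count is hypothetical.

      import torch
      import torchvision

      NUM_CLASSES = 16   # hypothetical number of discrete bone-age bins
      # Start from an ImageNet-pretrained CNN and replace its classifier head.
      model = torchvision.models.resnet18(pretrained=True)
      model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)

      optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
      loss_fn = torch.nn.CrossEntropyLoss()

      def train_step(images, labels):
          """One fine-tuning step on a batch of preprocessed radiographs."""
          optimizer.zero_grad()
          loss = loss_fn(model(images), labels)
          loss.backward()
          optimizer.step()
          return loss.item()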

  10. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  11. Ningaloo Reef: Shallow Marine Habitats Mapped Using a Hyperspectral Sensor

    Science.gov (United States)

    Kobryn, Halina T.; Wouters, Kristin; Beckley, Lynnath E.; Heege, Thomas

    2013-01-01

    Research, monitoring and management of large marine protected areas require detailed and up-to-date habitat maps. Ningaloo Marine Park (including the Muiron Islands) in north-western Australia (stretching across three degrees of latitude) was mapped to 20 m depth using HyMap airborne hyperspectral imagery (125 bands) at 3.5 m resolution across the 762 km2 of reef environment between the shoreline and reef slope. The imagery was corrected for atmospheric, air-water interface and water column influences to retrieve bottom reflectance and bathymetry using the physics-based Modular Inversion and Processing System. Using field-validated, image-derived spectra from a representative range of cover types, the classification combined a semi-automated, pixel-based approach with fuzzy logic and derivative techniques. Five thematic classification levels for benthic cover (with probability maps) were generated with varying degrees of detail, ranging from a basic one with three classes (biotic, abiotic and mixed) to the most detailed with 46 classes. The latter consisted of all abiotic and biotic seabed components and hard coral growth forms in dominant or mixed states. The overall accuracy of mapping for the most detailed maps was 70% for the highest classification level. Macro-algal communities formed most of the benthic cover, while hard and soft corals represented only about 7% of the mapped area (58.6 km2). Dense tabulate coral was the largest coral mosaic type (37% of all corals) and the rest of the corals were a mix of tabulate, digitate, massive and soft corals. Our results show that for this shallow, fringing reef environment situated in the arid tropics, hyperspectral remote sensing techniques can offer an efficient and cost-effective approach to mapping and monitoring reef habitats over large, remote and inaccessible areas. PMID:23922921

  12. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of stretched continuity defects in samples with a plane surface in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  13. Applying machine learning to pattern analysis for automated in-design layout optimization

    Science.gov (United States)

    Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh

    2018-04-01

    Building on previous work for cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk-scoring methodology is used to rank patterns based on manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk. The higher-risk patterns are then replaced in the design with their lower-risk equivalents. The pattern selection and replacement are fully automated and suitable for use on full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with quantifiable positive impact on the risk-score distribution after replacement.
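
    The rank-and-replace flow can be reduced to a few lines; the scores, pattern names, and equivalence map below are hypothetical, chosen only to show the shape of the computation.

      # Rank patterns by manufacturing-risk score, then plan replacements for
      # those above threshold that have a functionally equivalent alternative.
      risk_scores = {"pat_A": 0.92, "pat_B": 0.15, "pat_C": 0.78}
      equivalents = {"pat_A": "pat_A_safe", "pat_C": "pat_C_safe"}
      RISK_THRESHOLD = 0.5

      def replacement_plan(scores, equivalents, threshold):
          ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
          return [(p, equivalents[p]) for p, s in ranked
                  if s > threshold and p in equivalents]

      print(replacement_plan(risk_scores, equivalents, RISK_THRESHOLD))
      # [('pat_A', 'pat_A_safe'), ('pat_C', 'pat_C_safe')]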

  14. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  15. A real-time automated quality control of rain gauge data based on multiple sensors

    Science.gov (United States)

    qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes), and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness is dependent on gauge densities and precipitation regimes. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of hourly radar QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing-level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
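
    The core radar-gauge consistency test can be sketched as below; the tolerances and sample values are assumptions for illustration, not the NMQ/Q2 scheme's actual parameters.

      import numpy as np

      def qc_flags(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
          """Flag gauges whose hourly totals disagree with co-located radar QPE."""
          gauge_mm, radar_mm = np.asarray(gauge_mm), np.asarray(radar_mm)
          diff = np.abs(gauge_mm - radar_mm)
          scale = np.maximum(radar_mm, 1.0)   # avoid flagging trace amounts
          return (diff > abs_tol) & (diff / scale > rel_tol)   # True = suspect

      gauges = [0.0, 12.4, 3.1]   # mm/h, hypothetical observations
      radar  = [0.2, 11.8, 9.6]   # mm/h, co-located radar estimates
      print(qc_flags(gauges, radar))   # -> [False False  True]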

  16. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Myths in the automation of software testing are an issue of discussion that echoes throughout the software validation services industry. Probably the first thought that appears to a knowledgeable reader would be: Why this old topic again? What's new to discuss on the matter? But for the first time everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and a hybrid framework to facilitate the testing of applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so have the myths associated with it. The change in perspective and knowledge of people on automation has altered the terrain. This article reflects the author's points of view and experience concerning the transformation of the original myths into new versions, and how they are derived; it also provides his thoughts on the new generation of myths.

  18. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task: proving that propellant production can be automated reliably.

  19. A computational linguistics motivated mapping of ICPC-2 PLUS to SNOMED CT.

    Science.gov (United States)

    Wang, Yefeng; Patrick, Jon; Miller, Graeme; O'Hallaran, Julie

    2008-10-27

    …Automating as much of this process as possible turns the searching and mapping task into a validation task, which can effectively reduce the cost and increase the efficiency and accuracy of this task over manual methods.

  20. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  1. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    Automation is deployed in a great range of different domains such as the chemical industry, the production of consumer goods, the production of energy (both in terms of power plants and in the petrochemical industry), transportation and several others. Through several decades the complexity of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload. Since context-aware applications have been developed in other research areas, it seems natural to analyze the findings of this research and examine how this can be applied to the domain of automation systems. By evaluating existing architectures for the development of context-aware applications we find…

  2. Automated transit planning, operation, and applications

    CERN Document Server

    Liu, Rongfang

    2016-01-01

    This book analyzes the successful implementations of automated transit in various international locations, such as Paris, Toronto, London, and Kuala Lumpur, and investigates the apparent lack of automated transit applications in the urban environment in the United States. The book begins with a brief definition of automated transit and its historical development. After a thorough description of the technical specifications, the author highlights a few applications from each sub-group of the automated transit spectrum. International case studies display various technologies and their applications, and identify vital factors that affect each system and performance evaluations of existing applications. The book then discusses the planning and operation of automated transit applications at both macro and micro levels. Finally, the book covers a number of less successful concepts, as well as the lessons learned, allowing readers to gain a comprehensive understanding of the topic.

  3. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.
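
    The quoted demand-reduction densities follow directly from the site totals; a back-of-envelope check:

      floor_area_ft2 = 2_000_000    # twelve sites, about two million square feet
      peak_dr_w = 2_000_000         # ~2 MW if all sites shed load simultaneously
      avg_dr_w = 1_000_000          # observed average, ~1 MW

      print(peak_dr_w / floor_area_ft2)   # 1.0 W/ft2, the upper end quoted
      print(avg_dr_w / floor_area_ft2)    # 0.5 W/ft2, the average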

  4. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  5. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  6. LIBRARY AUTOMATION IN NIGERAN UNIVERSITIES

    African Journals Online (AJOL)

    …facilitate services and access to information in libraries is widely acceptable. … Moreover, Ugah (2001) reports that the automation process at the Abubakar … blueprint in 1987 and a turn-key system of automation was suggested for the library.

  7. Automated 741 document preparation: Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS)

    International Nuclear Information System (INIS)

    Austin, H.C.; Gray, L.M.

    1982-01-01

    OASIS has been providing for Oak Ridge National Laboratory's total safeguards needs since being placed on line in April 1980. The system supports near-real-time nuclear materials safeguards and accountability control. The original design of OASIS called for an automated facsimile of a 741 document to be prepared as a functional by-product of updating the inventory. An attempt was made to utilize, intact, DOE-Albuquerque's automated 741 system to generate the facsimile; however, the five-page document produced proved too cumbersome. Albuquerque's programs were modified to print an original 741 document utilizing standard DOE/NRC 741 forms. It is felt that the best features of both the automated and manually generated 741 documents have been incorporated. Automation of the source data for 741 shipping documents produces greater efficiency while reducing possible errors. Through utilization of the standard DOE/NRC form, continuity within the NMMSS system is maintained, thus minimizing the confusion and redundancy associated with facsimiles. OASIS now fulfills the original concept of near-real-time accountability by furnishing a viable 741 document as a function of updating the inventory.

  8. Mapping the Early Language Environment Using All-Day Recordings and Automated Analysis.

    Science.gov (United States)

    Gilkerson, Jill; Richards, Jeffrey A; Warren, Steven F; Montgomery, Judith K; Greenwood, Charles R; Kimbrough Oller, D; Hansen, John H L; Paul, Terrance D

    2017-05-17

    This research provided a first-generation standardization of automated language environment estimates, validated these estimates against standard language assessments, and extended previous research reporting language behavior differences across socioeconomic groups. Typically developing children between 2 and 48 months of age completed monthly, daylong recordings in their natural language environments over a span of approximately 6-38 months. The resulting data set contained 3,213 12-hr recordings automatically analyzed by using the Language Environment Analysis (LENA) System to generate estimates of (a) the number of adult words in the child's environment, (b) the amount of caregiver-child interaction, and (c) the frequency of child vocal output. Child vocalization frequency and turn-taking increased with age, whereas adult word counts were age-independent after early infancy. Child vocalization and conversational turn estimates predicted 7%-16% of the variance observed in child language assessment scores. Lower socioeconomic status (SES) children produced fewer vocalizations, engaged in fewer adult-child interactions, and were exposed to fewer daily adult words compared with their higher-SES peers, but within-group variability was high. The results offer new insight into the landscape of the early language environment, with clinical implications for the identification of children at risk for impoverished language environments.

  9. Individual differences in the calibration of trust in automation.

    Science.gov (United States)

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  10. Automatic Stem Mapping by Merging Several Terrestrial Laser Scans at the Feature and Decision Levels

    Directory of Open Access Journals (Sweden)

    Juha Hyyppä

    2013-01-01

    Detailed up-to-date ground reference data have become increasingly important in quantitative forest inventories. Field reference data are conventionally collected at the sample plot level by means of manual measurements, which are both labor-intensive and time-consuming. In addition, the number of attributes collected from the tree stem is limited. More recently, terrestrial laser scanning (TLS), using both single-scan and multi-scan techniques, has proven to be a promising solution for efficient stem mapping at the plot level. In the single-scan method, the laser scanner is placed at the center of the plot, creating only one scan, and all trees are mapped from the single-scan point cloud. Consequently, the occlusion of stems increases as the range of the scanner increases, depending on the forest’s attributes. In the conventional multi-scan method, several scans are made simultaneously inside and outside of the plot to collect point clouds representing all trees within the plot, and these scans are accurately co-registered by using artificial reference targets manually placed throughout the plot. The additional difficulty of applying the multi-scan method is due to the point-cloud registration of several scans not being fully automated yet. This paper proposes a multi-single-scan (MSS) method to map the sample plot. The method does not require artificial reference targets placed on the plot or point-level registration. The MSS method is based on the fully automated processing of each scan independently and on the merging of the stem positions automatically detected from multiple scans to accurately map the sample plot. The proposed MSS method was tested on five dense forest plots. The results show that the MSS method significantly improves the stem-detection accuracy compared with the single-scan approach and achieves a mapping accuracy similar to that achieved with the multi-scan method, without the need for the point-level registration.
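
    The decision-level merge of per-scan detections can be pictured as distance-based clustering; the sketch below is a generic stand-in for the paper's method, with hypothetical coordinates and merge radius.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Stem positions detected independently in each scan are clustered;
      # each cluster becomes one mapped stem at the averaged position.
      detections = np.array([
          [10.02, 5.01], [10.05, 4.98],   # the same stem seen from two scans
          [22.40, 7.80],                  # a stem seen from one scan only
      ])
      MERGE_RADIUS = 0.25   # meters; detections closer than this are one stem

      labels = fcluster(linkage(detections, method="single"),
                        t=MERGE_RADIUS, criterion="distance")
      stems = [detections[labels == k].mean(axis=0) for k in np.unique(labels)]
      print(stems)   # one averaged position per merged stem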

  11. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    The article focuses on the possibility of automating taxiing, which is the part of a flight that, under adverse weather conditions, greatly reduces the operational usability of an airport, and is the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on information from visual perception. The article primarily deals with possible ways of obtaining navigational information and its automatic transfer to the controls. Analyzed and assessed were currently available technologies such as computer vision, Light Detection and Ranging and Global Navigation Satellite System, which are useful for navigation, and their general implementation into an airplane was designed. Obstacles to the implementation were identified, too. The result is a proposed combination of systems along with their installation into the airplane’s systems so that it is possible to use automated taxiing.

  12. Multisource multibeam backscatter data: developing a strategy for the production of benthic habitat maps using semi-automated seafloor classification methods

    Science.gov (United States)

    Lacharité, Myriam; Brown, Craig J.; Gazzola, Vicki

    2018-06-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches towards nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping, and can generate customized thematic seafloor maps to meet multiple ocean management needs. However, when a variety of MBES systems are used, the creation of objective habitat maps can be hindered by the lack of backscatter calibration, due, for example, to system-specific settings, yielding relative rather than absolute values. Here, we describe an approach using object-based image analysis to combine 4 non-overlapping and uncalibrated (backscatter) MBES coverages to form a seamless habitat map on St. Anns Bank (Atlantic Canada), a marine protected area hosting a diversity of benthic habitats. The benthoscape map was produced by analysing each coverage independently with supervised classification (k-nearest neighbor) of image-objects based on a common suite of 7 benthoscapes (determined with 4214 ground-truthing photographs at 61 stations, and characterized with backscatter, bathymetry, and bathymetric position index). Manual re-classification based on uncertainty in membership values to individual classes—especially at the boundaries between coverages—was used to build the final benthoscape map. Given the costs and scarcity of MBES surveys in offshore marine ecosystems—particularly in large ecosystems in need of adequate conservation strategies, such as in Canadian waters—developing approaches to synthesize multiple datasets to meet management needs is warranted.
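
    The supervised object classification step reduces to a standard k-nearest-neighbor fit over per-object features; the feature values, labels, and k below are hypothetical.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      # Image-objects described by [backscatter dB, depth m, bathymetric
      # position index], labeled from ground-truth photo stations.
      X_train = np.array([[-22.1, 45.0, 0.3],
                          [-30.5, 80.2, -1.1],
                          [-18.9, 30.4, 0.9]])
      y_train = ["gravel", "mud", "bedrock"]

      knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
      print(knn.predict([[-20.0, 40.0, 0.5]]))   # -> ['gravel']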

  13. FluidCam 1&2 - UAV-based Fluid Lensing Instruments for High-Resolution 3D Subaqueous Imaging and Automated Remote Biosphere Assessment of Reef Ecosystems

    Science.gov (United States)

    Chirayath, V.; Instrella, R.

    2016-02-01

    We present NASA ESTO FluidCam 1 & 2, Visible and NIR Fluid-Lensing-enabled imaging payloads for Unmanned Aerial Vehicles (UAVs). Developed as part of a focused 2014 earth science technology grant, FluidCam 1&2 are Fluid-Lensing-based computational optical imagers designed for automated 3D mapping and remote sensing of underwater coastal targets from airborne platforms. Fluid Lensing has been used to map underwater reefs in 3D at sub-cm scale in American Samoa and Hamelin Pool, Australia, from UAV platforms, which has proven a valuable tool in modern marine research for marine biosphere assessment and conservation. We share FluidCam 1&2 instrument validation and testing results as well as preliminary processed data from field campaigns. Petabyte-scale aerial survey efforts using Fluid Lensing to image at-risk reefs demonstrate broad applicability to large-scale automated species identification, morphology studies and reef ecosystem characterization for shallow marine environments and terrestrial biospheres, of crucial importance to improving bathymetry data for physical oceanographic models and understanding climate change's impact on coastal zones, global oxygen production, and carbon sequestration.

  14. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view over safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) in the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility to evaluate failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework to apply expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)

  15. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints…

  16. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from the radio-controlled model to a microcomputer-operated machine, while automating various functions. In addition, a system has been developed for comprehensively examining operating conditions and natural conditions in the working face for further automation. The self-advancing frame has been modified from the sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller and more reliable. The system will be controlled above the ground in the future, provided that the machines in the working face are remote controlled at the gate while transmitting relevant data above the ground from this system. Thus, an automated working face will be realized. (2 figs, 1 photo)

  17. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    The control and automation team of the Accelerator-Exotic Beam R and D Department has had, in the framework of the SPIRAL collaboration, the following tasks: 1. automation of the resonator high-frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low-energy line (TBE), the CIME cyclotron, and the low-energy line (BE); 3. automation of load safety for power supply; 4. for each of these tasks, a circuitry file based on the SCHEMA software has been worked out. The programs required in the automation of load safety for power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented for PC.

  18. Automated controlled-potential coulometric determination of uranium

    International Nuclear Information System (INIS)

    Knight, C.H.; Clegg, D.E.; Wright, K.D.; Cassidy, R.M.

    1982-06-01

    A controlled-potential coulometer has been automated in our laboratory for routine determination of uranium in solution. The CRNL-designed automated system controls degassing, prereduction, and reduction of the sample. The final result is displayed on a digital coulometer readout. Manual and automated modes of operation are compared to show the precision and accuracy of the automated system. Results are also shown for the coulometric titration of typical uranium-aluminum alloy samples

  19. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited. Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  20. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation, such as Manufacturing, to industries with low levels of automation, such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates, however, that on-going technological advances are now driving labour automation i...

  1. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are many factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I, Agent-based Complex Automated Negotiations, and Part II, Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper, after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, automatically negotiate in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  2. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  3. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Full Text Available Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and of the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as the format in which we now have to access our geospatial data. As this would be too much to ask of map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the ‘communis opinio’, rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation to this end between map curators and the International Cartographic Association (ICA) map and spatial data use commission is suggested.

  4. AN INVESTIGATION OF AUTOMATIC CHANGE DETECTION FOR TOPOGRAPHIC MAP UPDATING

    Directory of Open Access Journals (Sweden)

    P. Duncan

    2012-08-01

    Full Text Available Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis, and is focused on urban landscapes. The major data inputs to this study are high-resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large-scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie in a hybrid approach combining pixel-based and object-oriented techniques.

  5. Human-centered automation: Development of a philosophy

    Science.gov (United States)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  6. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  7. An Algorithm Creating Thumbnail for Web Map Services Based on Information Entropy and Trans-scale Similarity

    Directory of Open Access Journals (Sweden)

    CHENG Xiaoqiang

    2017-11-01

    Full Text Available Thumbnails can greatly increase the efficiency of browsing pictures, videos and other image resources, and markedly improve the user experience. A map service is a kind of graphic resource coupling spatial information and representation scale; its crafting, retrieval and management do not function well without the support of thumbnails. Well-designed thumbnails give users a vivid first impression and help them explore efficiently. By contrast, coarse thumbnails cause negative emotions and discourage users from exploring the map service. Inspired by video summarization, the notions of key position and key scale of a web map service are proposed, and corresponding quantitative measures and an automatic algorithm are drawn up and implemented. With the help of this algorithm, the poor visual quality, lack of map information and low automation of current thumbnails are addressed. Information entropy is used to determine areas richer in content, and trans-scale similarity is calculated to judge at which scale the appearance of the map service changes drastically; finally, a series of static pictures is extracted to represent the content of the map service. Experimental results show that this method produces medium-sized, content-rich and well-representative thumbnails which effectively reflect the content and appearance of a map service.
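
    The record names its two quantitative measures (information entropy for content richness, trans-scale similarity for scale selection) without giving formulas. A minimal sketch of the entropy half, assuming greyscale map tiles as NumPy arrays, might look like the following; the function names `tile_entropy` and `pick_key_tile` are illustrative, not the authors' code.

```python
import numpy as np

def tile_entropy(tile, bins=256):
    """Shannon entropy of a greyscale tile; higher values suggest richer map content."""
    hist, _ = np.histogram(tile, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def pick_key_tile(image, tile_size=256):
    """Scan a map image in tiles and return the (row, col) of the most content-rich tile."""
    best, best_pos = -1.0, (0, 0)
    rows, cols = image.shape[:2]
    for r in range(0, rows - tile_size + 1, tile_size):
        for c in range(0, cols - tile_size + 1, tile_size):
            h = tile_entropy(image[r:r + tile_size, c:c + tile_size])
            if h > best:
                best, best_pos = h, (r, c)
    return best_pos, best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_map = rng.integers(0, 256, size=(1024, 1024)).astype(np.uint8)
    fake_map[256:512, 256:512] = 128   # a flat, low-information region
    print(pick_key_tile(fake_map))
```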

  8. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw, 2-4 March 2016. It contains research results from top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by numerical analysis, simulation and a description of the results of implementing the solution to a real-world problem. The presented theoretical results, practical solutions and guidelines will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  9. Radioactive contamination mapping system detailed design report

    International Nuclear Information System (INIS)

    Bauer, R.G.; O'Callaghan, P.B.

    1996-08-01

    The Hanford Site's 100 Area production reactors released radioactively and chemically contaminated liquids into the soil column. The primary source of the contaminated liquids was reactor coolant and various waste waters released from planned liquid discharges, as well as pipelines, pipe junctions, and retention basins leaking into the disposal sites. Site remediation involves excavating the contaminated soils using conventional earthmoving techniques and equipment, treating as appropriate, transporting the soils, and disposing of the soils at the Environmental Restoration Disposal Facility (ERDF). To support remediation excavation, disposal, and documentation requirements, an automated radiological monitoring system was deemed necessary. The RCMS (Radioactive Contamination Mapping System) was designed to fulfill this need. This Detailed Design Report provides design information for the RCMS in accordance with Bechtel Hanford, Inc. Engineering Design Project Instructions

  10. Requirements and design concept for a facility mapping system

    International Nuclear Information System (INIS)

    Barry, R.E.; Burks, B.L.; Little, C.Q.

    1995-01-01

    The Department of Energy (DOE) has for some time been considering the Decontamination and Dismantlement (D&D) of facilities which are no longer in use, but which are highly contaminated with radioactive wastes. One of the holdups in performing the D&D task is the accumulation of accurate facility characterizations that can enable a safe and orderly cleanup process. According to the Technical Strategic Plan for the Decontamination and Decommissioning Integrated Demonstration, "the cost of characterization using current baseline technologies for approximately 100 acres of gaseous diffusion plant at Oak Ridge alone is, for the most part, incalculable". Automated, robotic techniques will be necessary for initial characterization and continued surveillance of these types of sites. Robotic systems are being designed and constructed to accomplish these tasks. This paper describes requirements and design concepts for a system to accurately map a facility contaminated with hazardous wastes. Some of the technologies involved in the Facility Mapping System are: remote characterization with teleoperated, sensor-based systems, fusion of data sets from multiple characterization systems, and object recognition from 3D data models. This Facility Mapping System is being assembled by Oak Ridge National Laboratory for the DOE Office of Technology Development Robotics Technology Development Program

  11. Mapping the Gaps: Building a pipeline for contributing and accessing crowdsourced bathymetry data

    Science.gov (United States)

    Rosenberg, A. M.; Jencks, J. H.; Robertson, E.; Reed, A.

    2017-12-01

    Both the Moon and Mars have been more comprehensively mapped than the Earth's oceans. Notably, less than 15% of the world's deep ocean and 50% of the world's coastal waters (shallower than 200 m) have been mapped. The IHO Data Center for Digital Bathymetry (DCDB) provides the infrastructure and interface for archiving, discovery, display and retrieval of crowdsourced bathymetry (CSB) contributed by mariners around the world. NCEI, in partnership with NOAA's Office of Coast Survey and Rose Point Navigation Systems, established a citizen science pilot program in 2015 to harvest CSB from electronic navigation systems. Today, data providers can submit xyz, csv, or geoJSON for automated ingest, while other formats can be accommodated with minimal system code changes. Like most marine geophysical datasets at NCEI, users can discover, filter, and request CSB data via a map viewer (https://maps.ngdc.noaa.gov/viewers/csb/). Now that the CSB pipeline has been established, NCEI has begun to plan future work that includes expanding the current infrastructure to account for increasing data volumes and implementing a point storage technology that would allow results to be dynamically generated and displayed through heat maps, while continuing to increase the number of data contributors to the IHO CSB initiative.
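
    The record lists xyz, csv and geoJSON among the accepted submission formats. As a hedged illustration of what an xyz-to-GeoJSON conversion step in such an ingest pipeline could look like, the sketch below converts 'lon lat depth' records into a FeatureCollection; the property name `depth` and the input layout are assumptions, not the DCDB's actual submission schema.

```python
import json

def xyz_to_geojson(xyz_lines):
    """Convert 'lon lat depth' text records into a GeoJSON FeatureCollection."""
    features = []
    for line in xyz_lines:
        lon, lat, depth = (float(v) for v in line.split())
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"depth": depth},   # assumed property name, for illustration
        })
    return {"type": "FeatureCollection", "features": features}

if __name__ == "__main__":
    soundings = ["-70.25 42.05 31.4", "-70.26 42.06 33.1"]
    print(json.dumps(xyz_to_geojson(soundings), indent=2))
```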

  12. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents an analysis of modern tools for the automated testing of various web-based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing, along with overall cost reduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  13. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    Full Text Available This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as the network infrastructure connecting its parts. The proposed system consists of two main components. The first is the server, which constitutes the system core and manages and controls the user's home; users and the system administrator can manage and control the system locally (over the local area network) or remotely (over the internet). The second is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most home automation systems available on the market, the proposed system is scalable, in that one server can manage many hardware interface modules as long as they exist within network coverage. The system supports a wide range of home automation devices such as appliances, power management components, and security components. The proposed system is better in terms of flexibility and scalability than the commercially available home automation systems

  14. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  15. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  16. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
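
    The record credits NAWK, a pattern scanning and processing language, for the monitoring logic. A rough Python analogue of that kind of continuous pattern scan is sketched below; the log format, the `LOGIN FAIL` pattern and the alert threshold are invented for illustration and are not the actual ASNS rules.

```python
import re
import sys

# Illustrative pattern: repeated failed logins in a telephone-system log.
ALERT = re.compile(r"LOGIN\s+FAIL", re.IGNORECASE)

def monitor(stream, threshold=3):
    """Scan a log stream line by line and flag bursts of suspicious entries."""
    strikes = 0
    for line in stream:
        if ALERT.search(line):
            strikes += 1
            if strikes >= threshold:
                print(f"ALERT: {strikes} suspicious entries seen", file=sys.stderr)
        else:
            strikes = 0   # reset on any clean line

if __name__ == "__main__":
    monitor(iter([
        "08:00 LOGIN OK user=ops",
        "08:01 LOGIN FAIL user=x",
        "08:01 LOGIN FAIL user=x",
        "08:02 LOGIN FAIL user=x",
    ]))
```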

  17. Mapping social behavior-induced brain activation at cellular resolution in the mouse

    Science.gov (United States)

    Kim, Yongsoo; Venkataraju, Kannan Umadevi; Pradhan, Kith; Mende, Carolin; Taranda, Julian; Turaga, Srinivas C.; Arganda-Carreras, Ignacio; Ng, Lydia; Hawrylycz, Michael J.; Rockland, Kathleen; Seung, H. Sebastian; Osten, Pavel

    2014-01-01

    Understanding how brain activation mediates behaviors is a central goal of systems neuroscience. Here we apply an automated method for mapping brain activation in the mouse in order to probe how sex-specific social behaviors are represented in the male brain. Our method uses the immediate early gene c-fos, a marker of neuronal activation, visualized by serial two-photon tomography: the c-fos-GFP-positive neurons are computationally detected, their distribution is registered to a reference brain and a brain atlas, and their numbers are analyzed by statistical tests. Our results reveal distinct and shared female and male interaction-evoked patterns of male brain activation representing sex discrimination and social recognition. We also identify brain regions whose degree of activity correlates to specific features of social behaviors and estimate the total numbers and the densities of activated neurons per brain areas. Our study opens the door to automated screening of behavior-evoked brain activation in the mouse. PMID:25558063

  18. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    Science.gov (United States)

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios and a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, for screeners with less experience (shorter tenure), EDSCB with an automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates at the human-machine system level, which would still be acceptable from an operational point of view. The results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. 21 CFR 864.5600 - Automated hematocrit instrument.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Automated and Semi-Automated Hematology Devices. Section 864.5600: Automated hematocrit instrument...

  20. 21 CFR 862.2900 - Automated urinalysis system.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs. Section 862.2900: Automated urinalysis system. (a) Identification. An automated urinalysis system is a device... that duplicate manual urinalysis systems. This device is used in conjunction with certain materials to...

  1. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies that pursue automation of their plants without external resources have at their disposal flexible, custom and easy-to-use DCS that are open towards PLCs. This article explains why Hoechst has followed this route in the automation of new resin production plants

  2. A map of directional genetic interactions in a metazoan cell.

    Science.gov (United States)

    Fischer, Bernd; Sandmann, Thomas; Horn, Thomas; Billmann, Maximilian; Chaudhary, Varun; Huber, Wolfgang; Boutros, Michael

    2015-03-06

    Gene-gene interactions shape complex phenotypes and modify the effects of mutations during development and disease. The effects of statistical gene-gene interactions on phenotypes have been used to assign genes to functional modules. However, directional, epistatic interactions, which reflect regulatory relationships between genes, have been challenging to map at large-scale. Here, we used combinatorial RNA interference and automated single-cell phenotyping to generate a large genetic interaction map for 21 phenotypic features of Drosophila cells. We devised a method that combines genetic interactions on multiple phenotypes to reveal directional relationships. This network reconstructed the sequence of protein activities in mitosis. Moreover, it revealed that the Ras pathway interacts with the SWI/SNF chromatin-remodelling complex, an interaction that we show is conserved in human cancer cells. Our study presents a powerful approach for reconstructing directional regulatory networks and provides a resource for the interpretation of functional consequences of genetic alterations.

  3. Small cities face greater impact from automation

    Science.gov (United States)

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  4. Small cities face greater impact from automation.

    Science.gov (United States)

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  5. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  6. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  7. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
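
    MRmap's fitting internals are not shown in the record. As a hedged illustration of what pixel-wise multi-echo T2/T2* mapping involves, the sketch below fits the mono-exponential decay S(TE) = S0 exp(-TE/T2) independently at each pixel with SciPy; `t2_map` and `mono_exp` are illustrative names for a toy re-implementation of the idea, not MRmap itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2):
    """Mono-exponential decay model S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

def t2_map(echoes, te_ms):
    """Pixel-wise T2 fit over a stack of echo images (n_echoes, rows, cols)."""
    _, rows, cols = echoes.shape
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            signal = echoes[:, r, c]
            try:
                (s0, t2), _ = curve_fit(mono_exp, te_ms, signal,
                                        p0=(signal[0], 50.0), maxfev=200)
                out[r, c] = t2
            except RuntimeError:
                out[r, c] = 0.0   # fit failed: leave pixel empty
    return out

if __name__ == "__main__":
    te = np.array([10.0, 20.0, 40.0, 80.0])
    truth = 60.0   # ms
    stack = 1000.0 * np.exp(-te / truth)[:, None, None] * np.ones((4, 2, 2))
    print(t2_map(stack, te))   # ~60 ms everywhere
```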

  8. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  9. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Automated and Semi-Automated Hematology Devices. Section 864.5620: Automated hemoglobin system...

  10. 21 CFR 864.5200 - Automated cell counter.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Automated and Semi-Automated Hematology Devices. Section 864.5200: Automated cell counter...

  11. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Automated and Semi-Automated Hematology Devices. Section 864.5680: Automated heparin analyzer...

  12. 21 CFR 864.5850 - Automated slide spinner.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices; Hematology and Pathology Devices; Automated and Semi-Automated Hematology Devices. Section 864.5850: Automated slide spinner...

  13. Improvement of Vivarium Biodecontamination through Data-acquisition Systems and Automation.

    Science.gov (United States)

    Devan, Shakthi Rk; Vasu, Suresh; Mallikarjuna, Yogesha; Ponraj, Ramkumar; Kamath, Gireesh; Poosala, Suresh

    2018-03-01

    Biodecontamination is important for eliminating pathogens at research animal facilities, thereby preventing contamination within barrier systems. We enhanced our facility's standard biodecontamination method to replace the traditional foggers, and the new system was used effectively after bypass ducts were created in the HVAC units so that individual rooms could be isolated. The entire system was controlled by in-house-developed supervisory control and data-acquisition software that supported multiple decontamination cycles by equipment that had different decontamination capacities, operated in parallel, and used different agents, including H2O2 vapor and ClO2 gas. The process was validated according to facility mapping, and effectiveness was assessed by using biologic (Geobacillus stearothermophilus) and chemical indicator strips, which were positioned before decontamination, and by sampling contact plates after the completion of each cycle. Biologic indicators showed a 6-log reduction in microbial counts after successful decontamination cycles for both agents, and the agents were found to be compatible with clean-room panels and with materials commonly used in a vivarium, such as racks, cages, trolleys, cage-changing stations, biosafety cabinets, refrigerators and other equipment in both procedure and animal rooms. In conclusion, the automated process enabled users to perform effective decontamination through multiple cycles with real-time documentation and provided additional capability to deal with potential outbreaks. Software integration of the automation improved quality-control systems in our vivarium.

  14. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. This book consists of two parts: I, Agent-Based Complex Automated Negotiations, and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents that have different negotiation strategies and are implemented by different developers automatically negotiate in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  15. Geologic mapping using LANDSAT data

    Science.gov (United States)

    Siegal, B. S.; Abrams, M. J.

    1976-01-01

    The feasibility of automated classification for lithologic mapping with LANDSAT digital data was evaluated using three classification algorithms. The two supervised algorithms analyzed, a linear discriminant analysis algorithm and a hybrid algorithm which incorporated the Parallelepiped algorithm and the Bayesian maximum likelihood function, were comparable in terms of accuracy; however, classification was only 50 per cent accurate. The linear discriminant analysis algorithm was three times as efficient as the hybrid approach. The unsupervised classification technique, which incorporated the CLUS algorithm, delineated the major lithologic boundaries and, in general, correctly classified the most prominent geologic units. The unsupervised algorithm was not as efficient nor as accurate as the supervised algorithms. Analysis of spectral data for the lithologic units in the 0.4 to 2.5 microns region indicated that a greater separability of the spectral signatures could be obtained using wavelength bands outside the region sensed by LANDSAT.
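
    Of the algorithms compared, the parallelepiped stage of the hybrid classifier is simple to sketch: each class gets an axis-aligned box (mean plus or minus k standard deviations per band), and a pixel is assigned to a class whose box contains it. The band values and class names below are invented for illustration, and the Bayesian maximum-likelihood stage that would resolve ambiguous pixels is omitted.

```python
import numpy as np

def fit_boxes(samples_by_class, k=2.0):
    """Per-class parallelepipeds: mean +/- k standard deviations in each band."""
    boxes = {}
    for name, samples in samples_by_class.items():
        mu, sd = samples.mean(axis=0), samples.std(axis=0)
        boxes[name] = (mu - k * sd, mu + k * sd)
    return boxes

def classify(pixel, boxes):
    """Assign the first class whose box contains the pixel; None if unclassified.

    A full hybrid scheme would pass ambiguous pixels to a maximum-likelihood
    stage instead of returning the first match.
    """
    for name, (lo, hi) in boxes.items():
        if np.all(pixel >= lo) and np.all(pixel <= hi):
            return name
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    training = {
        "basalt":    rng.normal([40, 35, 30, 25], 3, size=(50, 4)),
        "limestone": rng.normal([90, 95, 85, 80], 4, size=(50, 4)),
    }
    boxes = fit_boxes(training)
    print(classify(np.array([41, 36, 29, 24]), boxes))   # -> 'basalt'
```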

  16. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP proj...

  17. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  18. Automation Revolutionize the Business Service Industry

    OpenAIRE

    Marciniak, Róbert

    2017-01-01

    In the last decades, significant disruptive changes have begun with the extended use of automation. Many jobs have changed or disappeared, and others have been created entirely by automation. Together with the progress of technology, automation first spread in the industrial sector, mostly on production and assembly lines. The growth may continue further in the future; researchers expect more than 35 million industrial robots globally by 2018. But it shades the situati...

  19. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The authors identify the following key areas for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; and integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  20. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nyköping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was used to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty in dividing attention between performing a task and overall monitoring, as the major problems. The positive aspect of manual operations lies in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty in following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (author)

  1. Levels of automation and user control - evaluation of a turbine automation interface

    International Nuclear Information System (INIS)

    Andersson, Jonas

    2008-10-01

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nyköping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was used to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty in dividing attention between performing a task and overall monitoring, as the major problems. The positive aspect of manual operations lies in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty in following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (au)

  2. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  3. Fatigue and voluntary utilization of automation in simulated driving.

    Science.gov (United States)

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  4. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most intensively studied branches of artificial intelligence today: machine learning. It covers topics such as inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning and theoretical approaches to machine learning.

  5. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  6. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
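
    Taxi's actual API is not reproduced in the record. The toy runner below only illustrates the general workflow-manager idea the abstract describes, i.e. executing dependent steps (generate gauge configurations, then measure observables over them) in topological order; all names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    action: callable
    deps: list = field(default_factory=list)

def run(tasks):
    """Run tasks in dependency order (simple topological execution)."""
    done = set()
    pending = list(tasks)
    while pending:
        progressed = False
        for t in list(pending):
            if all(d in done for d in t.deps):
                t.action()
                done.add(t.name)
                pending.remove(t)
                progressed = True
        if not progressed:
            raise RuntimeError("dependency cycle among tasks")

if __name__ == "__main__":
    log = []
    tasks = [
        Task("measure", lambda: log.append("measure observables"), deps=["generate"]),
        Task("generate", lambda: log.append("generate gauge configs")),
    ]
    run(tasks)
    print(log)   # ['generate gauge configs', 'measure observables']
```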

  7. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  8. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  9. Automated customized retrieval of radiotherapy data for clinical trials, audit and research.

    Science.gov (United States)

    Romanchikova, Marina; Harrison, Karl; Burnet, Neil G; Hoole, Andrew Cf; Sutcliffe, Michael Pf; Parker, Michael Andrew; Jena, Rajesh; Thomas, Simon James

    2018-02-01

    To enable fast and customizable automated collection of radiotherapy (RT) data from tomotherapy storage. Human-readable data maps (TagMaps) were created to generate DICOM-RT (Digital Imaging and Communications in Medicine standard for Radiation Therapy) data from tomotherapy archives, and provided access to "hidden" information comprising delivery sinograms, positional corrections and adaptive-RT doses. 797 data sets totalling 25,000 scans were batch-exported in 31.5 h. All archived information was restored, including the data not available via commercial software. The exported data were DICOM-compliant and compatible with major commercial tools including RayStation, Pinnacle and ProSoma. The export ran without operator interventions. The TagMap method for DICOM-RT data modelling produced software that was many times faster than the vendor's solution, required minimal operator input and delivered high volumes of vendor-identical DICOM data. The approach is applicable to many clinical and research data processing scenarios and can be adapted to recover DICOM-RT data from other proprietary storage types such as Elekta, Pinnacle or ProSoma. Advances in knowledge: A novel method to translate data from proprietary storage to DICOM-RT is presented. It provides access to the data hidden in electronic archives, offers a working solution to the issues of data migration and vendor lock-in and paves the way for large-scale imaging and radiomics studies.
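
    The TagMap syntax itself is not published in the record. As an illustration of the general idea, a map from DICOM keywords to proprietary record keys could drive dataset population with pydicom, as sketched below; the mapping entries and record fields are hypothetical, not the authors' actual TagMap format.

```python
from pydicom.dataset import Dataset

# A hypothetical human-readable "TagMap": DICOM keyword -> key in the
# proprietary archive record. The real TagMap syntax is not shown here.
TAGMAP = {
    "PatientID": "patient_id",
    "Modality": "modality",
    "SeriesDescription": "plan_name",
}

def archive_record_to_dicom(record, tagmap=TAGMAP):
    """Populate a DICOM dataset from a proprietary archive record via a tag map."""
    ds = Dataset()
    for keyword, source_key in tagmap.items():
        setattr(ds, keyword, record[source_key])   # pydicom resolves keywords to tags
    return ds

if __name__ == "__main__":
    record = {"patient_id": "ANON001", "modality": "RTDOSE", "plan_name": "adaptive_fx3"}
    print(archive_record_to_dicom(record))
```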

  10. Low Cost Multi-Sensor Robot Laser Scanning System and its Accuracy Investigations for Indoor Mapping Application

    Science.gov (United States)

    Chen, C.; Zou, X.; Tian, M.; Li, J.; Wu, W.; Song, Y.; Dai, W.; Yang, B.

    2017-11-01

    In order to automate 3D indoor mapping tasks, a low-cost multi-sensor robot laser scanning system is proposed in this paper. The system includes a panorama camera, a laser scanner, an inertial measurement unit and other sensors, which are calibrated and synchronized to achieve simultaneous collection of 3D indoor data. Experiments were undertaken in a typical indoor scene, and the data generated by the proposed system were compared with ground-truth data collected by a TLS scanner, with 99.2% of points deviating by less than 0.25 m, which demonstrates the applicability and precision of the system in indoor mapping applications.

  11. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial to improving a robot workcell. Design automation of multi-function fingers is in high demand from robot industries seeking to overcome the current iterative, time-consuming and complex manual design process. However, the existing approaches to multi-function finger design automation are unable to fully meet this need. This paper proposes a generic approach for the design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  12. Identifying Requirements for Effective Human-Automation Teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; John O' Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  13. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that required extravehicular activity (EVA) and to some extent intravehicular activity (IVA) manpower requirements for required processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and automated machine processing.

  14. Automated segmentation of dental CBCT image with prior-guided sequential random forests

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 (United States); Chen, Ken-Chung; Tang, Zhen [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Xia, James J., E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery, Shanghai Jiao Tong University School of Medicine, Shanghai Ninth People’s Hospital, Shanghai 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 and Department of Brain and Cognitive Engineering, Korea University, Seoul 02841 (Korea, Republic of)

    2016-01-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating 3D models for diagnosis and treatment planning. However, due to image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, CBCT images are difficult to segment. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both the mandible and the maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide important prior guidance for CBCT segmentation. The authors then extract both appearance features from the CBCTs and context features from the initial probability maps to train the first layer of random forest classifiers, which can select discriminative features for segmentation. Based on the first layer of trained classifiers, the probability maps are updated and then employed to train the next layer of random forest classifiers. By iteratively training subsequent random forest classifiers using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated against manually labeled ground truth. The average Dice ratios for the mandible and maxilla with the authors' method were 0.94 and 0.91, respectively, which are significantly better than the state-of-the-art method based on sparse representation (p-value < 0.001). Conclusions: The authors have developed and validated a novel fully automated method
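
    The iterative, prior-guided training scheme can be illustrated compactly. The sketch below, using scikit-learn on synthetic data, shows only the auto-context pattern the abstract describes: each layer of random forests is retrained on the original appearance features plus the probability map produced by the previous layer, starting from a prior map. Feature dimensions, layer count, and data are invented stand-ins, not real CBCT patches.

```python
# Minimal sketch of prior-guided sequential random forests (auto-context).
# Toy voxel features stand in for 3-D CBCT patches; the prior map stands
# in for the majority-voting atlas prior described in the abstract.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_voxels = 5000
appearance = rng.normal(size=(n_voxels, 8))                     # assumed appearance features
labels = (appearance[:, 0] + appearance[:, 1] > 0).astype(int)  # toy ground truth

# Prior probability map, e.g. from majority voting over aligned atlases.
prior = np.clip(labels + rng.normal(scale=0.8, size=n_voxels), 0, 1).reshape(-1, 1)

prob_map = prior
classifiers = []
for layer in range(3):
    # Context features = current probability map (the real method samples
    # the map at displaced patch locations; one value per voxel suffices here).
    X = np.hstack([appearance, prob_map])
    clf = RandomForestClassifier(n_estimators=50, random_state=layer)
    clf.fit(X, labels)
    # The updated probability map becomes a feature for the next layer.
    prob_map = clf.predict_proba(X)[:, 1].reshape(-1, 1)
    classifiers.append(clf)
    print(f"layer {layer}: training accuracy = {clf.score(X, labels):.3f}")
```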

  15. Automated segmentation of dental CBCT image with prior-guided sequential random forests

    International Nuclear Information System (INIS)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Chen, Ken-Chung; Tang, Zhen; Xia, James J.; Shen, Dinggang

    2016-01-01

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating 3D models for diagnosis and treatment planning. However, due to image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, CBCT images are difficult to segment. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both the mandible and the maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide important prior guidance for CBCT segmentation. The authors then extract both appearance features from the CBCTs and context features from the initial probability maps to train the first layer of random forest classifiers, which can select discriminative features for segmentation. Based on the first layer of trained classifiers, the probability maps are updated and then employed to train the next layer of random forest classifiers. By iteratively training subsequent random forest classifiers using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated against manually labeled ground truth. The average Dice ratios for the mandible and maxilla with the authors' method were 0.94 and 0.91, respectively, which are significantly better than the state-of-the-art method based on sparse representation (p-value < 0.001). Conclusions: The authors have developed and validated a novel fully automated method

  16. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform, and the new assessment type uses Scilab as the learning and solution validation tool. This approach allows the use of automated individual variant generation and automated solution checks without involving the course…
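
    As a rough illustration of the pattern the abstract describes (the actual implementation validates solutions with Scilab on Open edX), here is a minimal Python sketch of seed-based individual variant generation paired with an automated answer check. The problem type, tolerance, and student id are invented for the example.

```python
# Illustrative sketch: each student gets a deterministically generated
# problem variant, and submitted answers are checked automatically.
# The linear-equation task and tolerance are assumptions, not the
# course's actual Scilab-based tasks.

import random

def generate_variant(student_id: str):
    """Deterministically derive problem parameters from the student id."""
    rng = random.Random(student_id)          # same student -> same variant
    a, b = rng.randint(2, 9), rng.randint(1, 20)
    prompt = f"Solve {a}*x + {b} = 0 for x."
    expected = -b / a
    return prompt, expected

def check_solution(student_id: str, submitted: float, tol: float = 1e-6) -> bool:
    """Regenerate the expected answer and compare within a tolerance."""
    _, expected = generate_variant(student_id)
    return abs(submitted - expected) <= tol

prompt, expected = generate_variant("student-42")
print(prompt)
print("correct:", check_solution("student-42", expected))  # True by construction
```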

  17. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators under high workload can distrust and then poorly monitor automation, which has generally been inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitoring automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. These results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems.
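
    The mediation finding (trust mediating the effect of tracking-task load on attention allocation) corresponds to a standard product-of-coefficients analysis. Below is a minimal sketch on synthetic data, assuming statsmodels is available; the effect sizes and sample are invented, not the study's data.

```python
# Sketch of a product-of-coefficients mediation test:
# task load -> trust (path a), trust -> attention allocation (path b),
# indirect effect = a * b. Data are simulated for illustration only.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
load = rng.integers(0, 2, n).astype(float)                  # tracking difficulty (0/1)
trust = 5 - 1.2 * load + rng.normal(scale=1.0, size=n)      # path a: load lowers trust
attention = 0.4 * trust + rng.normal(scale=1.0, size=n)     # path b: trust raises monitoring

# Path a: regress the mediator (trust) on the predictor (load).
a = sm.OLS(trust, sm.add_constant(load)).fit().params[1]

# Path b: regress the outcome (attention) on load and trust together.
model_b = sm.OLS(attention, sm.add_constant(np.column_stack([load, trust]))).fit()
b = model_b.params[2]

print(f"indirect effect a*b = {a * b:.3f}")  # negative: load reduces attention via trust
```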

  18. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibody or enzyme, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points, such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  19. Aviation Safety/Automation Program Conference

    Science.gov (United States)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange on current operational problems and on program results to date. The primary goal of the Aviation Safety/Automation Program is to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  20. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.
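
    The abstract describes a two-part architecture (off-line planning plus a modular, sensor-integrated real-time controller) without implementation detail, so the following Python sketch is a generic illustration of that modular pattern only; all class names, sensor readings, and gains are hypothetical and are not part of PAWS.

```python
# Generic sketch of a modular real-time weld-control loop: pluggable
# sensors feed a controller that corrects a process parameter toward
# a planned setpoint. Every name and value here is a hypothetical
# stand-in, not the PAWS implementation.

from typing import Callable

class WeldController:
    """Real-time loop that corrects weld current from pluggable sensors."""

    def __init__(self, target_pool_width_mm: float, gain: float = 5.0):
        self.target = target_pool_width_mm   # setpoint from the planning stage
        self.gain = gain
        self.sensors: list[Callable[[], float]] = []

    def add_sensor(self, read: Callable[[], float]) -> None:
        self.sensors.append(read)            # modular: any callable sensor plugs in

    def step(self, current_amps: float) -> float:
        # Fuse sensor readings by simple averaging; a real system would
        # use a more sophisticated fusion and control law.
        width = sum(s() for s in self.sensors) / len(self.sensors)
        return current_amps + self.gain * (self.target - width)

# Hypothetical sensors standing in for vision / acoustic weld-pool sensing.
ctrl = WeldController(target_pool_width_mm=6.0)
ctrl.add_sensor(lambda: 5.8)
ctrl.add_sensor(lambda: 6.1)
print(f"corrected current: {ctrl.step(150.0):.1f} A")
```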