WorldWideScience

Sample records for streamlined automated approaches

  1. Streamlined approach to waste management at CRL

    International Nuclear Information System (INIS)

    Adams, L.; Campbell, B.

    2011-01-01

    Radioactive, mixed, hazardous and non-hazardous wastes have been and continue to be generated at Chalk River Laboratories (CRL) as a result of research and development activities and operations since the 1940s. Over the years, the wastes produced as a byproduct of activities delivering the core missions of the CRL site have been of many types, and today over thirty distinct waste streams have been identified, all requiring efficient management. With the commencement of decommissioning of the legacy created during the development of the Canadian nuclear industry, the volumes and range of wastes to be managed have been increasing in the near term, and this trend will continue into the future. The development of a streamlined approach to waste management is key to successful waste management at CRL. Waste management guidelines that address all of the requirements have become complex, and so have the various waste management groups receiving waste, with their many different processes and capabilities. This has made it difficult for waste generators to understand all of the requirements to be satisfied for the various CRL waste receivers, whose primary concerns are to operate safely and in compliance with their acceptance criteria and license conditions. As a result, waste movement on site can often be very slow, especially for non-routine waste types. Recognizing an opportunity for improvement, the Waste Management organization at CRL has implemented a more streamlined approach with emphasis on early identification of waste type and possible disposition path. This paper presents a streamlined approach to waste identification and waste management at CRL, the implementation methodology applied and the early results achieved from this process improvement. (author)

  2. ExoMars Trace Gas Orbiter Instrument Modelling Approach to Streamline Science Operations

    Science.gov (United States)

    Munoz Fernandez, Michela; Frew, David; Ashman, Michael; Cardesin Moinelo, Alejandro; Garcia Beteta, Juan Jose; Geiger, Bernhard; Metcalfe, Leo; Nespoli, Federico; Muniz Solaz, Carlos

    2018-05-01

    ExoMars Trace Gas Orbiter (TGO) science operations activities are centralised at ESAC's Science Operations Centre (SOC). The SOC receives inputs from the principal investigators (PIs) in order to implement and deliver the spacecraft pointing requests and instrument timelines to the Mission Operations Centre (MOC). The high number of orbits per planning cycle has made it necessary to abstract the planning interactions between the SOC and the PI teams at the observation level. This paper describes the modelling approach we have applied to TGO's instruments to streamline science operations. We have created dynamic observation types that scale to adapt to the conditions specified by the PI teams, including observation timing, and pointing block parameters calculated from observation geometry. This approach is an improvement with respect to previous missions, where the generation of the observation pointing and commanding requests was performed manually by the instrument teams. Automation software assists us in effectively handling the high density of planned orbits with an increasing volume of scientific data, and in successfully meeting opportunistic scientific goals and objectives. Our planning tool combines the instrument observation definition files provided by the PIs with the flight dynamics products to generate the Pointing Requests and the instrument timeline (ITL). The ITL contains all the validated commands at the TC sequence level and computes the resource envelopes (data rate, power, data volume) within the constraints. At the SOC, our main goal is to maximise the science output while minimising the number of iterations among the teams, ensuring that the timeline does not violate the state transitions allowed in the Mission Operations Rules and Constraints Document.

  3. Smart management of sample dilution using an artificial neural network to achieve streamlined processes and saving resources: the automated nephelometric testing of serum free light chain as case study.

    Science.gov (United States)

    Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio

    2017-02-01

    Saving resources is a paramount issue for the modern laboratory, and new trainable, smart technologies can allow automated instrumentation to manage samples more efficiently and achieve streamlined processes. In this regard, serum free light chain (sFLC) testing represents an interesting challenge, as it usually requires a number of assays before an acceptable result within the analytical range is achieved. An artificial neural network based on the multi-layer perceptron (MLP-ANN) was used to infer the starting dilution status of sFLC samples based on the information available through the laboratory information system (LIS). After the learning phase, the MLP-ANN simulation was applied to the nephelometric testing routinely performed in our laboratory on a BN ProSpec® System analyzer (Siemens Healthcare) using the N Latex FLC kit. The MLP-ANN reduced the serum kappa free light chain (κ-FLC) and serum lambda free light chain (λ-FLC) wasted tests by 69.4% and 70.8% with respect to the naïve stepwise dilution scheme used by the automated analyzer, and by 64.9% and 66.9% compared to a "rational" dilution scheme based on a 4-step dilution. Although it was restricted to follow-up samples, the MLP-ANN showed good predictive performance, which, alongside the possibility of implementing it in any automated system, makes it a suitable solution for achieving streamlined laboratory processes and saving resources.
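
    The record above describes routing each sample directly to a predicted starting dilution instead of stepping through dilutions one by one. Below is a minimal sketch of that idea using a scikit-learn multi-layer perceptron; the feature set (previous κ-FLC and λ-FLC results, days since last test), the dilution-step encoding, and the synthetic data are assumptions for illustration, not details from the paper.

    ```python
    # Minimal sketch of LIS-driven dilution prediction (assumed features and
    # synthetic data; the paper's actual inputs and network are not given here).
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical features per follow-up sample: previous kappa result,
    # previous lambda result, days since last test (all available in the LIS).
    X = rng.random((500, 3))
    # Target: starting dilution step (0 = neat, 1 = 1:10, 2 = 1:100, ...).
    y = rng.integers(0, 3, 500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)

    # Each new sample is sent straight to its predicted starting dilution,
    # avoiding the analyzer's naive stepwise scheme and the wasted tests.
    print("suggested dilution steps:", clf.predict(X_test[:5]))
    ```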

  4. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  5. Streamlined bioreactor-based production of human cartilage tissues.

    Science.gov (United States)

    Tonnarelli, B; Santoro, R; Adelaide Asnaghi, M; Wendt, D

    2016-05-27

    Engineered tissue grafts have been manufactured using methods based predominantly on traditional labour-intensive manual benchtop techniques. These methods impart significant regulatory and economic challenges, hindering the successful translation of engineered tissue products to the clinic. Bioreactor-based production systems have the potential to overcome such limitations. In this work, we present an innovative manufacturing approach to engineer cartilage tissue within a single bioreactor system, starting from freshly isolated human primary chondrocytes through to the generation of cartilaginous tissue grafts. The limited number of primary chondrocytes that can be isolated from a small clinically-sized cartilage biopsy could be seeded and extensively expanded directly within a 3D scaffold in our perfusion bioreactor (5.4 ± 0.9 doublings in 2 weeks), bypassing conventional 2D expansion in flasks. Chondrocytes expanded in 3D scaffolds better maintained a chondrogenic phenotype than chondrocytes expanded in plastic flasks (collagen type II mRNA, 18-fold; Sox-9, 11-fold). After this "3D expansion" phase, bioreactor culture conditions were changed to support chondrogenic differentiation for two weeks. Engineered tissues based on 3D-expanded chondrocytes were more cartilaginous than tissues generated from chondrocytes previously expanded in flasks. We then demonstrated that this streamlined bioreactor-based process could be adapted to generate up-scaled cartilage grafts at a clinically relevant size (50 mm diameter). Streamlined and robust tissue engineering processes such as the one described here may be key for the future manufacturing of grafts for clinical applications, as they facilitate the establishment of compact and closed bioreactor-based production systems with minimal automation requirements, lower operating costs, and increased compliance with regulatory guidelines.

  6. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    Science.gov (United States)

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines, since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments showed that this electric valve performed unsatisfactorily with constant valve voltages during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system with improved performance was developed. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of the current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on the statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.
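
    The automation described above amounts to a feedback loop that adjusts the side-channel electrode drive as a zone passes the junction. The sketch below shows such a loop in its simplest proportional form; the sensing and actuation functions are hypothetical stand-ins, and the paper's actual control law is not reproduced here.

    ```python
    # Simplest-possible closed-loop sketch of an automated electric valve:
    # drive the side-channel electrodes so the measured distortion of the
    # current streamlines stays near zero. read_streamline_skew() and
    # set_side_current() are hypothetical hardware stubs, not a real API.
    def run_valve(read_streamline_skew, set_side_current,
                  gain=0.05, current_mA=0.0, steps=200):
        for _ in range(steps):
            skew = read_streamline_skew()   # 0 when streamlines are straight
            current_mA -= gain * skew       # proportional correction
            set_side_current(current_mA)
        return current_mA
    ```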

  7. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)]

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  8. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Harvey, J.T.; Dillard, J.W.

    1993-01-01

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  9. Expert system issues in automated, autonomous space vehicle rendezvous

    Science.gov (United States)

    Goodwin, Mary Ann; Bochsler, Daniel C.

    1987-01-01

    The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, the approach used, and its knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station programs and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.

  10. Approaches to automated protein crystal harvesting

    Energy Technology Data Exchange (ETDEWEB)

    Deller, Marc C., E-mail: mdeller@scripps.edu; Rupp, Bernhard, E-mail: mdeller@scripps.edu

    2014-01-28

    Approaches to automated and robot-assisted harvesting of protein crystals are critically reviewed. While no true turn-key solutions for automation of protein crystal harvesting are currently available, systems incorporating advanced robotics and micro-electromechanical systems represent exciting developments with the potential to revolutionize the way in which protein crystals are harvested.

  11. Cognitive Approaches to Automated Instruction.

    Science.gov (United States)

    Regian, J. Wesley, Ed.; Shute, Valerie J., Ed.

    This book contains a snapshot of state-of-the-art research on the design of automated instructional systems. Selected cognitive psychologists were asked to describe their approach to instruction and cognitive diagnosis, the theoretical basis of the approach, its utility and applicability, and the knowledge engineering or task analysis methods…

  12. Self-optimizing approach for automated laser resonator alignment

    Science.gov (United States)

    Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.

    2012-02-01

    Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competitive perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition exerted mainly from Asia. From an economic point of view, automated assembly of laser systems is a better approach to producing more reliable units at lower cost. However, the step from today's manual solutions towards automated assembly requires parallel developments regarding product design, automation equipment and assembly processes. This paper briefly introduces the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, which is one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for robot-based precision assembly, as well as passive and active alignment methods based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of laser resonator assembly. These results as well as future development perspectives are discussed.

  13. STREAMLINED APPROACH FOR ENVIRONMENTAL RESTORATION PLAN FOR CORRECTIVE ACTION UNIT 116: AREA 25 TEST CELL C FACILITY NEVADA TEST SITE, NEVADA

    International Nuclear Information System (INIS)

    2006-01-01

    This Streamlined Approach for Environmental Restoration Plan identifies the activities required for the closure of Corrective Action Unit 116, Area 25 Test Cell C Facility. The Test Cell C Facility is located in Area 25 of the Nevada Test Site, approximately 25 miles northwest of Mercury, Nevada.

  14. Air Force Information Workflow Automation through Synchronized Air Power Management (SAPM)

    National Research Council Canada - National Science Library

    Benkley, Carl; Chang, Irene; Crowley, John; Oristian, Thomas

    2004-01-01

    .... Implementing Extensible Markup Language (XML) messages, web services, and workflow automation, SAPM expands existing web-based capabilities, enables machine-to-machine interfaces, and streamlines the war fighter kill chain process...

  15. Lessons learned in streamlining the preparation of SNM standard solutions

    International Nuclear Information System (INIS)

    Clark, J.P.; Johnson, S.R.

    1986-01-01

    Improved safeguards measurements have produced a demand for greater quantities of reliable SNM solution standards. At the Savannah River Plant (SRP), the demand for these standards has been met by several innovations to improve the productivity and reliability of standards preparation. With the use of a computer-controlled balance, large batches of SNM stock solutions are prepared on a gravimetric basis. Accurately dispensed quantities of the stock solution are weighed and stored in bottles. When needed, they are quantitatively transferred to tared containers, matrix adjusted to target concentrations, weighed, and measured for density at 25 °C. Concentrations of SNM are calculated both gravimetrically and volumetrically. Calculated values are confirmed analytically before the standards are used in measurement control program (MCP) activities. The lessons learned include: MCP goals include error identification and management; strategy modifications are required to improve error management; administrative controls can minimize certain types of errors; automation can eliminate redundancy and streamline preparations; and prudence and simplicity enhance automation success. The effort expended to increase productivity has increased the reliability of standards and provided better documentation for quality assurance.
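
    Since the abstract describes computing each standard's concentration both gravimetrically and volumetrically (via the density measured at 25 °C), a small worked sketch of that dual bookkeeping follows; the function and variable names are illustrative, not from the paper.

    ```python
    # Dual concentration calculation for a solution standard: gravimetric
    # (mass of SNM per kg of solution) and volumetric (mass of SNM per litre,
    # using the measured density at 25 °C). Names are illustrative only.
    def snm_concentrations(snm_mass_g, solution_mass_g, density_g_per_mL):
        gravimetric = snm_mass_g / (solution_mass_g / 1000.0)    # g SNM per kg
        volume_L = solution_mass_g / density_g_per_mL / 1000.0   # mL -> L
        volumetric = snm_mass_g / volume_L                       # g SNM per L
        return gravimetric, volumetric

    # Example: 5 g of SNM in 1 kg of solution with density 1.05 g/mL.
    g_per_kg, g_per_L = snm_concentrations(5.0, 1000.0, 1.05)
    print(f"{g_per_kg:.3f} g/kg, {g_per_L:.3f} g/L")   # 5.000 g/kg, 5.250 g/L
    ```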

  16. High-throughput, 384-well, LC-MS/MS CYP inhibition assay using automation, cassette-analysis technique, and streamlined data analysis.

    Science.gov (United States)

    Halladay, Jason S; Delarosa, Erlie Marie; Tran, Daniel; Wang, Leslie; Wong, Susan; Khojasteh, S Cyrus

    2011-08-01

    Here we describe a high-capacity, high-throughput, automated, 384-well CYP inhibition assay using well-known HLM-based MS probes. We provide consistently robust IC(50) values at the lead optimization stage of the drug discovery process. Our method uses the Agilent Technologies/Velocity11 BioCel 1200 system, timesaving techniques for sample analysis, and streamlined data processing steps. For each experiment, we generate IC(50) values for up to 344 compounds and positive controls for five major CYP isoforms (probe substrate): CYP1A2 (phenacetin), CYP2C9 ((S)-warfarin), CYP2C19 ((S)-mephenytoin), CYP2D6 (dextromethorphan), and CYP3A4/5 (testosterone and midazolam). Each compound is incubated separately at four concentrations with each CYP probe substrate under the optimized incubation conditions. Each incubation is quenched with acetonitrile containing the deuterated internal standard of the respective metabolite for each probe substrate. To minimize the number of samples to be analyzed by LC-MS/MS and reduce the amount of valuable MS runtime, we utilize the timesaving techniques of cassette analysis (pooling the incubation samples at the end of each CYP probe incubation into one) and column switching (reducing the amount of MS runtime). We also compare the IC(50) results for the five major CYP isoforms obtained with our method to values reported in the literature.
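
    With four inhibitor concentrations per compound, IC(50) values such as those the assay reports are typically obtained by fitting a concentration-response model. The sketch below fits a standard Hill equation with SciPy; the data points are invented, and the paper's actual fitting software and model settings are not specified here.

    ```python
    # Fit a standard Hill (sigmoidal inhibition) model to four-point data to
    # estimate IC50. Concentrations and activities below are invented examples.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, ic50, slope):
        """Fraction of control CYP activity remaining at inhibitor conc."""
        return 1.0 / (1.0 + (conc / ic50) ** slope)

    conc = np.array([0.1, 1.0, 10.0, 100.0])       # uM, four test concentrations
    activity = np.array([0.95, 0.80, 0.35, 0.08])  # fraction of control

    (ic50, slope), _ = curve_fit(hill, conc, activity, p0=(5.0, 1.0))
    print(f"IC50 ~ {ic50:.1f} uM (Hill slope {slope:.2f})")
    ```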

  17. System approach to automation and robotization of drivage

    Science.gov (United States)

    Zinov’ev, VV; Mayorov, AE; Starodubov, AN; Nikolaev, PI

    2018-03-01

    The authors consider a system approach to finding ways of eliminating human presence during drilling-and-blasting work in the face area by means of automation and robotization of operations, with a view to reducing injuries in mines. The analysis is carried out in terms of the drilling-and-blasting technology applied in the Makarevskoe Coal Field, Kuznetsk Coal Basin. Within the system-functional approach, and using the IDEF0 procedure, the processes of drilling and blasthole charging are decomposed into related elementary operations. Automation and robotization methods that avoid the presence of miners at the face are identified for each operation.

  18. Impact assessment: Eroding benefits through streamlining?

    Energy Technology Data Exchange (ETDEWEB)

    Bond, Alan, E-mail: alan.bond@uea.ac.uk [School of Environmental Sciences, University of East Anglia (United Kingdom); School of Geo and Spatial Sciences, North-West University (South Africa); Pope, Jenny, E-mail: jenny@integral-sustainability.net [Integral Sustainability (Australia); Curtin University Sustainability Policy Institute (Australia); Morrison-Saunders, Angus, E-mail: A.Morrison-Saunders@murdoch.edu.au [School of Geo and Spatial Sciences, North-West University (South Africa); Environmental Science, Murdoch University (Australia); Retief, Francois, E-mail: francois.retief@nwu.ac.za [School of Geo and Spatial Sciences, North-West University (South Africa); Gunn, Jill A.E., E-mail: jill.gunn@usask.ca [Department of Geography and Planning and School of Environment and Sustainability, University of Saskatchewan (Canada)

    2014-02-15

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  19. Impact assessment: Eroding benefits through streamlining?

    International Nuclear Information System (INIS)

    Bond, Alan; Pope, Jenny; Morrison-Saunders, Angus; Retief, Francois; Gunn, Jill A.E.

    2014-01-01

    This paper argues that Governments have sought to streamline impact assessment in recent years (defined as the last five years) to counter concerns over the costs and potential for delays to economic development. We hypothesise that this has had some adverse consequences on the benefits that subsequently accrue from the assessments. This hypothesis is tested using a framework developed from arguments for the benefits brought by Environmental Impact Assessment made in 1982 in the face of the UK Government opposition to its implementation in a time of economic recession. The particular benefits investigated are ‘consistency and fairness’, ‘early warning’, ‘environment and development’, and ‘public involvement’. Canada, South Africa, the United Kingdom and Western Australia are the jurisdictions tested using this framework. The conclusions indicate that significant streamlining has been undertaken which has had direct adverse effects on some of the benefits that impact assessment should deliver, particularly in Canada and the UK. The research has not examined whether streamlining has had implications for the effectiveness of impact assessment, but the causal link between streamlining and benefits does sound warning bells that merit further investigation. -- Highlights: • Investigation of the extent to which government has streamlined IA. • Evaluation framework was developed based on benefits of impact assessment. • Canada, South Africa, the United Kingdom, and Western Australia were examined. • Trajectory in last five years is attrition of benefits of impact assessment.

  20. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Touzani, Samir [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Taylor, Cody [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fernandes, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-28

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
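
    The core of automated M&V is a baseline model fitted to pre-retrofit interval data and projected into the post-retrofit period, with savings taken as avoided energy use. A minimal sketch follows; real tools use richer models (time-of-week and temperature regressions, for example), and the data here are synthetic.

    ```python
    # Minimal automated-M&V sketch: fit a baseline on pre-retrofit data,
    # project it over the post-retrofit period, and report avoided energy.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)

    # Synthetic hourly outdoor temperature and metered load, pre/post retrofit.
    temp_pre = rng.uniform(0, 30, 1000).reshape(-1, 1)
    load_pre = 50 + 2.0 * temp_pre.ravel() + rng.normal(0, 2, 1000)
    temp_post = rng.uniform(0, 30, 1000).reshape(-1, 1)
    load_post = 45 + 1.8 * temp_post.ravel() + rng.normal(0, 2, 1000)

    baseline = LinearRegression().fit(temp_pre, load_pre)
    predicted_post = baseline.predict(temp_post)   # counterfactual usage
    savings = (predicted_post - load_post).sum()   # avoided energy use
    print(f"estimated savings over the period: {savings:.0f} kWh")
    ```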

  1. Lean coding machine. Facilities target productivity and job satisfaction with coding automation.

    Science.gov (United States)

    Rollins, Genna

    2010-07-01

    Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.

  2. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.
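
    As a point of reference for what such pipelines automate, the sketch below strings together the standard spike-sorting steps (threshold detection, waveform extraction, PCA features, clustering). It is not the authors' algorithm or its cluster quality metrics, only a generic stand-in under assumed parameters.

    ```python
    # Generic spike-sorting pipeline sketch: detect threshold crossings,
    # extract waveforms, reduce with PCA, cluster. NOT the paper's algorithm.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def sort_spikes(trace, fs, thresh_sd=4.0, half_win=30, n_units=3):
        noise_sd = np.median(np.abs(trace)) / 0.6745        # robust noise level
        idx = np.flatnonzero(trace < -thresh_sd * noise_sd) # negative crossings
        if idx.size == 0:
            return np.array([]), np.array([])
        idx = idx[np.concatenate(([True], np.diff(idx) > half_win))]  # dedupe
        idx = idx[(idx > half_win) & (idx < len(trace) - half_win)]
        waveforms = np.stack([trace[i - half_win:i + half_win] for i in idx])
        feats = PCA(n_components=3).fit_transform(waveforms)
        labels = KMeans(n_clusters=n_units, n_init=10).fit_predict(feats)
        return idx / fs, labels             # spike times (s) and unit labels
    ```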

  3. Qualification of academic facilities for small-scale automated manufacture of autologous cell-based products.

    Science.gov (United States)

    Hourd, Paul; Chandra, Amit; Alvey, David; Ginty, Patrick; McCall, Mark; Ratcliffe, Elizabeth; Rayment, Erin; Williams, David J

    2014-01-01

    Academic centers, hospitals and small companies, as typical development settings for UK regenerative medicine assets, are significant contributors to the development of autologous cell-based therapies. Often lacking the appropriate funding, quality assurance heritage or specialist regulatory expertise, qualifying aseptic cell processing facilities for GMP compliance is a significant challenge. The qualification of a new Cell Therapy Manufacturing Facility with automated processing capability, the first of its kind in a UK academic setting, provides a unique demonstrator for the qualification of small-scale, automated facilities for GMP-compliant manufacture of autologous cell-based products in these settings. This paper shares our experiences in qualifying the Cell Therapy Manufacturing Facility, focusing on our approach to streamlining the qualification effort, the challenges, project delays and inefficiencies we encountered, and the subsequent lessons learned.

  4. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  5. ACHP | News | ACHP Issues Program Comment to Streamline Communication

    Science.gov (United States)

    ACHP Issues Program Comment to Streamline Communication Facilities Construction and Modification. The Advisory Council on...

  6. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  7. InterviewStreamliner, a minimalist, free, open source, relational approach to computer-assisted qualitative data analysis software

    NARCIS (Netherlands)

    H.D. Pruijt (Hans)

    2010-01-01

    InterviewStreamliner is a free, open source, minimalist alternative to complex computer-assisted qualitative data analysis packages. It builds on the flexibility of relational database management technology.

  8. Industrial Compositional Streamline Simulation for Efficient and Accurate Prediction of Gas Injection and WAG Processes

    Energy Technology Data Exchange (ETDEWEB)

    Margot Gerritsen

    2008-10-31

    number of streamlines to number of threads is sufficiently high, which is the case in real-field applications. This is an important result, as it eases the transition of serial to parallel streamline codes. The parallel speedup itself depends on the relative contribution of the tracing and mapping stages as compared to the solution of the transport equations along streamlines. As the physical complexity of the simulated 1D transport process increases, the contribution of the less efficient tracing and mapping stages is reduced and near-linear scalabilities can be obtained. Our work clearly shows that the owner approach, in which threads are assigned whole streamlines, is more attractive than a distributed model, in which streamline segments are assigned to threads, because it allows re-use of existing sequential code for the 1D streamline solves, also for implicit time-stepping algorithms.
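
    The owner approach described above, in which each worker owns whole streamlines and simply reruns the existing serial 1D transport solver on them, maps naturally onto a process pool. Below is a minimal sketch under that reading; solve_1d_transport is a placeholder for the real solver.

    ```python
    # "Owner" parallelisation sketch: each worker is handed whole streamlines
    # and runs the unchanged serial 1D transport solve on them, so existing
    # sequential solver code is reused. solve_1d_transport is a placeholder.
    from concurrent.futures import ProcessPoolExecutor

    def solve_1d_transport(streamline):
        # stand-in for the real 1D solve along one streamline
        return sum(streamline) / len(streamline)

    def solve_all(streamlines, n_workers=8):
        # near-linear scaling expected while len(streamlines) >> n_workers,
        # which the report notes is the case in real-field applications
        with ProcessPoolExecutor(max_workers=n_workers) as pool:
            return list(pool.map(solve_1d_transport, streamlines))

    if __name__ == "__main__":
        lines = [[float(i + j) for j in range(100)] for i in range(1000)]
        print(solve_all(lines)[:3])
    ```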

  9. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  10. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation.
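
    The final step the abstract describes, merging candidate features through a linear discriminant and scoring with leave-one-out ROC analysis, is straightforward to mirror with scikit-learn. The sketch below does so on synthetic features with the paper's class sizes (69 malignant, 401 benign); the feature values themselves are invented.

    ```python
    # Sketch of the classification stage: LDA over candidate features,
    # evaluated leave-one-out with ROC AUC. Features here are synthetic.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0.0, 1, (69, 4)),    # malignant candidates
                   rng.normal(0.7, 1, (401, 4))])  # benign candidates
    y = np.array([1] * 69 + [0] * 401)

    lda = LinearDiscriminantAnalysis()
    # leave-one-out evaluation, as in the paper
    scores = cross_val_predict(lda, X, y, cv=LeaveOneOut(),
                               method="predict_proba")[:, 1]
    print(f"AUC = {roc_auc_score(y, scores):.2f}")
    ```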

  11. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  12. Fleet Sizing of Automated Material Handling Using Simulation Approach

    Science.gov (United States)

    Wibisono, Radinal; Ai, The Jin; Ratna Yuniartha, Deny

    2018-03-01

    Automated material handling tends to be chosen over human power for material handling on the production floor in manufacturing companies. One critical issue in implementing automated material handling is the design phase, which must ensure that material handling is efficient in terms of cost. Fleet sizing is one of the topics in this design phase. In this research, a simulation approach is used to solve the fleet sizing problem in flow-shop production, where the optimum means minimum flow time and maximum capacity on the production floor. A simulation approach is appropriate because the flow shop can be modelled as a queueing network in which inter-arrival times do not follow an exponential distribution. The contribution of this research is therefore the solution of the multi-objective fleet sizing problem in flow-shop production using a simulation approach with ARENA software.
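
    A queueing simulation of the kind this abstract describes can be sketched with SimPy in place of ARENA: the vehicles are a capacitated resource, jobs queue for them, and sweeping the capacity exposes the flow-time versus fleet-size trade-off. The arrival and transport time distributions below are invented for illustration.

    ```python
    # Fleet-sizing sketch: automated vehicles as a capacitated resource in a
    # queueing model; sweep the fleet size and compare mean job flow times.
    import random
    import simpy

    def job(env, fleet, results):
        arrive = env.now
        with fleet.request() as req:                        # wait for a vehicle
            yield req
            yield env.timeout(random.expovariate(1 / 4.0))  # transport time
        results.append(env.now - arrive)                    # job flow time

    def simulate(fleet_size, n_jobs=500, seed=0):
        random.seed(seed)
        env = simpy.Environment()
        fleet = simpy.Resource(env, capacity=fleet_size)
        results = []

        def source():
            for _ in range(n_jobs):
                env.process(job(env, fleet, results))
                yield env.timeout(random.expovariate(1 / 2.0))  # inter-arrival

        env.process(source())
        env.run()
        return sum(results) / len(results)

    for size in (1, 2, 3, 4):
        print(f"fleet={size}: mean flow time {simulate(size):.1f}")
    ```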

  13. Automating payroll, billing, and medical records. Using technology to do more with less.

    Science.gov (United States)

    Vetter, E

    1995-08-01

    As home care agencies grow, so does the need to streamline the paperwork involved in running an agency. One agency found a way to reduce its payroll, billing, and medical records paperwork by implementing an automated, image-based data collection system that saves time, money, and paper.

  14. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining "What Works" in Healthcare.

    Science.gov (United States)

    Jonas, Wayne B; Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Answering the question of "what works" in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods (CAP, REAL, and EP) can be integrated into a strategic approach to help answer the question "what works in healthcare?" and what it means in a comprehensive way. SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and wellness.

  15. Creating customer value by streamlining business processes.

    Science.gov (United States)

    Vantrappen, H

    1992-02-01

    Much of the strategic preoccupation of senior managers in the 1990s is focusing on the creation of customer value. Companies are seeking competitive advantage by streamlining the three processes through which they interact with their customers: product creation, order handling and service assurance. 'Micro-strategy' is a term which has been coined for the trade-offs and decisions on where and how to streamline these three processes. The article discusses micro-strategies applied by successful companies.

  16. Hydrodynamic Drag on Streamlined Projectiles and Cavities

    KAUST Repository

    Jetly, Aditya

    2016-04-19

    The air cavity formation resulting from the water-entry of solid objects has been the subject of extensive research due to its application in various fields such as biology, marine vehicles, sports, and the oil and gas industries. Recently we demonstrated that, under certain conditions, following the closing of the air cavity formed by the initial impact of a superhydrophobic sphere on a free water surface, a stable streamlined air cavity can remain attached to the sphere. The sphere and its attached air cavity reach a steady state during free fall. In this thesis we further explore this novel phenomenon to quantify the drag on streamlined cavities. The drag on the sphere-cavity formation is then compared with the drag on solid projectiles designed to have a shape self-similar to that of the cavity. The solid projectiles, of adjustable weight, were produced using a 3D printing technique. In a set of free-fall experiments we determined the variation of the projectiles' drag coefficient as a function of their length-to-diameter ratio and specific weight, covering a range of intermediate Reynolds numbers, Re ~ 10^4 – 10^5, characteristic of our streamlined cavity experiments. Parallel free-fall experiments with a sphere-attached streamlined air cavity and a projectile of the same shape and effective weight clearly demonstrated the drag reduction effect due to the stress-free boundary condition at the cavity-liquid interface. The streamlined cavity experiments can be used as an upper-bound estimate of the drag reduction by air layers naturally sustained on superhydrophobic surfaces in contact with water. In the final part of the thesis we design an experiment to test the drag-reduction capacity of robust superhydrophobic coatings deposited on the surface of various model vessels.

  17. The design of the Comet streamliner: An electric land speed record motorcycle

    Science.gov (United States)

    McMillan, Ethan Alexander

    The development of the land speed record electric motorcycle streamliner, the Comet, is discussed herein. Its design process includes a detailed literature review of past and current motorcycle streamliners, highlighting the main components of such a vehicle's design while providing baseline data for performance comparisons. A new approach to balancing a streamliner at low speeds is also addressed, a system henceforth referred to as landing gear, which has proven an effective means of allowing the driver to control the low-speed instabilities of the vehicle with relative ease compared to traditional designs. This is accompanied by a dynamic stability analysis conducted on a test chassis that was developed for the primary purpose of understanding the handling dynamics of streamliners, while also providing a test bed for the implementation of the landing gear system and a means to familiarize the driver with the operation and handling of such a vehicle. Data gathered through the use of GPS-based velocity tracking, accelerometers, and a linear potentiometer provided a means to validate a dynamic stability analysis of the weave and wobble modes of the vehicle through linearization of a streamliner model developed in the BikeSIM software suite. Results indicate agreement between the experimental data and the simulation, showing that the conventional recumbent design of a streamliner chassis is in fact highly stable throughout the performance envelope beyond extremely low speeds. A computational fluid dynamics study was also performed and utilized in the development of the body of the Comet, in which a series of tests was conducted to develop a shape that was both practical to transport and highly efficient. By creating a hybrid airfoil from a NACA 0018 and a NACA 66-018, a drag coefficient of 0.1 and a frontal area of 0.44 m^2 were found for the final design. Utilizing a performance model based on the proposed vehicle's motor, its rolling resistance, and
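
    The quoted drag figures are enough for a back-of-envelope check of the performance model's aerodynamic term. The sketch below computes drag power from Cd = 0.1 and A = 0.44 m^2 as given above; the air density and chosen speeds are assumptions, and rolling resistance and drivetrain losses are deliberately omitted.

    ```python
    # Aerodynamic drag power P = 0.5 * rho * Cd * A * v^3, using the Cd and
    # frontal area quoted in the abstract. Speeds/density are assumptions.
    RHO = 1.225           # air density, kg/m^3 (sea level, assumed)
    CD, AREA = 0.1, 0.44  # from the CFD study described above

    def drag_power_kw(v_mps):
        """Power (kW) needed to overcome aerodynamic drag at speed v (m/s)."""
        return 0.5 * RHO * CD * AREA * v_mps ** 3 / 1000.0

    for kph in (200, 300, 400):
        v = kph / 3.6
        print(f"{kph} km/h: {drag_power_kw(v):.1f} kW of aero drag")
    ```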

  18. Automated genotyping of dinucleotide repeat markers

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Hoffman, E.P. [Carnegie Mellon Univ., Pittsburgh, PA (United States)]|[Univ. of Pittsburgh, PA (United States)

    1994-09-01

    The dinucleotide repeats (i.e., microsatellites) such as CA-repeats are a highly polymorphic, highly abundant class of PCR-amplifiable markers that have greatly streamlined genetic mapping experimentation. It is expected that over 30,000 such markers (including tri- and tetranucleotide repeats) will be characterized for routine use in the next few years. Since only size determination, and not sequencing, is required to determine alleles, in principle, dinucleotide repeat genotyping is easily performed on electrophoretic gels, and can be automated using DNA sequencers. Unfortunately, PCR stuttering with these markers generates not one band for each allele, but a pattern of bands. Since closely spaced alleles must be disambiguated by human scoring, this poses a key obstacle to full automation. We have developed methods that overcome this obstacle. Our model is that the observed data is generated by arithmetic superposition (i.e., convolution) of multiple allele patterns. By quantitatively measuring the size of each component band, and exploiting the unique stutter pattern associated with each marker, closely spaced alleles can be deconvolved; this unambiguously reconstructs the "true" allele bands, with stutter artifact removed. We used this approach in a system for automated diagnosis of (X-linked) Duchenne muscular dystrophy; four multiplexed CA-repeats within the dystrophin gene were assayed on a DNA sequencer. Our method accurately detected small variations in gel migration that shifted the allele size estimate. In 167 nonmutated alleles, 89% (149/167) showed no size variation, 9% (15/167) showed 1 bp variation, and 2% (3/167) showed 2 bp variation. We are currently developing a library of dinucleotide repeat patterns; together with our deconvolution methods, this library will enable fully automated genotyping of dinucleotide repeats from sizing data.
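
    The convolution model in this abstract lends itself to a compact numerical illustration: if the observed band intensities are the allele indicator vector convolved with a marker-specific stutter pattern, polynomial deconvolution recovers the alleles. The stutter decay values below are invented; a real system would use the calibrated per-marker library the authors describe.

    ```python
    # Stutter-removal sketch: observed bands = alleles (*) stutter pattern,
    # so deconvolving by the known pattern recovers the true allele bands.
    # The stutter pattern here is invented for illustration.
    import numpy as np
    from scipy.signal import deconvolve

    stutter = np.array([1.0, 0.5, 0.25, 0.1])   # assumed per-marker decay
    alleles = np.zeros(20)
    alleles[[8, 10]] = 1.0                       # two closely spaced alleles

    observed = np.convolve(alleles, stutter)     # superposed band pattern
    recovered, _ = deconvolve(observed, stutter) # stutter artifact removed
    print(np.flatnonzero(recovered > 0.5))       # -> [ 8 10 ]
    ```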

  19. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S [University of Nebraska Medical Center, Omaha, NE (United States)

    2016-06-15

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using the data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data and to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations: an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm^2. Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes with the manual approach. The automation avoided the redundant Linac status checks between fields required in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data log, and the discrepancy between the automated and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save 40% of the time compared with the conventional manual approach. This work lays the groundwork for further automation of Linac commissioning.
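
    The analysis step, turning a continuously logged electrometer trace into per-field output factors, can be mimicked in a few lines. The sketch below (in Python rather than the authors' Matlab) gates the trace on an idle threshold, integrates each beam's readings, and normalises to a reference field; the threshold, trace layout, and values are assumptions, not the authors' implementation.

    ```python
    # Split a logged reading trace (one sample per 0.5 s) into beams at idle
    # gaps, integrate each beam, and normalise to the reference field.
    import numpy as np

    def output_factors(readings, idle_thresh=0.01, ref_index=0):
        beam_on = readings > idle_thresh                  # crude beam gating
        edges = np.flatnonzero(np.diff(beam_on.astype(int)))
        starts, stops = edges[::2] + 1, edges[1::2] + 1
        charge = np.array([readings[a:b].sum() for a, b in zip(starts, stops)])
        return charge / charge[ref_index]                 # output factors

    # Synthetic trace: a reference field, an idle gap, then a smaller field.
    log = np.concatenate([np.zeros(20), np.full(40, 1.00),
                          np.zeros(20), np.full(40, 0.97),
                          np.zeros(20)])
    print(output_factors(log))   # -> [1.   0.97]
    ```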

  20. SU-G-BRB-04: Automated Output Factor Measurements Using Continuous Data Logging for Linac Commissioning

    International Nuclear Information System (INIS)

    Zhu, X; Li, S; Zheng, D; Wang, S; Lei, Y; Zhang, M; Ma, R; Fan, Q; Wang, X; Li, X; Verma, V; Enke, C; Zhou, S

    2016-01-01

    Purpose: Linac commissioning is a time-consuming and labor-intensive process, the streamlining of which is highly desirable. In particular, manual measurement of output factors for a variety of field sizes and energies greatly hinders commissioning efficiency. In this study, automated measurement of output factors was demonstrated as ‘one-click’ using the data logging of an electrometer. Methods: Beams to be measured were created in the recording and verifying (R&V) system and configured for continuous delivery. An electrometer with an automatic data logging feature enabled continuous data collection for all fields without human intervention. The electrometer saved data into a spreadsheet every 0.5 seconds. A Matlab program was developed to analyze the spreadsheet data and to monitor and check the data quality. Results: For each photon energy, output factors were measured for five configurations: an open field and four wedges. Each configuration includes 72 field sizes, ranging from 4×4 to 20×30 cm^2. Using automation, it took 50 minutes to complete the measurement of 72 field sizes, in contrast to 80 minutes with the manual approach. The automation avoided the redundant Linac status checks between fields required in the manual approach. In fact, the only limiting factor in such automation is Linac overheating. The data collection beams in the R&V system are reusable, and the simplified process is less error-prone. In addition, our Matlab program extracted the output factors faithfully from the data log, and the discrepancy between the automated and manual measurements is within ±0.3%. For two separate automated measurements 30 days apart, a consistency check shows a discrepancy within ±1% for 6MV photons with a 60 degree wedge. Conclusion: Automated output factor measurements can save 40% of the time compared with the conventional manual approach. This work lays the groundwork for further automation of Linac commissioning.

  1. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems, most of which are proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach, inspired by typical home automation components like switches, timers, etc. In addition, a model-based technology to achieve rich functionality and usability was implemented. (orig.)

  2. Scientific Evaluation and Review of Claims in Health Care (SEaRCH): A Streamlined, Systematic, Phased Approach for Determining “What Works” in Healthcare

    Science.gov (United States)

    Crawford, Cindy; Hilton, Lara; Elfenbaum, Pamela

    2017-01-01

    Background: Answering the question of “what works” in healthcare can be complex and requires the careful design and sequential application of systematic methodologies. Over the last decade, the Samueli Institute has, along with multiple partners, developed a streamlined, systematic, phased approach to this process called the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). The SEaRCH process provides an approach for rigorously, efficiently, and transparently making evidence-based decisions about healthcare claims in research and practice with minimal bias. Methods: SEaRCH uses three methods combined in a coordinated fashion to help determine what works in healthcare. The first, the Claims Assessment Profile (CAP), seeks to clarify the healthcare claim and question, and its ability to be evaluated in the context of its delivery. The second method, the Rapid Evidence Assessment of the Literature (REAL©), is a streamlined, systematic review process conducted to determine the quantity, quality, and strength of evidence and risk/benefit for the treatment. The third method involves the structured use of expert panels (EPs). There are several types of EPs, depending on the purpose and need. Together, these three methods (CAP, REAL, and EP) can be integrated into a strategic approach to help answer the question “what works in healthcare?” and what it means in a comprehensive way. Discussion: SEaRCH is a systematic, rigorous approach for evaluating healthcare claims of therapies, practices, programs, or products in an efficient and stepwise fashion. It provides an iterative, protocol-driven process that is customized to the intervention, consumer, and context. Multiple communities, including those involved in health service and policy, can benefit from this organized framework, assuring that evidence-based principles determine which healthcare practices with the greatest promise are used for improving the public's health and

  3. Streamline segment statistics of premixed flames with nonunity Lewis numbers

    Science.gov (United States)

    Chakraborty, Nilanjan; Wang, Lipo; Klein, Markus

    2014-03-01

    The interaction of flame and surrounding fluid motion is of central importance in the fundamental understanding of turbulent combustion. It is demonstrated here that this interaction can be represented using streamline segment analysis, which was previously applied in nonreactive turbulence. The present work focuses on the effects of the global Lewis number (Le) on streamline segment statistics in premixed flames in the thin-reaction-zones regime. A direct numerical simulation database of freely propagating thin-reaction-zones regime flames with Le ranging from 0.34 to 1.2 is used to demonstrate that Le has significant influences on the characteristic features of the streamline segment, such as the curve length, the difference in the velocity magnitude at two extremal points, and their correlations with the local flame curvature. The strengthening of the dilatation rate, flame-normal acceleration, and flame-generated turbulence with decreasing Le is principally responsible for these observed effects. An expression for the probability density function (pdf) of the streamline segment length, originally developed for nonreacting turbulent flows, captures the qualitative behavior for turbulent premixed flames in the thin-reaction-zones regime for a wide range of Le values. The joint pdfs between the streamline length and the difference in the velocity magnitude at two extremal points for both unweighted and density-weighted velocity vectors are analyzed and compared. Detailed explanations are provided for the observed differences in the topological behaviors of the streamline segment in response to the global Le.
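
    The segmentation underlying this kind of analysis can be sketched in a few lines: a velocity-magnitude curve sampled along a streamline is partitioned at its local extrema, and each segment contributes a length l and a velocity difference Δu, the two quantities whose statistics the study examines. The synthetic signal and the uniform arclength sampling below are illustrative assumptions.

        import numpy as np

        def streamline_segments(s, u):
            """Partition u(s), the velocity magnitude sampled along a
            streamline, into segments bounded by local extrema; return the
            segment lengths l and end-point velocity differences delta_u."""
            du = np.diff(u)
            # Interior local extrema: sign changes of the discrete derivative
            ext = np.flatnonzero(np.sign(du[:-1]) != np.sign(du[1:])) + 1
            bounds = np.concatenate(([0], ext, [len(u) - 1]))
            lengths = s[bounds[1:]] - s[bounds[:-1]]
            delta_u = u[bounds[1:]] - u[bounds[:-1]]  # >0 marks positive segments
            return lengths, delta_u

        # Usage: a synthetic fluctuating velocity magnitude along the arclength
        s = np.linspace(0.0, 10.0, 2001)
        u = 1.0 + 0.3 * np.sin(3 * s) + 0.1 * np.sin(11 * s + 1.0)
        lengths, delta_u = streamline_segments(s, u)
        print(lengths.mean(), (delta_u > 0).mean())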

  4. Streamlining cardiovascular clinical trials to improve efficiency and generalisability.

    Science.gov (United States)

    Zannad, Faiez; Pfeffer, Marc A; Bhatt, Deepak L; Bonds, Denise E; Borer, Jeffrey S; Calvo-Rojas, Gonzalo; Fiore, Louis; Lund, Lars H; Madigan, David; Maggioni, Aldo Pietro; Meyers, Catherine M; Rosenberg, Yves; Simon, Tabassome; Stough, Wendy Gattis; Zalewski, Andrew; Zariffa, Nevine; Temple, Robert

    2017-08-01

    Controlled trials provide the most valid determination of the efficacy and safety of an intervention, but large cardiovascular clinical trials have become extremely costly and complex, making it difficult to study many important clinical questions. A critical question, and the main objective of this review, is how trials might be simplified while maintaining randomisation to preserve scientific integrity and unbiased efficacy assessments. Experience with alternative approaches is accumulating, specifically with registry-based randomised controlled trials that make use of data already collected. This approach addresses bias concerns while still capitalising on the benefits and efficiencies of a registry. Several completed or ongoing trials illustrate the feasibility of using registry-based controlled trials to answer important questions relevant to daily clinical practice. Randomised trials within healthcare organisation databases may also represent streamlined solutions for some types of investigations, although data quality (endpoint assessment) is likely to be a greater concern in those settings. These approaches are not without challenges, and issues pertaining to informed consent, blinding, data quality and regulatory standards remain to be fully explored. Collaboration among stakeholders is necessary to achieve standards for data management and analysis, to validate large data sources for use in randomised trials, and to re-evaluate ethical standards to encourage research while also ensuring that patients are protected. The rapidly evolving efforts to streamline cardiovascular clinical trials have the potential to lead to major advances in promoting better care and outcomes for patients with cardiovascular disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  5. A streamlined ribosome profiling protocol for the characterization of microorganisms

    DEFF Research Database (Denmark)

    Latif, Haythem; Szubin, Richard; Tan, Justin

    2015-01-01

    Ribosome profiling is a powerful tool for characterizing in vivo protein translation at the genome scale, with multiple applications ranging from detailed molecular mechanisms to systems-level predictive modeling. Though highly effective, this intricate technique has yet to become widely used in the microbial research community. Here we present a streamlined ribosome profiling protocol with reduced barriers to entry for microbial characterization studies. Our approach provides simplified alternatives during harvest, lysis, and recovery of monosomes and also eliminates several time-consuming steps...

  6. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  7. A New Automated Instrument Calibration Facility at the Savannah River Site

    International Nuclear Information System (INIS)

    Polz, E.; Rushton, R.O.; Wilkie, W.H.; Hancock, R.C.

    1998-01-01

    The Health Physics Instrument Calibration Facility at the Savannah River Site in Aiken, SC was expressly designed and built to calibrate portable radiation survey instruments. The facility incorporates recent advances in automation technology, building layout and construction, and computer software to improve the calibration process. Nine new calibration systems automate instrument calibration and data collection. The building is laid out so that instruments are moved from one area to another in a logical, efficient manner. New software and hardware integrate all functions such as shipping/receiving, work flow, calibration, testing, and report generation. Benefits include a streamlined and integrated program, improved efficiency, reduced errors, and better accuracy

  8. Automated attribution of remotely-sensed ecological disturbances using spatial and temporal characteristics of common disturbance classes.

    Science.gov (United States)

    Cooper, L. A.; Ballantyne, A.

    2017-12-01

    Forest disturbances are critical components of ecosystems. Knowledge of their prevalence and impacts is necessary to accurately describe forest health and ecosystem services through time. While there are currently several methods available to identify and describe forest disturbances, especially those which occur in North America, the process remains inefficient and inaccessible in many parts of the world. Here, we introduce a preliminary approach to streamline and automate both the detection and attribution of forest disturbances. We use a combination of the Breaks for Additive Season and Trend (BFAST) detection algorithm to detect disturbances in combination with supervised and unsupervised classification algorithms to attribute the detections to disturbance classes. Both spatial and temporal disturbance characteristics are derived and utilized for the goal of automating the disturbance attribution process. The resulting preliminary algorithm is applied to up-scaled (100m) Landsat data for several different ecosystems in North America, with varying success. Our results indicate that supervised classification is more reliable than unsupervised classification, but that limited training data are required for a region. Future work will improve the algorithm through refining and validating at sites within North America before applying this approach globally.
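
    The attribution step described above can be sketched as a supervised classification of spatial and temporal characteristics of detected disturbances. The feature set, class centers, and random-forest choice below are illustrative assumptions, not the authors' exact design (the detection itself, done with BFAST in the study, is omitted here).

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(42)

        # Hypothetical per-patch features for detected disturbances:
        # [area_ha, break_magnitude, recovery_slope, duration_yr]
        def samples(center, n=50):
            center = np.asarray(center)
            return center + rng.normal(0, 0.1, (n, 4)) * np.abs(center)

        X = np.vstack([samples([120.0, -0.45, 0.002, 1.0]),   # fire-like
                       samples([  2.5, -0.30, 0.010, 1.0]),   # harvest-like
                       samples([ 40.0, -0.10, 0.001, 4.0])])  # insect-like
        y = ["fire"] * 50 + ["harvest"] * 50 + ["insects"] * 50

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

        # Attribute a newly detected break from its spatial/temporal signature
        print(clf.predict([[100.0, -0.5, 0.003, 1.2]]))       # -> ['fire']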

  9. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depend on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting the dune toe, dune crest, and dune heel is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and features are identified from the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel be spatially continuous, which is important because dune morphology naturally varies alongshore.
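
    A sketch of the core relative relief computation on a cross-shore elevation profile: RR is computed per sample within a moving window and averaged across several window sizes. The window sizes and the synthetic profile are illustrative assumptions.

        import numpy as np

        def relative_relief(z, window):
            """Relative relief within a moving window:
            RR = (z - z_min) / (z_max - z_min)."""
            half = window // 2
            rr = np.full(len(z), np.nan)
            for i in range(half, len(z) - half):
                w = z[i - half:i + half + 1]
                lo, hi = w.min(), w.max()
                rr[i] = (z[i] - lo) / (hi - lo) if hi > lo else 0.5
            return rr

        def mean_relative_relief(z, windows=(5, 11, 21)):
            """Average RR across multiple computational scales."""
            return np.nanmean([relative_relief(z, w) for w in windows], axis=0)

        # Usage: a synthetic cross-shore profile with one dune on a gentle ramp
        x = np.linspace(0, 200, 401)
        z = 1.5 * np.exp(-((x - 80) / 15.0) ** 2) + 0.01 * x
        rr = mean_relative_relief(z)
        print("crest near x =", x[np.nanargmax(rr)])   # high RR marks the crest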

  10. Towards Automated Binding Affinity Prediction Using an Iterative Linear Interaction Energy Approach

    Directory of Open Access Journals (Sweden)

    C. Ruben Vosmeer

    2014-01-01

    Full Text Available Binding affinity prediction of potential drugs to target and off-target proteins is an essential asset in drug development. These predictions require the calculation of binding free energies. In such calculations, it is a major challenge to properly account for both the dynamic nature of the protein and the possible variety of ligand-binding orientations, while keeping computational costs tractable. Recently, an iterative Linear Interaction Energy (LIE) approach was introduced, in which results from multiple simulations of a protein-ligand complex are combined into a single binding free energy using a Boltzmann weighting-based scheme. This method was shown to reach experimental accuracy for flexible proteins while retaining the computational efficiency of the general LIE approach. Here, we show that the iterative LIE approach can be used to predict binding affinities in an automated way. A workflow was designed using preselected protein conformations, automated ligand docking and clustering, and a (semi-)automated molecular dynamics simulation setup. We show that using this workflow, binding affinities of aryloxypropanolamines to the malleable Cytochrome P450 2D6 enzyme can be predicted without a priori knowledge of dominant protein-ligand conformations. In addition, we provide an outlook for an approach to assess the quality of the LIE predictions, based on simulation outcomes only.
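
    One common Boltzmann-weighting scheme for combining per-simulation binding free energies into a single value is sketched below; the exact weighting used in the iterative LIE method may differ in detail, and the kT value and input energies are illustrative assumptions.

        import numpy as np
        from scipy.special import logsumexp

        def combine_free_energies(dG, kT=2.49):
            """Boltzmann-weighted combination of binding free energies (kJ/mol)
            from simulations started from different ligand-binding poses:
                dG_comb = -kT * ln( (1/N) * sum_i exp(-dG_i / kT) )
            kT defaults to roughly 298 K expressed in kJ/mol."""
            dG = np.asarray(dG, dtype=float)
            return -kT * (logsumexp(-dG / kT) - np.log(len(dG)))

        # The lowest (most favourable) energy dominates the combined estimate
        print(combine_free_energies([-30.1, -28.4, -33.7]))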

  11. Automate Your Physical Plant Using the Building Block Approach.

    Science.gov (United States)

    Michaelson, Matt

    1998-01-01

    Illustrates how Mount Saint Vincent University (Halifax), by upgrading the control and monitoring of one building or section of the school at a time, could produce savings in energy and operating costs and improve the environment. Explains a gradual, "building block" approach to facility automation that provides flexibility without a…

  12. Novel diffusion tensor imaging technique reveals developmental streamline volume changes in the corticospinal tract associated with leg motor control.

    Science.gov (United States)

    Kamson, David O; Juhász, Csaba; Chugani, Harry T; Jeong, Jeong-Won

    2015-04-01

    Diffusion tensor imaging (DTI) has expanded our knowledge of corticospinal tract (CST) anatomy and development. However, previous developmental DTI studies assessed the CST as a whole, overlooking potential differences in development of its components related to control of the upper and lower extremities. The present cross-sectional study investigated age-related changes, side and gender differences in streamline volume of the leg- and hand-related segments of the CST in children. DTI data of 31 children (1-14 years; mean age: 6±4 years; 17 girls) with normal conventional MRI were analyzed. Leg- and hand-related CST streamline volumes were quantified separately, using a recently validated novel tractography approach. CST streamline volumes on both sides were compared between genders and correlated with age. Higher absolute streamline volumes were found in the left leg-related CST compared to the right (p=0.001) without a gender effect (p=0.4), whereas no differences were found in the absolute hand-related CST volumes (p>0.4). CST leg-related streamline volumes, normalized to hemispheric white matter volumes, declined with age in the right hemisphere only (R=-.51; p=0.004). Absolute leg-related CST streamline volumes showed similar, but slightly weaker correlations. Hand-related absolute or normalized CST streamline volumes showed no age-related variations on either side. These results suggest differential development of CST segments controlling hand vs. leg movements. Asymmetric volume changes in the lower limb motor pathway may be secondary to gradually strengthening left-hemispheric dominance and are consistent with previous data suggesting that footedness is a better predictor of hemispheric lateralization than handedness. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  13. Streamlining of the Decontamination and Demolition Document Preparation Process

    International Nuclear Information System (INIS)

    Durand, Nick; Meincke, Carol; Peek, Georgianne

    1999-01-01

    During the past five years, the Sandia National Laboratories Decontamination, Decommissioning, Demolition, and Reuse (D3R) Program has evolved and become more focused and efficient. Historical approaches to project documentation, requirements, and drivers are discussed, detailing key assumptions, oversight authority, and project approvals. Discussion of efforts to streamline the D3R project planning and preparation process includes the incorporation of the principles of graded approach, Total Quality Management, and the Observational Method (CH2M HILL, April 1989). Process improvements were realized by clearly defining regulatory requirements for each phase of a project, establishing general guidance for the program, and combining project-specific documents to eliminate redundant and unneeded information. Process improvements to cost, schedule, and quality are discussed in detail for several projects.

  14. Report: Follow-Up Report: EPA Proposes to Streamline the Review, Management and Disposal of Hazardous Waste Pharmaceuticals

    Science.gov (United States)

    Report #15-P-0260, August 19, 2015. EPA states that it intends to issue a proposed rule, Management Standards for Hazardous Waste, which will attempt to streamline the approach to managing and disposing of hazardous and nonhazardous pharmaceutical waste.

  15. Photogrammetric approach to automated checking of DTMs

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2005-01-01

    Geometrically accurate digital terrain models (DTMs) are essential for orthoimage production and many other applications. Collecting reference data or visual inspection are reliable but time-consuming and therefore expensive methods for finding errors in DTMs. In this paper, a photogrammetric approach to automated checking and improving of DTMs is evaluated. Corresponding points in two overlapping orthoimages are found by means of area-based matching. Provided the image orientation is correct, discovered displacements correspond to DTM errors. Improvements of the method regarding its...
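
    The area-based matching at the heart of this approach can be sketched as a normalized cross-correlation search: a template from one orthoimage is matched against a search window in the overlapping orthoimage, and a nonzero best-match displacement points to a DTM error. The patch and search sizes below are illustrative assumptions.

        import numpy as np

        def ncc_displacement(img_a, img_b, center, tpl=15, search=10):
            """Shift (dr, dc) of the patch of img_a around `center` that best
            matches img_b, by normalized cross-correlation."""
            r, c, h = center[0], center[1], tpl // 2
            t = img_a[r - h:r + h + 1, c - h:c + h + 1].astype(float)
            t = (t - t.mean()) / (t.std() + 1e-12)
            best, shift = -np.inf, (0, 0)
            for dr in range(-search, search + 1):
                for dc in range(-search, search + 1):
                    w = img_b[r + dr - h:r + dr + h + 1,
                              c + dc - h:c + dc + h + 1].astype(float)
                    w = (w - w.mean()) / (w.std() + 1e-12)
                    score = float((t * w).mean())
                    if score > best:
                        best, shift = score, (dr, dc)
            return shift, best   # displacement in pixels and its NCC score

        # Usage: img_b is img_a shifted by (2, -1); the shift is recovered
        rng = np.random.default_rng(9)
        img_a = rng.normal(size=(80, 80))
        img_b = np.roll(img_a, (2, -1), axis=(0, 1))
        print(ncc_displacement(img_a, img_b, (40, 40)))   # -> ((2, -1), ~1.0)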

  16. A new method for calculating volumetric sweep efficiency using streamline simulation concepts

    International Nuclear Information System (INIS)

    Hidrobo, E A

    2000-01-01

    One of the purposes of reservoir engineering is to quantify the volumetric sweep efficiency for optimizing reservoir management decisions. The estimation of this parameter has always been a difficult task. Until now, sweep efficiency correlations and calculations have been limited to mostly homogeneous 2-D cases. Calculating volumetric sweep efficiency in a 3-D heterogeneous reservoir becomes difficult due to the inherent complexity of multiple layers and arbitrary well configurations. In this paper, a new method for computing volumetric sweep efficiency for any arbitrary heterogeneity and well configuration is presented. The proposed method is based on Datta-Gupta and King's (1995) formulation of streamline time-of-flight. Given that the time-of-flight reflects the fluid front propagation at various times, the connectivity in the time-of-flight represents a direct measure of the volumetric sweep efficiency. The proposed approach has been applied to synthetic as well as field examples. Synthetic examples are used to validate the volumetric sweep efficiency calculations using the streamline time-of-flight connectivity criterion by comparison with analytic solutions and published correlations. The field example, which illustrates the feasibility of the approach for large-scale field applications, is from the North Robertson Unit, a low-permeability carbonate reservoir in west Texas.
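
    The connectivity criterion reduces to a one-line computation once a streamline simulation has produced a time-of-flight (TOF) per cell: the volumetric sweep efficiency at time t is the fraction of pore volume whose TOF does not exceed t. The toy TOF distribution below is an illustrative assumption.

        import numpy as np

        def sweep_efficiency(tof, pore_volume, t):
            """Volumetric sweep efficiency at time t: fraction of pore volume
            already reached by the injected front (cells with tof <= t)."""
            return pore_volume[tof <= t].sum() / pore_volume.sum()

        # Toy 1000-cell model: lognormal TOF mimics heterogeneity,
        # uniform pore volume per cell
        rng = np.random.default_rng(1)
        tof = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # e.g. in days
        pv = np.ones(1000)
        for t in (0.5, 1.0, 2.0, 5.0):
            print(t, round(sweep_efficiency(tof, pv, t), 3))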

  17. Dividing Streamline Formation Channel Confluences by Physical Modeling

    Directory of Open Access Journals (Sweden)

    Minarni Nur Trilita

    2010-02-01

    Full Text Available Channel confluences are often found in open channel network systems and are among their most important elements. Flow entering the main channel from a branch channel produces various flow patterns, including vortex flow. This phenomenon can cause erosion of the channel side walls, scour of the channel bed, and sedimentation downstream of the confluence. Controlling these problems requires research into the width of the flow entering from the branch channel. The flow entering the main channel from the branch channel is bounded by a dividing streamline. In this paper, the width of the dividing streamline was observed in the laboratory using a physical model of two open channels of square cross-section joined at an angle of 30°. Observations were made for a variety of incoming flows in each channel. The laboratory observations show that the width of the dividing streamline is influenced by the discharge ratio between the branch channel and the main channel, while comparison with previous studies shows that the widths observed in the laboratory are smaller than those reported in earlier research.

  18. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  19. An automated approach for annual layer counting in ice cores

    Science.gov (United States)

    Winstrup, M.; Svensson, A.; Rasmussen, S. O.; Winther, O.; Steig, E.; Axelrod, A.

    2012-04-01

    The temporal resolution of some ice cores is sufficient to preserve seasonal information in the ice core record. In such cases, annual layer counting represents one of the most accurate methods to produce a chronology for the core. Yet, manual layer counting is a tedious and sometimes ambiguous job. As reliable layer recognition becomes more difficult, a manual approach increasingly relies on human interpretation of the available data. Thus, much may be gained by an automated and therefore objective approach for annual layer identification in ice cores. We have developed a novel method for automated annual layer counting in ice cores, which relies on Bayesian statistics. It uses algorithms from the statistical framework of Hidden Markov Models (HMM), originally developed for use in machine speech recognition. The strength of this layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on purely objective criteria for annual layer identification. With this methodology, it is possible to determine the most likely position of multiple layer boundaries in an entire section of ice core data at once. It provides a probabilistic uncertainty estimate of the resulting layer count, hence ensuring a proper treatment of ambiguous layer boundaries in the data. Furthermore, multiple data series can be incorporated and used at once, allowing for a full multi-parameter annual layer counting method similar to a manual approach. In this study, the automated layer counting algorithm has been applied to data from the NGRIP ice core, Greenland. The NGRIP ice core has very high temporal resolution with depth, and hence the potential to be dated by annual layer counting far back in time. In previous studies [Andersen et al., 2006; Svensson et al., 2008], manual layer counting has been carried out back to 60 kyr BP. A comparison between the counted annual layers based on the two approaches will be presented.
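
    A minimal sketch of the Viterbi decoding idea behind such an HMM-based layer counter: two hidden states (within-layer and layer-boundary) are decoded from a seasonal signal, and layers are counted as entries into the boundary state. The published algorithm is considerably richer (explicit layer-thickness modelling, multiple data series, probabilistic counts); the transition matrix, emissions, and synthetic data here are illustrative assumptions.

        import numpy as np

        def viterbi(obs, log_A, log_pi, log_emit):
            """Most likely hidden-state path of a discrete-state HMM."""
            n, T = log_A.shape[0], len(obs)
            dp = np.full((T, n), -np.inf)
            back = np.zeros((T, n), dtype=int)
            dp[0] = log_pi + np.array([log_emit(s, obs[0]) for s in range(n)])
            for t in range(1, T):
                for s in range(n):
                    cand = dp[t - 1] + log_A[:, s]
                    back[t, s] = int(np.argmax(cand))
                    dp[t, s] = cand[back[t, s]] + log_emit(s, obs[t])
            path = np.zeros(T, dtype=int)
            path[-1] = int(np.argmax(dp[-1]))
            for t in range(T - 2, -1, -1):
                path[t] = back[t + 1, path[t + 1]]
            return path

        # States: 0 = within a layer, 1 = annual layer boundary
        log_A = np.log([[0.95, 0.05], [0.50, 0.50]])
        log_pi = np.log([0.9, 0.1])

        def log_emit(state, x):   # Gaussian emissions; boundaries sit near 3 sigma
            mu = 0.0 if state == 0 else 3.0
            return -0.5 * (x - mu) ** 2

        rng = np.random.default_rng(3)
        signal = rng.normal(0.0, 1.0, 200)
        signal[::20] += 3.0                        # one boundary every 20 samples
        path = viterbi(signal, log_A, log_pi, log_emit)
        print(int(np.sum(np.diff(path) == 1)), "layers counted")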

  20. Exploration on Automated Software Requirement Document Readability Approaches

    OpenAIRE

    Chen, Mingda; He, Yao

    2017-01-01

    Context. The requirements analysis phase, as the very beginning of software development process, has been identified as a quite important phase in the software development lifecycle. Software Requirement Specification (SRS) is the output of requirements analysis phase, whose quality factors play an important role in the evaluation work. Readability is a quite important SRS quality factor, but there are few available automated approaches for readability measurement, because of the tight depend...

  1. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing its first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
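
    The plausibility checks described above amount to simple binary tests per measurement stream; the sketch below shows range, step, and (stuck-sensor) variance tests. In the observatory approach the test parameter thresholds are derived statistically from the data themselves, whereas the thresholds here are hard-coded illustrative assumptions.

        import numpy as np

        def range_test(x, lo, hi):
            """Flag samples outside plausible physical bounds."""
            return (x < lo) | (x > hi)

        def step_test(x, max_step):
            """Flag implausibly large jumps between consecutive samples."""
            flags = np.zeros(len(x), dtype=bool)
            flags[1:] = np.abs(np.diff(x)) > max_step
            return flags

        def variance_test(x, window, min_var):
            """Flag windows with implausibly low variance (stuck sensor)."""
            flags = np.zeros(len(x), dtype=bool)
            for i in range(len(x) - window + 1):
                if np.var(x[i:i + window]) < min_var:
                    flags[i:i + window] = True
            return flags

        # Usage: a temperature trace with one spike and a stuck-sensor episode
        rng = np.random.default_rng(7)
        t = np.concatenate([20 + rng.normal(0, 0.3, 50),
                            [45.0],                   # spike
                            np.full(20, 21.0),        # stuck value
                            20 + rng.normal(0, 0.3, 30)])
        bad = range_test(t, -40, 40) | step_test(t, 5.0) | variance_test(t, 10, 1e-6)
        print(int(bad.sum()), "of", len(t), "samples flagged")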

  2. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States.

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-02-01

    Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. A set of command line-based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion.

  3. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    Science.gov (United States)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  4. Stream-lined Gating Systems with Improved Yield - Dimensioning and Experimental Validation

    DEFF Research Database (Denmark)

    Tiedje, Niels Skat; Skov-Hansen, Søren Peter

    The paper describes how a stream-lined gating system, in which the melt is confined and controlled during filling, can be designed. Commercial numerical modelling software has been used to compare the stream-lined design with a traditional gating system. These results are confirmed by experiments in which the two types of lay-outs are cast in production. It is shown that flow in the stream-lined lay-out is well controlled and that the quality of the castings is at least equal to that of castings produced with a traditional lay-out. Further, the yield is improved by 4 % relative to a traditional lay-out.

  5. Joint statistics and conditional mean strain rates of streamline segments

    International Nuclear Information System (INIS)

    Schaefer, P; Gampert, M; Peters, N

    2013-01-01

    Based on four different direct numerical simulations of turbulent flows with Taylor-based Reynolds numbers ranging from Re_λ = 50 to 300, among which are two homogeneous isotropic decaying flows, one forced flow and one homogeneous shear flow, streamlines are identified and the obtained space curves are parameterized with the pseudo-time as well as the arclength. Based on local extrema of the absolute value of the velocity along the streamlines, the latter are partitioned into segments following Wang (2010 J. Fluid Mech. 648 183-203). Streamline segments are then statistically analyzed based on both parameterizations using the joint probability density function of the pseudo-time lag τ (arclength l, respectively) between the extrema and the velocity difference Δu at the extrema: P(τ, Δu) (P(l, Δu), respectively). We distinguish positive and negative streamline segments depending on the sign of the velocity difference Δu. Differences as well as similarities in the statistical description for both parameterizations are discussed. In particular, it turns out that the normalized probability distribution functions (pdfs), for both parameterizations, of the length of positive, negative and all segments assume a universal shape for all Reynolds numbers and flow types and are well described by a model derived in Schaefer P et al (2012 Phys. Fluids 24 045104). Particular attention is given to the conditional mean velocity difference at the ending points of the segments, which can be understood as a first-order structure function in the context of streamline segment analysis. It determines to a large extent the stretching (compression) of positive (negative) streamline segments and corresponds to the convective velocity in phase space in the transport model equation for the pdf. While based on the random sweeping hypothesis a scaling ∝ (u_rms ε τ)^(1/3) is found for the parameterization based on the pseudo-time, the parameterization with the arclength l yields a much larger than expected l^(1/3) scaling.

  6. Application-Tailored I/O with Streamline

    NARCIS (Netherlands)

    de Bruijn, W.J.; Bos, H.J.; Bal, H.E.

    2011-01-01

    Streamline is a stream-based OS communication subsystem that spans from peripheral hardware to userspace processes. It improves performance of I/O-bound applications (such as webservers and streaming media applications) by constructing tailor-made I/O paths through the operating system for each application.

  7. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model-driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three important properties of our approach: platform independence, code integratability, and code readability. The evaluation shows that our approach can generate code for a wide range of platforms which is integratable and readable.

  8. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  9. Accelerated Logistics: Streamlining the Army's Supply Chain

    National Research Council Canada - National Science Library

    Wang, Mark

    2000-01-01

    ...) initiative, the Army has dramatically streamlined its supply chain, cutting order and ship times for repair parts by nearly two-thirds nationwide and over 75 percent at several of the major Forces Command (FORSCOM) installations...

  10. An automated approach to mapping corn from Landsat imagery

    Science.gov (United States)

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.

  11. Streamline-based microfluidic device

    Science.gov (United States)

    Tai, Yu-Chong (Inventor); Zheng, Siyang (Inventor); Kasdan, Harvey (Inventor)

    2013-01-01

    The present invention provides a streamline-based device and a method for using the device for continuous separation of particles including cells in biological fluids. The device includes a main microchannel and an array of side microchannels disposed on a substrate. The main microchannel has a plurality of stagnation points with a predetermined geometric design, for example, each of the stagnation points has a predetermined distance from the upstream edge of each of the side microchannels. The particles are separated and collected in the side microchannels.

  12. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Khoualdi, Kamel

    1994-01-01

    Nowadays, the supervision of industrial installations is more and more complex, involving the automation of their control. A malfunction can generate an avalanche of alarms. The operator in charge of supervision must face the incident and take the right actions to recover a normal situation; generally, he is drowned under the great number of alarms. Our aim is to develop an alarm filtering system for an automated metro line that helps the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system, but using two different approaches. The first part was developed in the frame of the SARA project (an operator assistance system for an automated metro line), an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach showed its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment was considered as an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author) [fr]

  13. Streamlining: Reducing costs and increasing STS operations effectiveness

    Science.gov (United States)

    Petersburg, R. K.

    1985-01-01

    The development of streamlining as a concept, its inclusion in the space transportation system engineering and operations support (STSEOS) contract, and how it serves as an incentive to management and technical support personnel are discussed. The mechanics of encouraging and processing streamlining suggestions, reviews, feedback to submitters, and recognition are described, along with how individual employee performance evaluations are used for motivation. Several items that were implemented are mentioned. The information reported and the methodology of determining estimated dollar savings are outlined. The overall effect of this activity on the ability of the McDonnell Douglas flight preparation and mission operations team to support a rapidly increasing flight rate without a proportional increase in cost is illustrated.

  14. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 484: Surface Debris, Waste Sites, and Burn Area, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    Bechtel Nevada

    2004-01-01

    This Streamlined Approach for Environmental Restoration plan details the activities necessary to close Corrective Action Unit (CAU) 484: Surface Debris, Waste Sites, and Burn Area (Tonopah Test Range). CAU 484 consists of sites located at the Tonopah Test Range, Nevada, and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. CAU 484 consists of the following six Corrective Action Sites: (1) CAS RG-52-007-TAML, Davis Gun Penetrator Test; (2) CAS TA-52-001-TANL, NEDS Detonation Area; (3) CAS TA-52-004-TAAL, Metal Particle Dispersion Test; (4) CAS TA-52-005-TAAL, Joint Test Assembly DU Sites; (5) CAS TA-52-006-TAPL, Depleted Uranium Site; and (6) CAS TA-54-001-TANL, Containment Tank and Steel Structure

  15. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States1

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-01-01

    Premise of the study: Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. Methods and Results: A set of command line–based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. Conclusions: WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion. PMID:26949580

  16. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation

    Directory of Open Access Journals (Sweden)

    Rahul Deb Das

    2016-11-01

    Full Text Available Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.
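
    The atomic-segment idea can be sketched as follows: the trace is cut into fixed atomic segments, a travel state is inferred per atom, and a segment grows until the inferred state changes, which also works on an open-ended (real-time) stream. The speed-only state inference and its thresholds are illustrative assumptions; the paper's model uses richer sensor evidence.

        import numpy as np

        def infer_state(speeds, walk_max=2.0, vehicle_min=7.0):
            """Map an atomic segment's median speed (m/s) to a travel state.
            Thresholds are illustrative only."""
            v = np.median(speeds)
            return "walk" if v < walk_max else ("bike" if v < vehicle_min else "vehicle")

        def bottom_up_segments(speeds, atom=10):
            """Grow homogeneous segments from fixed atomic segments until the
            inferred state changes (works incrementally on a live stream)."""
            segments, start = [], 0
            state = infer_state(speeds[:atom])
            for i in range(atom, len(speeds), atom):
                s = infer_state(speeds[i:i + atom])
                if s != state:
                    segments.append((start, i, state))
                    start, state = i, s
            segments.append((start, len(speeds), state))
            return segments

        # Usage: walk -> vehicle -> walk, speeds sampled at 1 Hz
        rng = np.random.default_rng(5)
        trace = np.concatenate([rng.normal(1.4, 0.3, 120),
                                rng.normal(9.0, 2.0, 300),
                                rng.normal(1.3, 0.3, 90)])
        for seg in bottom_up_segments(trace):
            print(seg)   # (start_index, end_index, state)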

  17. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    Science.gov (United States)

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

    Understanding travel behavior is critical for an effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries as well as for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., drop in speed. However, these criteria are heuristic, and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Also, segmentation approaches are not suited for real time interpretation of open-ended segments, and cannot cope with the frequent gaps in the location traces. In order to address all these challenges a novel, state based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and a progressive iteration until a new state is found. The research investigates how an atomic state-based approach can be developed in such a way that can work in real time, near-real time and offline mode and in different environmental conditions with their varying quality of sensor traces. The results show the proposed bottom-up model outperforms the existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness in information delivery pertinent to automated travel behavior interpretation.

  18. Automated Announcements of Approaching Emergency Vehicles

    Science.gov (United States)

    Bachelder, Aaron; Foster, Conrad

    2006-01-01

    Street intersections that are equipped with traffic lights would also be equipped with means for generating audible announcements of approaching emergency vehicles, according to a proposal. The means to generate the announcements would be implemented in the intersection-based subsystems of emergency traffic-light-preemption systems like those described in the two immediately preceding articles and in "Systems Would Preempt Traffic Lights for Emergency Vehicles" (NPO-30573), NASA Tech Briefs, Vol. 28, No. 10 (October 2004), page 36. Preempting traffic lights is not, by itself, sufficient to warn pedestrians at affected intersections that emergency vehicles are approaching. Automated visual displays that warn of approaching emergency vehicles can be helpful as a supplement to preemption of traffic lights, but experience teaches that for a variety of reasons, pedestrians often do not see such displays. Moreover, in noisy and crowded urban settings, the lights and sirens on emergency vehicles are often not noticed until a few seconds before the vehicles arrive. According to the proposal, the traffic-light-preemption subsystem at each intersection would generate an audible announcement (for example, "Emergency vehicle approaching, please clear intersection") whenever a preemption was triggered. The subsystem would estimate the time of arrival of an approaching emergency vehicle by use of vehicle identity, position, and time data from one or more sources that could include units connected to traffic loops and/or transponders connected to diagnostic and navigation systems in participating emergency vehicles. The intersection-based subsystem would then start the announcement far enough in advance to enable pedestrians to leave the roadway before any emergency vehicles arrive.

  19. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    Le Go, Roland

    1972-01-01

    The methods of approach developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, which is divided into three parts. Part 1 relates to the study of automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This was achieved by computing a simulation program that allowed the proper selection algorithms to be established in order to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16-, 32- or 64-grey-level scale, and is processed by a pattern recognition program isolating the chromosomes and investigating their characteristic features (arm tips, centromere areas), in order to get measurements equivalent to the lengths of the four arms. Part 3 studies a program of automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: (1) a list of the pairs, (2) a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and (3) another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images unpaired by the program, which are of special interest for the biologist. (author) [fr]

  20. Effectiveness of and obstacles to antibiotic streamlining to amoxicillin monotherapy in bacteremic pneumococcal pneumonia.

    Science.gov (United States)

    Blot, Mathieu; Pivot, Diane; Bourredjem, Abderrahmane; Salmon-Rousseau, Arnaud; de Curraize, Claire; Croisier, Delphine; Chavanet, Pascal; Binquet, Christine; Piroth, Lionel

    2017-09-01

    Antibiotic streamlining is pivotal to reduce the emergence of resistant bacteria. However, whether streamlining is frequently performed and safe in difficult situations, such as bacteremic pneumococcal pneumonia (BPP), has still to be assessed. All adult patients admitted to Dijon Hospital (France) from 2005 to 2013 who had BPP without complications, and were alive on the third day were enrolled. Clinical, biological, radiological, microbiological and therapeutic data were recorded. A first analysis was conducted to assess factors associated with being on amoxicillin on the third day. A second analysis, adjusting for a propensity score, was performed to determine whether 30-day mortality was associated with streamlining to amoxicillin monotherapy. Of the 196 patients hospitalized for BPP, 161 were still alive on the third day and were included in the study. Treatment was streamlined to amoxicillin in 60 patients (37%). Factors associated with not streamlining were severe pneumonia (OR 3.11, 95%CI [1.23-7.87]) and a first-line antibiotic combination (OR 3.08, 95%CI [1.34-7.09]). By contrast, starting with amoxicillin monotherapy correlated inversely with the risk of subsequent treatment with antibiotics other than amoxicillin (OR 0.06, 95%CI [0.01-0.30]). The Cox model adjusted for the propensity-score analysis showed that streamlining to amoxicillin during BPP was not significantly associated with a higher risk of 30-day mortality (HR 0.38, 95%CI [0.08-1.87]). Streamlining to amoxicillin is insufficiently implemented during BPP. This strategy is safe and potentially associated with ecological and economic benefits; therefore, it should be further encouraged, particularly when antibiotic combinations are started for severe pneumonia. Copyright © 2017. Published by Elsevier B.V.

  1. Photonomics: automation approaches yield economic aikido for photonics device manufacture

    Science.gov (United States)

    Jordan, Scott

    2002-09-01

    In the glory days of photonics, with exponentiating demand for photonics devices came exponentiating competition, with new ventures commencing deliveries seemingly weekly. Suddenly the industry was faced with a commodity marketplace well before a commodity cost structure was in place. Economic issues like cost, scalability, and yield (call it all "Photonomics") now drive the industry. Automation and throughput optimization are obvious answers, but until now, suitable modular tools had not been introduced. Available solutions were barely compatible with typical transverse alignment tolerances and could not automate angular alignments of collimated devices and arrays. And settling physics served as the insoluble bottleneck to throughput and resolution advancement in packaging, characterization and fabrication processes. The industry has addressed these needs in several ways, ranging from special configurations of catalog motion devices to integrated microrobots based on a novel mini-hexapod configuration. This intriguing approach allows tip/tilt alignments to be automated about any point in space, such as a beam waist, a focal point, the cleaved face of a fiber, or the optical axis of a waveguide, making it ideal for MEMS packaging automation and array alignment. Meanwhile, patented new low-cost settling-enhancement technology has been applied in applications ranging from air-bearing long-travel stages to subnanometer-resolution piezo positioners to advance resolution and process cycle times in sensitive applications such as optical coupling characterization and fiber Bragg grating generation. Background, examples and metrics are discussed, providing an up-to-date industry overview of available solutions.

  2. Automation of seismic network signal interpolation: an artificial intelligence approach

    International Nuclear Information System (INIS)

    Chiaruttini, C.; Roberto, V.

    1988-01-01

    After discussing the current status of automation in signal interpretation from seismic networks, a new approach, based on artificial-intelligence techniques, is proposed. The knowledge of the human expert analyst is examined, with emphasis on its objects, strategies and reasoning techniques. It is argued that knowledge-based systems (or expert systems) provide the most appropriate tools for designing an automatic system, modelled on the expert's behaviour.

  3. Streamlined approach for environmental restoration plan for corrective action unit 430, buried depleted uranium artillery round No. 1, Tonopah test range

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    This plan addresses actions necessary for the restoration and closure of Corrective Action Unit (CAU) No. 430, Buried Depleted Uranium (DU) Artillery Round No. 1 (Corrective Action Site No. TA-55-003-0960), a buried and unexploded W-79 Joint Test Assembly (JTA) artillery test projectile with high explosives (HE), at the U.S. Department of Energy, Nevada Operations Office (DOE/NV) Tonopah Test Range (TTR) in south-central Nevada. It describes activities that will occur at the site as well as the steps that will be taken to gather adequate data to obtain a notice of completion from Nevada Division of Environmental Protection (NDEP). This plan was prepared under the Streamlined Approach for Environmental Restoration (SAFER) concept, and it will be implemented in accordance with the Federal Facility Agreement and Consent Order (FFACO) and the Resource Conservation and Recovery Act (RCRA) Industrial Sites Quality Assurance Project Plan.

  4. Streamlined approach for environmental restoration plan for corrective action unit 430, buried depleted uranium artillery round No. 1, Tonopah test range

    International Nuclear Information System (INIS)

    1996-09-01

    This plan addresses actions necessary for the restoration and closure of Corrective Action Unit (CAU) No. 430, Buried Depleted Uranium (DU) Artillery Round No. 1 (Corrective Action Site No. TA-55-003-0960), a buried and unexploded W-79 Joint Test Assembly (JTA) artillery test projectile with high explosives (HE), at the U.S. Department of Energy, Nevada Operations Office (DOE/NV) Tonopah Test Range (TTR) in south-central Nevada. It describes activities that will occur at the site as well as the steps that will be taken to gather adequate data to obtain a notice of completion from Nevada Division of Environmental Protection (NDEP). This plan was prepared under the Streamlined Approach for Environmental Restoration (SAFER) concept, and it will be implemented in accordance with the Federal Facility Agreement and Consent Order (FFACO) and the Resource Conservation and Recovery Act (RCRA) Industrial Sites Quality Assurance Project Plan

  5. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volumetric-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  6. Streamlining the Bankability Process using International Standards

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Repins, Ingrid L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Kelly, George [Sunset Technology, Mount Airy, MD]; Ramu, Govind [SunPower, San Jose, California]; Heinz, Matthias [TUV Rheinland, Cologne, Germany]; Chen, Yingnan [CGC (China General Certification Center), Beijing]; Wohlgemuth, John [PowerMark, Union Hall, VA]; Lokanath, Sumanth [First Solar, Tempe, Arizona]; Daniels, Eric [Suncycle USA, Frederick, MD]; Hsi, Edward [Swiss RE, Zurich, Switzerland]; Yamamichi, Masaaki [RTS, Trumbull, CT]

    2017-09-27

    NREL has supported the international efforts to create a streamlined process for documenting bankability and/or completion of each step of a PV project plan. IECRE was created for this purpose in 2014. This poster describes the goals, current status of this effort, and how individuals and companies can become involved.

  7. Streamline-concentration balance model for in-situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.; Schechter, R.S.; Humenick, M.J.

    1981-03-01

    This work presents two computer models. One describes in-situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes.
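
    The core of such a model, the concentration balance along one streamline, can be sketched as a simple marching scheme. This is a minimal illustration assuming steady flow and first-order dissolution kinetics; the velocity, rate constant, and saturation values are placeholders, not data from the report.

```python
# Sketch: concentration balance along a single streamline with
# first-order dissolution kinetics (illustrative values throughout).
import numpy as np

def march_streamline(length_m=100.0, n=1000, v=1e-4, k=1e-6, c_sat=50.0):
    """March the steady advection-reaction balance v dC/ds = k (C_sat - C)
    along one streamline; returns distance and concentration arrays."""
    s = np.linspace(0.0, length_m, n)
    ds = s[1] - s[0]
    c = np.zeros(n)  # lixiviant enters barren (C = 0)
    for i in range(1, n):
        c[i] = c[i - 1] + ds * (k / v) * (c_sat - c[i - 1])
    return s, c

s, c = march_streamline()
print(f"produced concentration at the well: {c[-1]:.2f} mg/L")
```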

  8. Streamline-concentration balance model for in situ uranium leaching and site restoration

    International Nuclear Information System (INIS)

    Bommer, P.M.

    1979-01-01

    This work presents two computer models. One describes in situ uranium leaching and the other describes post leaching site restoration. Both models use a streamline generator to set up the flow field over the reservoir. The leaching model then uses the flow data in a concentration balance along each streamline coupled with the appropriate reaction kinetics to calculate uranium production. The restoration model uses the same procedure except that binary cation exchange is used as the restoring mechanism along each streamline and leaching cation clean up is simulated. The mathematical basis for each model is shown in detail along with the computational schemes used. Finally, the two models have been used with several data sets to point out their capabilities and to illustrate important leaching and restoration parameters and schemes.

  9. An Integrated Systems Approach: A Description of an Automated Circulation Management System.

    Science.gov (United States)

    Seifert, Jan E.; And Others

    These bidding specifications describe requirements for a turn-key automated circulation system for the University of Oklahoma Libraries. An integrated systems approach is planned, and requirements are presented for various subsystems: acquisitions, fund accounting, reserve room, and bibliographic and serials control. Also outlined are hardware…

  10. Streamlining Research by Using Existing Tools

    OpenAIRE

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...

  11. 48 CFR 12.602 - Streamlined evaluation of offers.

    Science.gov (United States)

    2010-10-01

    ... offers. 12.602 Section 12.602 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... for Commercial Items 12.602 Streamlined evaluation of offers. (a) When evaluation factors are used... evaluation factors. (b) Offers shall be evaluated in accordance with the criteria contained in the...

  12. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    Science.gov (United States)

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price-premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (API) enable integration into automated, customized DNA design processes. The results presented herein highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
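
    A minimal sketch of the kind of synthesis-constraint screening BOOST performs is shown below; the GC-content window and homopolymer limit are illustrative placeholders, and the real BOOST rule set is considerably richer.

```python
# Sketch of BOOST-style synthesis-constraint screening with
# hypothetical thresholds (not BOOST's actual rules).
import re

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in the sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def violations(seq, gc_lo=0.25, gc_hi=0.65, max_homopolymer=8):
    """Return human-readable synthesis-constraint violations."""
    seq = seq.upper()
    problems = []
    gc = gc_content(seq)
    if not gc_lo <= gc <= gc_hi:
        problems.append(f"GC content {gc:.2f} outside [{gc_lo}, {gc_hi}]")
    # A base repeated more than max_homopolymer times in a row.
    run = re.search(rf"([ACGT])\1{{{max_homopolymer},}}", seq)
    if run:
        problems.append(
            f"homopolymer run of {len(run.group(0))} at position {run.start()}")
    return problems

print(violations("ATGC" * 25 + "A" * 12))
```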

  13. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 113: Reactor Maintenance, Assembly, and Disassembly Building Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    J. L. Smith

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure in place of Corrective Action Unit (CAU) 113 Area 25 Reactor Maintenance, Assembly, and Disassembly Facility (R-MAD). CAU 113 is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (NDEP, 1996). The CAU is located in Area 25 of the Nevada Test Site (NTS) and consists of Corrective Action Site (CAS) 25-04-01, R-MAD Facility (Figures 1-2). This plan provides the methodology for closure in place of CAU 113. The site contains radiologically impacted and hazardous material. Based on preassessment field work, there is sufficient process knowledge to close in place CAU 113 using the SAFER process. At a future date when funding becomes available, the R-MAD Building (25-3110) will be demolished and inaccessible radiologic waste will be properly disposed in the Area 3 Radiological Waste Management Site (RWMS).

  14. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 113: Reactor Maintenance, Assembly, and Disassembly Building Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    Smith, J. L.

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure in place of Corrective Action Unit (CAU) 113 Area 25 Reactor Maintenance, Assembly, and Disassembly Facility (R-MAD). CAU 113 is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (NDEP, 1996). The CAU is located in Area 25 of the Nevada Test Site (NTS) and consists of Corrective Action Site (CAS) 25-04-01, R-MAD Facility (Figures 1-2). This plan provides the methodology for closure in place of CAU 113. The site contains radiologically impacted and hazardous material. Based on preassessment field work, there is sufficient process knowledge to close in place CAU 113 using the SAFER process. At a future date when funding becomes available, the R-MAD Building (25-3110) will be demolished and inaccessible radiologic waste will be properly disposed in the Area 3 Radiological Waste Management Site (RWMS).

  15. Case studies in geographic information systems for environmental streamlining

    Science.gov (United States)

    2012-05-31

    This 2012 summary report addresses the current use of geographic information systems (GIS) and related technologies by State Departments of Transportation (DOTs) for environmental streamlining and stewardship, particularly in relation to the National...

  16. PASA - A Program for Automated Protein NMR Backbone Signal Assignment by Pattern-Filtering Approach

    International Nuclear Information System (INIS)

    Xu Yizhuang; Wang Xiaoxia; Yang Jun; Vaynberg, Julia; Qin Jun

    2006-01-01

    We present a new program, PASA (Program for Automated Sequential Assignment), for assigning protein backbone resonances based on multidimensional heteronuclear NMR data. Distinct from existing programs, PASA emphasizes a per-residue-based pattern-filtering approach during the initial stage of the automated ¹³Cα and/or ¹³Cβ chemical shift matching. The pattern filter employs one or multiple constraints, such as ¹³Cα/¹³Cβ chemical shift ranges for different amino acid types and side-chain spin systems, which helps to rule out, in a stepwise fashion, improbable assignments resulting from resonance degeneracy or missing signals. Such a stepwise filtering approach substantially minimizes early false-linkage problems that often propagate, amplify, and ultimately cause complication or combinatorial explosion of the automation process. Our program (http://www.lerner.ccf.org/moleccard/qin/) was tested on four representative small- to large-sized proteins with various degrees of resonance degeneracy and missing signals, and we show that PASA achieved the assignments efficiently and rapidly, fully consistent with those obtained by laborious manual protocols. The results demonstrate that PASA may be a valuable tool for NMR-based structural analyses, genomics, and proteomics.
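
    The per-residue pattern filter can be illustrated with a toy version that prunes residue-type candidates by Cα/Cβ chemical shift ranges. The ranges below are rough, illustrative ppm values rather than PASA's actual statistics.

```python
# Sketch of a per-residue pattern filter in the spirit of PASA:
# prune candidate residue types whose Ca/Cb shifts fall outside
# typical ranges. Ranges (ppm) are rough illustrative values.
TYPICAL_SHIFTS = {
    # residue type: ((Ca_min, Ca_max), (Cb_min, Cb_max))
    "ALA": ((50.0, 55.5), (16.0, 21.5)),
    "GLY": ((43.0, 47.5), (None, None)),  # glycine has no Cbeta
    "SER": ((56.0, 61.5), (62.5, 66.5)),
    "THR": ((59.0, 66.0), (68.0, 72.5)),
}

def candidate_types(ca, cb, tolerance=0.5):
    """Return residue types compatible with observed Ca/Cb shifts."""
    hits = []
    for rtype, ((ca_lo, ca_hi), (cb_lo, cb_hi)) in TYPICAL_SHIFTS.items():
        if not (ca_lo - tolerance <= ca <= ca_hi + tolerance):
            continue
        if cb is None:
            if cb_lo is None:          # only Cbeta-less types qualify
                hits.append(rtype)
        elif cb_lo is not None and cb_lo - tolerance <= cb <= cb_hi + tolerance:
            hits.append(rtype)
    return hits

print(candidate_types(ca=52.3, cb=18.9))  # -> ['ALA']
```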

  17. Main approaches to automation of management systems in the coal industry. [Czechoslovakia

    Energy Technology Data Exchange (ETDEWEB)

    Zafouk, P; Dlabaya, I; Frous, S

    1980-01-01

    The main approaches to automation of management systems in the coal industry of Czechoslovakia are enumerated. Organizational structure of the branch and concern form of organization. Complex improvement of management system and source of continued development of the branch. Automated control systems, an integral part of the complex management system. Primary problem - automation in the area of design of the information system. Centralization of methodological management of operations in the area of control system development. Unified approach to breakdown of control system into branches. Organizational support of the development of the control system, problems solved by the department of control system development of the Ministry, main department of control system development of the Research Institute, departmental committees in the branch. The use of principles of control system development in the Ostravsko-Karvinsk mining concern is demonstrated. Preparation for development of the control system in the concern: elaboration of concepts and programs of control system development. Design of control system of the concern. Control system of an enterprise in the concern as an integral control system. Support of control system development in organizations, participants in this process, their jurisdiction and obligations. Annual plans of control system development. Centralized subsystems and enterprises. Methods of coordination of the process of improvement of control and support of the harmony of decisions made. Technical support of control system development, construction of a unified network of computer centers in enterprises with combined resources of computer technology.

  18. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
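
    Grammar-based test generation, the first half of TAO's approach, can be sketched as recursive random expansion of a context-free grammar; the toy grammar and depth cap below are invented for illustration.

```python
# Sketch of grammar-based test generation in the spirit of TAO
# (grammar, weights and output format are invented for illustration).
import random

GRAMMAR = {
    "<expr>": [["<num>"], ["<expr>", "+", "<expr>"], ["(", "<expr>", ")"]],
    "<num>": [["0"], ["1"], ["42"]],
}

def generate(symbol="<expr>", depth=0, max_depth=6):
    """Expand a nonterminal into a random test input string."""
    if symbol not in GRAMMAR:
        return symbol  # terminal
    # Force the shortest production when nesting too deeply.
    options = GRAMMAR[symbol] if depth < max_depth else [GRAMMAR[symbol][0]]
    production = random.choice(options)
    return "".join(generate(s, depth + 1, max_depth) for s in production)

random.seed(7)
print([generate() for _ in range(5)])
# A paired oracle could evaluate each expression to obtain its expected
# semantics, mirroring how TAO couples tests with generated oracles.
```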

  19. Mean streamline analysis for performance prediction of cross-flow fans

    International Nuclear Information System (INIS)

    Kim, Jae Won; Oh, Hyoung Woo

    2004-01-01

    This paper presents the mean streamline analysis using the empirical loss correlations for performance prediction of cross-flow fans. Comparison of overall performance predictions with test data of a cross-flow fan system with a simplified vortex wall scroll casing and with the published experimental characteristics for a cross-flow fan has been carried out to demonstrate the accuracy of the proposed method. Predicted performance curves by the present mean streamline analysis agree well with experimental data for two different cross-flow fans over the normal operating conditions. The prediction method presented herein can be used efficiently as a tool for the preliminary design and performance analysis of general-purpose cross-flow fans.

  20. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
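
    The classification rule described above reduces to a per-pixel NDSI threshold followed by a frequency threshold across the multi-year scene stack. The sketch below applies it to synthetic reflectance arrays; the 0.4 NDSI and 0.95 frequency values are illustrative stand-ins for the thresholds the study derives.

```python
# Sketch of persistent ice/snow classification over a scene stack
# (synthetic data; thresholds are illustrative, not the paper's).
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index per pixel."""
    return (green - swir) / (green + swir + 1e-9)

def persistent_snow_ice(green_stack, swir_stack,
                        ndsi_thresh=0.4, frequency_thresh=0.95):
    """Flag pixels classified snow/ice in nearly every late-summer
    scene (axis 0 = scenes) as persistent ice and snow cover."""
    snowy = ndsi(green_stack, swir_stack) > ndsi_thresh
    frequency = snowy.mean(axis=0)
    return frequency >= frequency_thresh

rng = np.random.default_rng(0)
green = rng.uniform(0.1, 0.9, size=(10, 4, 4))   # 10 scenes, 4x4 pixels
swir = rng.uniform(0.05, 0.3, size=(10, 4, 4))
print(persistent_snow_ice(green, swir).astype(int))
```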

  1. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial to the improvement of a robot workcell. Design automation of multi-function fingers is in high demand in robot industries to overcome the current iterative, time-consuming and complex manual design process. However, the existing approaches for multi-function finger design automation are unable to entirely meet the robot industries' need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  2. Robotic automation of medication-use management.

    Science.gov (United States)

    Enright, S M

    1993-11-01

    In the October 1993 issue of Physician Assistant, we published "Robots for Health Care," the first of two articles on the medical applications of robotics. That article discussed ways in which robots could help patients with manipulative disabilities to perform activities of daily living and hold paid employment; transfer patients from bed to chair and back again; add precision to the most exacting surgical procedures; and someday carry out diagnostic and therapeutic techniques from within the human body. This month, we are pleased to offer an article by Sharon Enright, an authority on pharmacy operations, who considers how an automated medication-management system that makes use of bar-code technology is capable of streamlining drug dispensing, controlling safety, increasing cost-effectiveness, and ensuring accurate and complete record-keeping.

  3. An Evaluation of the Acquisition Streamlining Methods at the Fleet and Industrial Supply Center Pearl Harbor Hawaii

    National Research Council Canada - National Science Library

    Henry, Mark

    1999-01-01

    ...) Pearl Harbor's implementation of acquisition streamlining initiatives and recommends viable methods of streamlining the acquisition process at FISC Pearl Harbor and other Naval Supply Systems Command...

  4. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Science.gov (United States)

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  5. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    Directory of Open Access Journals (Sweden)

    Markus Niklasson

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  6. Geologic storage of carbon dioxide and enhanced oil recovery. I. Uncertainty quantification employing a streamline based proxy for reservoir flow simulation

    International Nuclear Information System (INIS)

    Kovscek, A.R.; Wang, Y.

    2005-01-01

    Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline-based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to true reservoir geology. The streamline approach is demonstrated to be thorough and rapid.

  7. The benefits of life cycle inventory parametric models in streamlining data collection. A case study in the wooden pallet sector

    DEFF Research Database (Denmark)

    Niero, Monia; Di Felice, F.; Ren, J.

    2014-01-01

    LCA methodology is time and resource consuming, particularly when it comes to data collection and handling; therefore companies, particularly Small and Medium Enterprises (SMEs), are inclined to use streamlined approaches to shorten the resource-consuming life cycle inventory (LCI) phase. This paper presents a case study of an SME in the wooden pallet sector, investigating to what extent the use of parametric LCI models can be beneficial both in evaluating the environmental impacts of similar products and in providing a preliminary assessment of the potential environmental impacts of new products. We developed an LCI parametric model describing the LCI of a range of wooden pallets and tested its effectiveness with a reference product, namely a non-reversible pallet with four-way blocks. The identified parameters refer to the technical characteristics of the product system, e.g. the number and dimension…
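
    The idea of a parametric LCI model is that inventory flows, and hence impacts, are written as functions of a few product parameters, so screening a new pallet variant only needs new parameter values. A minimal sketch, with invented parameter names and emission factors:

```python
# Sketch of a parametric LCI model: impacts scale with product
# parameters. Names and factors are invented placeholders, not the
# paper's data.
def pallet_gwp(n_boards, board_volume_m3, n_nails,
               gwp_wood_per_m3=60.0, gwp_nail=0.02):
    """Rough cradle-to-gate GWP (kg CO2-eq) for one wooden pallet."""
    wood = n_boards * board_volume_m3 * gwp_wood_per_m3
    nails = n_nails * gwp_nail
    return wood + nails

# Screening a new design only requires new parameter values:
print(pallet_gwp(n_boards=11, board_volume_m3=0.004, n_nails=78))
```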

  8. Analysis of Streamline Separation at Infinity Using Time-Discrete Markov Chains.

    Science.gov (United States)

    Reich, W; Scheuermann, G

    2012-12-01

    Existing methods for analyzing separation of streamlines are often restricted to a finite time or a local area. In our paper we introduce a new method that complements them by allowing an infinite-time evaluation of steady planar vector fields. Our algorithm unifies combinatorial and probabilistic methods and introduces the concept of separation in time-discrete Markov chains. We compute particle distributions instead of the streamlines of single particles. We encode the flow into a map and then into a transition matrix for each time direction. Finally, we compare the results of our grid-independent algorithm to the popular finite-time Lyapunov exponents and discuss the discrepancies.
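
    The encoding step, flow map to row-stochastic transition matrix to evolving particle distribution, can be sketched on a toy one-dimensional drift flow; the cell count, drift probabilities, and step count below are arbitrary illustration choices.

```python
# Sketch: encode a toy steady flow map into a transition matrix and
# advance a particle distribution, in the spirit of the time-discrete
# Markov-chain analysis above (toy flow, not the authors' data).
import numpy as np

n = 50                         # number of spatial cells
P = np.zeros((n, n))           # row-stochastic transition matrix
for i in range(n):
    # Toy flow: drift right with a little numerical diffusion.
    j = min(i + 1, n - 1)
    P[i, j] += 0.8
    P[i, i] += 0.2

dist = np.zeros(n)
dist[0] = 1.0                  # all particles start in cell 0
for _ in range(100):           # advance 100 discrete time steps
    dist = dist @ P

print(f"mass in last cell after 100 steps: {dist[-1]:.3f}")
```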

  9. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  10. Automated Generation of OCL Constraints: NL based Approach vs Pattern Based Approach

    Directory of Open Access Journals (Sweden)

    IMRAN SARWAR BAJWA

    2017-04-01

    This paper presents an approach for the automated generation of software constraints. In this model, an SBVR (Semantics of Business Vocabulary and Rules) based semi-formal representation is obtained from the syntactic and semantic analysis of an NL (Natural Language, such as English) sentence. An SBVR representation is easy to translate to other formal languages, as SBVR is based on higher-order logic like other formal languages such as OCL (Object Constraint Language). The proposed model provides a systematic and powerful means of incorporating NL knowledge into formal languages. A prototype is constructed in Java (an Eclipse plug-in) as a proof of the concept. The performance was tested on a few sample texts taken from existing research thesis reports and books.
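
    A heavily simplified, pattern-based stand-in for this pipeline (skipping the SBVR intermediate representation entirely) can still show the NL-to-OCL mapping; the sentence pattern and generated constraint below are invented for illustration.

```python
# Toy pattern-based NL-to-OCL translation (simplified invention;
# SBVR-based pipelines are far richer than a single regex).
import re

# Matches sentences like: "The age of a customer must be greater than 18"
PATTERN = re.compile(
    r"The (?P<attr>\w+) of an? (?P<cls>\w+) must be greater than (?P<val>\d+)",
    re.IGNORECASE)

def to_ocl(sentence: str) -> str:
    m = PATTERN.match(sentence.strip())
    if not m:
        raise ValueError("sentence not covered by this toy pattern")
    return (f"context {m['cls'].capitalize()} "
            f"inv: self.{m['attr']} > {m['val']}")

print(to_ocl("The age of a customer must be greater than 18"))
# -> context Customer inv: self.age > 18
```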

  11. Analysis Streamlining in ATLAS

    CERN Document Server

    Heinrich, Lukas; The ATLAS collaboration

    2018-01-01

    We present recent work within the ATLAS collaboration to centrally provide tools to facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured, but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...

  12. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Directory of Open Access Journals (Sweden)

    August Betzler

    2014-08-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  13. A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-01-01

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004

  14. A holistic approach to ZigBee performance enhancement for home automation networks.

    Science.gov (United States)

    Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep

    2014-08-14

    Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.

  15. Adaptation : A Partially Automated Approach

    NARCIS (Netherlands)

    Manjing, Tham; Bukhsh, F.A.; Weigand, H.

    2014-01-01

    This paper showcases the possibility of creating an adaptive auditing system. Adaptation in an audit environment needs human intervention at some point. Based on a case study, this paper focuses on automation of the adaptation process. It is divided into solution design and validation parts. The artifact…

  16. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 574: Neptune, Nevada National Security Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Environmental Restoration

    2011-08-31

    This Streamlined Approach for Environmental Restoration (SAFER) Plan identifies the activities required for closure of Corrective Action Unit (CAU) 574, Neptune. CAU 574 is included in the Federal Facility Agreement and Consent Order (FFACO) (1996 [as amended March 2010]) and consists of the following two Corrective Action Sites (CASs) located in Area 12 of the Nevada National Security Site: (1) CAS 12-23-10, U12c.03 Crater (Neptune); (2) CAS 12-45-01, U12e.05 Crater (Blanca). This plan provides the methodology for the field activities that will be performed to gather the necessary information for closure of the two CASs. There is sufficient information and process knowledge regarding the expected nature and extent of potential contaminants to recommend closure of CAU 574 using the SAFER process. Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, field screening, analytical results, the results of the data quality objective (DQO) process (Section 3.0), and an evaluation of corrective action alternatives (Appendix B), closure in place with administrative controls is the expected closure strategy for CAU 574. Additional information will be obtained by conducting a field investigation to verify and support the expected closure strategy and provide a defensible recommendation that no further corrective action is necessary. This will be presented in a Closure Report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval.

  17. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    The US Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants. 24 refs

  18. Approach to plant automation with evolving technology

    International Nuclear Information System (INIS)

    White, J.D.

    1989-01-01

    This paper reports that the U.S. Department of Energy has provided support to Oak Ridge National Laboratory in order to pursue research leading to advanced, automated control of new innovative liquid-metal-cooled nuclear power plants. The purpose of this effort is to conduct research that will help to ensure improved operability, reliability, and safety for advanced LMRs. The plan adopted to achieve these program goals in an efficient and timely manner consists of utilizing, and advancing where required, state-of-the-art controls technology through close interaction with other national laboratories, universities, industry and utilities. A broad range of applications for the control systems strategies and the design environment developed in the course of this program is likely. A natural evolution of automated control in nuclear power plants is envisioned by ORNL to be a phased transition from today's situation of some analog control at the subsystem level with significant operator interaction to the future capability for completely automated digital control with operator supervision. The technical accomplishments provided by this program will assist the industry to accelerate this transition and provide greater economy and safety. The development of this transition to advanced, automated control system designs is expected to have extensive benefits in reduced operating costs, fewer outages, enhanced safety, improved licensability, and improved public acceptance for commercial nuclear power plants.

  19. An approach to evaluate task allocation between operators and automation with respect to safety of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Macwan, A.; Wei, Z.G.; Wieringa, P.A.

    1994-01-01

    Even though the use of automation is increasing in complex systems, its effect on safety cannot be systematically analyzed using current techniques. Of particular interest is task allocation between operators and automation. In evaluating its effect on safety, a quantitative definition of degree of automation (doA) is used. The definition of doA accounts for the effect of task on safety, irrespective of whether the task is carried out by operator or automation. Also included is the indirect effect due to the change in workload perceived by the operator. This workload is translated into stress which affects operator performance, expressed as human error probability, and subsequently, safety. The approach can be useful for evaluation of existing task allocation schemes as well as in making decisions about task allocation between operators and automation. (author). 13 refs, 1 fig
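
    The indirect workload pathway described here can be made concrete with a toy model in which automating tasks shifts failure probability from direct human error to machine faults while raising the operator's monitoring workload. The functional forms and constants below are invented for illustration and are not the authors' quantitative definition of doA.

```python
# Toy model of the doA evaluation idea: automation trades human error
# for machine faults while increasing monitoring workload. All
# functional forms and constants are illustrative inventions.
import math

def human_error_probability(workload, hep_nominal=1e-3):
    # Assumption: HEP grows exponentially with perceived workload.
    return min(1.0, hep_nominal * math.exp(2.0 * workload))

def system_failure_probability(doa, machine_fail=1e-4):
    """Blend operator and automation contributions for doA in [0, 1]."""
    workload = 0.5 + 0.4 * doa     # assumed monitoring-load model
    hep = human_error_probability(workload)
    return doa * machine_fail + (1.0 - doa) * hep

for doa in (0.0, 0.5, 0.9):
    print(f"doA={doa:.1f} -> P(failure)={system_failure_probability(doa):.2e}")
```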

  20. The Effects of Propulsive Jetting on Drag of a Streamlined body

    Science.gov (United States)

    Krieg, Michael; Mohseni, Kamran

    2017-11-01

    Recently an abundance of bioinspired underwater vehicles have emerged to leverage eons of evolution. Our group has developed a propulsion technique inspired by jellyfish and squid. Propulsive jets are generated by ingesting and expelling water from a flexible internal cavity. We have demonstrated thruster capabilities for maneuvering on AUV platforms, where the internal thruster geometry minimized forward drag; however, such a setup cannot characterize propulsive efficiency. Therefore, we created a new streamlined vehicle platform that produces unsteady jets for forward propulsion rather than maneuvering. The streamlined jetting body is placed in a water tunnel and held stationary while jetting frequency and background flow velocity are varied. For each frequency/velocity pair the flow field is measured around the surface and in the wake using PIV. Using the zero jetting frequency as a baseline for each background velocity, the passive body drag is related to the velocity distribution. For cases with active jetting the drag and jetting forces are estimated from the velocity field and compared to the passive case. For this streamlined body, the entrainment of surrounding flow into the propulsive jet can reduce drag forces in addition to the momentum transfer of the jet itself. Office of Naval Research.

  1. Streamline Patterns and their Bifurcations near a wall with Navier slip Boundary Conditions

    DEFF Research Database (Denmark)

    Tophøj, Laust; Møller, Søren; Brøns, Morten

    2006-01-01

    We consider the two-dimensional topology of streamlines near a surface where the Navier slip boundary condition applies. Using transformations to bring the streamfunction into a simple normal form, we obtain bifurcation diagrams of streamline patterns under variation of one or two external parameters. Topologically, these are identical to the ones previously found for no-slip surfaces. We use the theory to analyze the Stokes flow inside a circle, and show how it can be used to predict new bifurcation phenomena. ©2006 American Institute of Physics.

  2. Automated drumlin shape and volume estimation using high resolution LiDAR imagery (Curvature Based Relief Separation): A test from the Wadena Drumlin Field, Minnesota

    Science.gov (United States)

    Yu, Peter; Eyles, Nick; Sookhan, Shane

    2015-10-01

    Resolving the origin(s) of drumlins and related megaridges in areas of megascale glacial lineations (MSGL) left by paleo-ice sheets is critical to understanding how ancient ice sheets interacted with their sediment beds. MSGL is now linked with fast-flowing ice streams but there is a broad range of erosional and depositional models. Further progress is reliant on constraining fluxes of subglacial sediment at the ice sheet base which in turn is dependent on morphological data such as landform shape and elongation and most importantly landform volume. Past practice in determining shape has employed a broad range of geomorphological methods from strictly visualisation techniques to more complex semi-automated and automated drumlin extraction methods. This paper reviews and builds on currently available visualisation, semi-automated and automated extraction methods and presents a new, Curvature Based Relief Separation (CBRS) technique; for drumlin mapping. This uses curvature analysis to generate a base level from which topography can be normalized and drumlin volume can be derived. This methodology is tested using a high resolution (3 m) LiDAR elevation dataset from the Wadena Drumlin Field, Minnesota, USA, which was constructed by the Wadena Lobe of the Laurentide Ice Sheet ca. 20,000 years ago and which as a whole contains 2000 drumlins across an area of 7500 km2. This analysis demonstrates that CBRS provides an objective and robust procedure for automated drumlin extraction. There is strong agreement with manually selected landforms but the method is also capable of resolving features that were not detectable manually thereby considerably expanding the known population of streamlined landforms. CBRS provides an effective automatic method for visualisation of large areas of the streamlined beds of former ice sheets and for modelling sediment fluxes below ice sheets.
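
    In outline, CBRS derives a base level from the smoothed DEM, uses curvature to separate streamlined relief from that base, and integrates the positive residual as landform volume. The sketch below applies that outline to a synthetic drumlin-like surface; the smoothing radius, thresholds, and the use of a Laplacian as the curvature measure are illustrative simplifications of the paper's method.

```python
# Sketch of curvature-based relief separation on a synthetic DEM.
# Thresholds and smoothing radius are illustrative choices.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

cell = 3.0                                   # 3 m LiDAR grid spacing
x, y = np.meshgrid(np.linspace(-60, 60, 41), np.linspace(-150, 150, 101))
dem = 0.002 * x + 8.0 * np.exp(-(x / 25) ** 2 - (y / 75) ** 2)  # drumlin on a tilt

base = gaussian_filter(dem, sigma=15)        # long-wavelength trend surface
curvature = laplace(dem) / cell ** 2         # Laplacian as curvature proxy
residual = dem - base                        # relief above the base level

drumlin_mask = (residual > 0.5) & (curvature < 0.0)   # raised and convex-up
volume = residual[drumlin_mask].sum() * cell ** 2     # integrate residual
print(f"extracted cells: {drumlin_mask.sum()}, volume ~ {volume:.0f} m^3")
```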

  3. Evaluation of two streamlined life cycle assessment methods

    International Nuclear Information System (INIS)

    Hochschorner, Elisabeth; Finnveden, Goeran; Johansson, Jessica

    2002-02-01

    Two different methods for streamlined life cycle assessment (LCA) are described: the MECO-method and SLCA. Both methods are tested on a previously performed case study on cars fuelled with petrol or ethanol, and on electric cars with electricity produced from hydro power or coal. The report also contains some background information on LCA and streamlined LCA, and a description of the case study used. The evaluation of the MECO- and SLCA-methods is based on a comparison of the results from the case study as well as on practical aspects. One conclusion is that the SLCA-method has some limitations: the whole life cycle is not covered, it requires quite a lot of information, and there is room for arbitrariness. It is not very flexible, and it is difficult to develop further. We therefore do not recommend the SLCA-method. The MECO-method, in comparison, shows several attractive features. It is also interesting to note that the MECO-method produces information that is complementary to that of a more traditional quantitative LCA. We suggest that the MECO-method needs some further development and adjustment to Swedish conditions.

  4. Study of streamline flow in the portal system

    International Nuclear Information System (INIS)

    Atkins, H.L.; Deitch, J.S.; Oster, Z.H.; Perkes, E.A.

    1985-01-01

    The study was undertaken to determine if streamline flow occurs in the portal vein, thus separating inflow from the superior mesenteric artery (SMA) and the inferior mesenteric artery. Previously published data on this subject are inconsistent. Patients undergoing abdominal angiography received two administrations of Tc-99m sulfur colloid, first via the SMA during angiography and, after completion of the angiographic procedure, via a peripheral vein (IV). Anterior images of the liver were recorded over a three-minute acquisition before and after the IV injection without moving the patient. The image from the SMA injection was subtracted from the SMA and IV image to provide a pure IV image. Analysis of R to L ratios for selected regions of interest as well as whole lobes was carried out and the shift of R to L (SMA to IV) determined. Six patients had liver metastases from the colon, four had cirrhosis and four had no known liver disease. The shift in the ratio was highly variable without a consistent pattern. Large changes in some patients could be attributed to hepatic artery flow directed to metastases. No consistent evidence for streamlining of portal flow was discerned.

  5. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
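
    The optimisation loop can be illustrated with a toy (1+1) evolutionary strategy over task orderings, where the model-checking step is replaced by a closed-form expected fault cost. All tasks, probabilities, and the cost model below are invented; the real framework evaluates BPMN workflows by stochastic model checking instead.

```python
# Toy evolutionary restructuring loop: mutate task order, keep the
# child if its expected fault cost is no worse. Illustrative only.
import random

TASKS = {"inspect": (1.0, 0.01), "weld": (3.0, 0.10), "paint": (2.0, 0.05)}

def expected_cost(order):
    """Duration plus rework: a fault forces repeating downstream work."""
    cost = 0.0
    for i, t in enumerate(order):
        dur, p_fault = TASKS[t]
        downstream = sum(TASKS[u][0] for u in order[i:])
        cost += dur + p_fault * downstream
    return cost

def mutate(order):
    a, b = random.sample(range(len(order)), 2)
    child = list(order)
    child[a], child[b] = child[b], child[a]
    return child

random.seed(1)
best = list(TASKS)
for _ in range(200):                 # (1+1) evolutionary strategy
    child = mutate(best)
    if expected_cost(child) <= expected_cost(best):
        best = child
print(best, round(expected_cost(best), 2))
```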

  6. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to use automated assessments in online courses. Open edX platform is used as the online courses platform. The new assessment type uses Scilab as learning and solution validation tool. This approach allows to use automated individual variant generation and automated solution checks without involving the course…

  7. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 415: Project 57 No. 1 Plutonium Dispersion (NTTR), Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick; Burmeister, Mark

    2014-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 415, Project 57 No. 1 Plutonium Dispersion (NTTR). CAU 415 is located on Range 4808A of the Nevada Test and Training Range (NTTR) and consists of one corrective action site: NAFR-23-02, Pu Contaminated Soil. The CAU 415 site consists of the atmospheric release of radiological contaminants to surface soil from the Project 57 safety experiment conducted in 1957. The safety experiment released plutonium (Pu), uranium (U), and americium (Am) to the surface soil over an area of approximately 1.9 square miles. This area is currently fenced and posted as a radiological contamination area. Vehicles and debris contaminated by the experiment were subsequently buried in a disposal trench within the surface-contaminated, fenced area and are assumed to have released radiological contamination to subsurface soils. Potential source materials in the form of pole-mounted electrical transformers were also identified at the site and will be removed as part of closure activities.

  8. Automating Hyperspectral Data for Rapid Response in Volcanic Emergencies

    Science.gov (United States)

    Davies, Ashley G.; Doubleday, Joshua R.; Chien, Steve A.

    2013-01-01

    In a volcanic emergency, time is of the essence. It is vital to quantify eruption parameters (thermal emission, effusion rate, location of activity) and distribute this information as quickly as possible to decision-makers in order to enable effective evaluation of eruption-related risk and hazard. The goal of this work was to automate and streamline processing of spacecraft hyperspectral data, automate product generation, and automate distribution of products. [Figure: visible and short-wave infrared images of a volcanic eruption in Iceland in May 2010.] The software rapidly processes hyperspectral data, correcting for incident sunlight and atmospheric transmission where necessary; detects thermally anomalous pixels; fits data with model black-body thermal emission spectra to determine radiant flux; calculates atmospheric convection thermal removal; and then calculates total heat loss. From these results, an estimation of effusion rate is made. Maps are generated of thermal emission and location (see figure). Products are posted online, and relevant parties notified. Effusion rate data are added to the historical record and plotted to identify spikes in activity for persistently active eruptions. The entire process from start to end is autonomous. Future spacecraft, especially those in deep space, can react to detection of transient processes without the need to communicate with Earth, thus increasing science return. Terrestrially, this removes the need for human intervention.
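
    Two of the pipeline steps, flagging thermally anomalous pixels and converting their radiance to temperature and radiant flux, can be sketched with a single-band Planck inversion and the Stefan-Boltzmann law. The physical constants are standard; the band choice, anomaly threshold, radiance values, and 30 m pixel size are illustrative assumptions, not the mission's actual parameters.

```python
# Sketch: single-band brightness temperature and radiant flux for
# thermally anomalous pixels (illustrative threshold and values).
import numpy as np

H, C, KB, SIGMA = 6.626e-34, 2.998e8, 1.381e-23, 5.67e-8
WAVELENGTH = 2.2e-6                      # m, a SWIR hyperspectral band

def brightness_temperature(radiance):
    """Invert Planck's law (W m^-2 sr^-1 m^-1) at one wavelength."""
    c1 = 2 * H * C ** 2 / WAVELENGTH ** 5
    return (H * C / (WAVELENGTH * KB)) / np.log1p(c1 / radiance)

pixels = np.array([5e3, 8e9, 2.1e10])    # band radiances; hot pixels stand out
hot = pixels > 1e9                       # illustrative anomaly threshold
temps = brightness_temperature(pixels[hot])
flux = SIGMA * temps ** 4 * 30.0 ** 2    # W per 30 m x 30 m hot pixel
print(np.round(temps, 0), np.round(flux / 1e6, 1), "MW per pixel")
```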

  9. Streamlining environmental product declarations: a stage model

    Science.gov (United States)

    Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael

    2001-02-01

    General public environmental awareness and education is increasing, therefore stimulating the demand for reliable, objective and comparable information about products' environmental performances. The recently published standard series ISO 14040 and ISO 14025 are normalizing the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations have experimented the preparation of EPDs (mostly from Europe), demonstrating its great potential as a marketing weapon. However the preparation of EPDs is a complex process, requiring collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In a foreseeable future, the streamlining of the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) in order to make them capable of gathering, and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain in order to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is two fold: first to propose a transitional model towards green supply chain management and EPD preparation; second to identify key technologies and methodologies allowing to streamline the EPD process and subsequently the transition toward sustainable product development

  10. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  11. Streamlined Approach for Environmental Restoration Work Plan for Corrective Action Unit 461: Joint Test Assembly Sites and Corrective Action Unit 495: Unconfirmed Joint Test Assembly Sites Tonopah Test Range, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Jeff Smith

    1998-08-01

    This Streamlined Approach for Environmental Restoration plan addresses the actions necessary for the clean closure of Corrective Action Unit 461 (Test Area Joint Test Assembly Sites) and Corrective Action Unit 495 (Unconfirmed Joint Test Assembly Sites). The Corrective Action Units are located at the Tonopah Test Range in south central Nevada. Closure of these sites will be completed by excavating and evaluating the condition of each artillery round (if found); detonating the rounds (if necessary); excavating the impacted soil and debris; collecting verification samples; backfilling the excavations; and disposing of the impacted soil and debris at an approved low-level waste repository at the Nevada Test Site.

  12. A Crowd-Based Intelligence Approach for Measurable Security, Privacy, and Dependability in Internet of Automated Vehicles with Vehicular Fog

    Directory of Open Access Journals (Sweden)

    Ashish Rauniyar

    2018-01-01

    Full Text Available With the advent of the Internet of things (IoT) and cloud computing technologies, we are in the era of automation, device-to-device (D2D) and machine-to-machine (M2M) communications. Automated vehicles have recently gained huge attention worldwide, and they have created a new wave of revolution in automobile industries. However, in order to fully establish automated vehicles and their connectivity to the surroundings, security, privacy, and dependability always remain crucial issues. One cannot deny the fact that such automated vehicles are highly vulnerable to different kinds of security attacks. Also, such systems today are built from generic components. Prior analysis of different attack trends and vulnerabilities enables us to deploy security solutions effectively. Moreover, scientific research has shown that a “group” can perform better than individuals in making decisions and predictions. Therefore, this paper deals with the measurable security, privacy, and dependability of automated vehicles through a crowd-based intelligence approach inspired by swarm intelligence. We have studied three use case scenarios of automated vehicles and systems with vehicular fog and have analyzed the security, privacy, and dependability metrics of such systems. Our systematic approaches to measuring efficient system configuration, security, privacy, and dependability of automated vehicles are essential for getting the overall picture of the system, such as design patterns, best practices for system configuration, metrics, and measurements.

  13. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 575: Area 15 Miscellaneous Sites, Nevada National Security Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)]

    2014-12-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 575, Area 15 Miscellaneous Sites, identified in the Federal Facility Agreement and Consent Order (FFACO). CAU 575 comprises the following four corrective action sites (CASs) located in Area 15 of the Nevada National Security Site: 15-19-02, Waste Burial Pit; 15-30-01, Surface Features at Borehole Sites; 15-64-01, Decontamination Area; and 15-99-03, Aggregate Plant. This plan provides the methodology for field activities needed to gather the necessary information for closing each CAS. There is sufficient information and process knowledge from historical documentation and investigations of similar sites regarding the expected nature and extent of potential contaminants to recommend closure of CAU 575 using the SAFER process. Additional information will be obtained by conducting a field investigation to document and verify the adequacy of existing information, to affirm the predicted corrective action decisions, and to provide sufficient data to implement the corrective actions. This will be presented in a closure report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval.

  14. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  15. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regard to large-scale projects (LS-RE). However, EIA laws have also been neglected by many decision-makers, who have underestimated their impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments currently have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper attempts to assess how such reform efforts can act as a means of balancing environmental protection and climate change mitigation with socio-economic challenges. The paper thereby fills this intellectual void by identifying the strengths and weaknesses of the Japanese EIA law and contrasting it with the recently revised EIA Directive of the European Union (EU). This enables the identification of the regulatory provisions that impact RE development the most and the determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through the application of a mixed-methods approach, consisting of in-depth literary and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies, in combination with the views of EIA professionals, academics, and law- and policymakers, allows for a more comprehensive assessment of which streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European

  16. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects have pushed forward the development of automated crystallization platforms that are now commonly used. This has created an urgent need for adapted and automated equipment for crystal analysis. However, these crystals first have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam is the bottleneck in the analysis process. In this article, a new method to accelerate this process by accurately recording the local geometry coordinates of each crystal in the crystallization plate is presented. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Here, preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)
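
    The core of such a geometrical approach is a rigid transform from plate-local crystal coordinates into the beam (robot) frame. Here is a minimal 2-D sketch under assumed calibration values; the actual pipeline, robot, and reference frames in the article are more involved.

```python
import numpy as np

# Rigid transform: beam = R(theta) @ plate_xy + t.
# theta and t would come from calibrating fiducial marks on the plate;
# the numbers below are made up for illustration.
theta = np.deg2rad(1.5)                  # plate rotation vs. robot frame
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = np.array([102.0, 54.3])              # plate origin in robot frame (mm)

def plate_to_beam(xy_plate):
    """Map a crystal position recorded on the plate into beam coordinates."""
    return R @ np.asarray(xy_plate) + t

crystal = (3.2, 7.9)                      # recorded local coordinates (mm)
print(plate_to_beam(crystal))             # target for the six-axis robot
```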

  17. Southern Ocean overturning across streamlines in an eddying simulation of the Antarctic Circumpolar Current

    Directory of Open Access Journals (Sweden)

    A. M. Treguier

    2007-12-01

    Full Text Available An eddying global model is used to study the characteristics of the Antarctic Circumpolar Current (ACC) in a streamline-following framework. Previous model-based estimates of the meridional circulation were calculated using zonal averages: this method leads to a counter-intuitive poleward circulation of the less dense waters, and underestimates the eddy effects. We show that, on the contrary, the upper ocean circulation across streamlines agrees with the theoretical view: an equatorward mean flow partially cancelled by a poleward eddy mass flux. Two model simulations, in which the buoyancy forcing above the ACC changes from positive to negative, suggest that the relationship between the residual meridional circulation and the surface buoyancy flux is not as straightforward as assumed by the simplest theoretical models: the sign of the residual circulation cannot be inferred from the surface buoyancy forcing only. Among the other processes that likely play a part in setting the meridional circulation, our model results emphasize the complex three-dimensional structure of the ACC (probably not well accounted for in streamline-averaged, two-dimensional models) and the distinct roles of temperature and salinity in the definition of the density field. Heat and salt transports by the time-mean flow are important even across time-mean streamlines. Heat and salt are balanced in the ACC, the model drift being small, but the nonlinearity of the equation of state cannot be ignored in the density balance.

  18. Streamlining genomes: toward the generation of simplified and stabilized microbial systems

    NARCIS (Netherlands)

    Leprince, A.; Passel, van M.W.J.; Martins Dos Santos, V.A.P.

    2012-01-01

    At the junction between systems and synthetic biology, genome streamlining provides a solid foundation both for increased understanding of cellular circuitry, and for the tailoring of microbial chassis towards innovative biotechnological applications. Iterative genomic deletions (targeted and

  19. Unique encoding for streamline topologies of incompressible and inviscid flows in multiply connected domains

    Energy Technology Data Exchange (ETDEWEB)

    Sakajo, T [Department of Mathematics, Kyoto University, Kitashirakawa Oiwake-cho, Sakyo-ku, Kyoto 606-8502 (Japan)]; Sawamura, Y; Yokoyama, T, E-mail: sakajo@math.kyoto-u.ac.jp [JST CREST, Kawaguchi, Saitama 332-0012 (Japan)]

    2014-06-01

    This study considers the flow of incompressible and inviscid fluid in two-dimensional multiply connected domains. For such flows, encoding algorithms to assign a unique sequence of words to any structurally stable streamline topology based on the theory presented by Yokoyama and Sakajo (2013 Proc. R. Soc. A 469 20120558) are proposed. As an application, we utilize the algorithms to characterize the evolution of an incompressible and viscid flow around a flat plate inclined to the uniform flow in terms of the change of the word representations for their instantaneous streamline topologies. (papers)

  20. West Virginia peer exchange : streamlining highway safety improvement program project delivery.

    Science.gov (United States)

    2015-01-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences : for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September : 22 to 23, 2014 in Charleston, We...

  1. Development of a web-based tool for automated processing and cataloging of a unique combinatorial drug screen.

    Science.gov (United States)

    Dalecki, Alex G; Wolschendorf, Frank

    2016-07-01

    Facing totally resistant bacteria, traditional drug discovery efforts have proven to be of limited use in replenishing our depleted arsenal of therapeutic antibiotics. Recently, the natural anti-bacterial properties of metal ions in synergy with metal-coordinating ligands have shown potential for generating new candidate molecules with therapeutic downstream applications. We recently developed a novel combinatorial screening approach to identify compounds with copper-dependent anti-bacterial properties. Through a parallel screening technique, the assay distinguishes between copper-dependent and copper-independent activities against Mycobacterium tuberculosis, with hits defined as compounds with copper-dependent activities. These activities must then be linked to a compound master list to process and analyze the data and to identify the hit molecules, a labor-intensive and mistake-prone analysis. Here, we describe a software program built to automate this analysis in order to streamline our workflow significantly. We conducted a small, 1,440-compound screen against M. tuberculosis and used it as an example framework to build and optimize the software. Though specifically adapted to our own needs, it can be readily expanded for any small- to medium-throughput screening effort, parallel or conventional. Further, by virtue of the underlying Linux server, it can be easily adapted for chemoinformatic analysis of screens through packages such as OpenBabel. Overall, this setup represents an easy-to-use solution for streamlining the processing and analysis of biological screening data, as well as offering a scaffold for ready functionality expansion. Copyright © 2016 Elsevier B.V. All rights reserved.
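
    The hit definition described, activity in the copper-supplemented plate but not in its copper-free counterpart, joined against a compound master list, is straightforward to automate. A minimal sketch with invented well data follows; the published tool is a full web application, not this script.

```python
# Hypothetical parallel-plate readout: percent growth inhibition per well,
# once with and once without copper. Hits inhibit only in the +Cu condition.
master = {"A01": "CMPD-0001", "A02": "CMPD-0002", "A03": "CMPD-0003"}
plus_cu  = {"A01": 92.0, "A02": 88.0, "A03": 5.0}
minus_cu = {"A01": 90.0, "A02": 12.0, "A03": 4.0}

THRESHOLD = 50.0  # arbitrary inhibition cutoff for illustration

hits = [master[w] for w in master
        if plus_cu[w] >= THRESHOLD and minus_cu[w] < THRESHOLD]
print(hits)  # -> ['CMPD-0002'], copper-dependent activity only
```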

  2. Streamlined approach for environmental restoration work plan for Corrective Action Unit 126: Closure of aboveground storage tanks, Nevada Test Site, Nevada. Revision 1

    International Nuclear Information System (INIS)

    1998-07-01

    This plan addresses the closure of several aboveground storage tanks in Area 25 of the Nevada Test Site. The unit is currently identified as Corrective Action Unit 126 in the Federal Facility Agreement and Consent Order and is listed as having six Corrective Action Sites. This plan addresses the Streamlined Approach for Environmental Restoration closure for five of the six sites. Four of the CASs are located at the Engine Test Stand complex and one is located in the Central Support Area. The sites consist of aboveground tanks, two of which stored diesel fuel and one of which stored Nalcool (an antifreeze mixture). The remaining tanks were used as part of a water demineralization process and stored either sulfuric acid or sodium hydroxide, and one was used as a charcoal adsorption furnace. Closure will be completed by removing the associated piping, tank supports, and tanks using a front end loader, backhoe, and/or crane. When possible, the tanks will be salvaged as scrap metal. Piping that is not removed will be sealed using a cement grout.

  3. Lightroom 5 streamlining your digital photography process

    CERN Document Server

    Sylvan, Rob

    2014-01-01

    Manage your images with Lightroom and this beautifully illustrated guide. Image management can soak up huge amounts of a photographer's time, but help is on hand. This complete guide teaches you how to use Adobe Lightroom 5 to import, manage, edit, and showcase large quantities of images with impressive results. The authors, both professional photographers and Lightroom experts, walk you through step by step, demonstrating real-world techniques as well as a variety of practical tips, tricks, and shortcuts that save you time. Streamline image management tasks like a pro, and get back to doing

  4. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However...... (and reportedly one or two critics) can engage one another on several agreed questions about such frameworks. The goal is to aid non-aligned practitioners in choosing between alternative frameworks for their human-automation interaction design challenges....

  5. An agent-oriented approach to automated mission operations

    Science.gov (United States)

    Truszkowski, Walt; Odubiyi, Jide

    1994-01-01

    As we plan for the next generation of Mission Operations Control Center (MOCC) systems, there are many opportunities for the increased utilization of innovative knowledge-based technologies. The innovative technology discussed here is an advanced use of agent-oriented approaches to the automation of mission operations. The paper presents an overview of this technology and discusses applied operational scenarios currently being investigated and prototyped. A major focus of the current work is the development of a simple user mechanism that would empower operations staff members to create, in real time, software agents to assist them in common, labor-intensive operations tasks. These operational tasks would include: handling routine data and information management functions; amplifying the capabilities of a spacecraft analyst/operator to rapidly identify, analyze, and correct spacecraft anomalies by correlating complex data/information sets and filtering error messages; improving routine monitoring and trend analysis by detecting common failure signatures; and serving as a sentinel for spacecraft changes during critical maneuvers, enhancing the system's capabilities to support nonroutine operational conditions with minimum additional staff. An agent-based testbed is under development. This testbed will allow us to: (1) more clearly understand the intricacies of applying agent-based technology in support of the advanced automation of mission operations and (2) assess the full set of benefits that can be realized by the proper application of agent-oriented technology in a mission operations environment. The testbed under development addresses some of the data management and report generation functions for the Explorer Platform (EP)/Extreme UltraViolet Explorer (EUVE) Flight Operations Team (FOT). We present an overview of agent-oriented technology and a detailed report on the operations concept for the testbed.
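
    One of the agent tasks described, detecting common failure signatures in a telemetry stream, can be sketched as a small stateful agent. Everything below (message format, signature, threshold) is invented for illustration; the testbed's actual interfaces are not specified in the abstract.

```python
from collections import Counter

class SentinelAgent:
    """Toy agent: counts recurring error signatures and raises an alert
    once a signature repeats up to a threshold."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.seen = Counter()

    def observe(self, message):
        if message.startswith("ERR"):
            self.seen[message] += 1
            if self.seen[message] == self.threshold:
                return f"ALERT: recurring failure signature {message!r}"
        return None

agent = SentinelAgent()
stream = ["OK", "ERR:GYRO_SAT", "OK", "ERR:GYRO_SAT", "ERR:GYRO_SAT"]
for msg in stream:
    alert = agent.observe(msg)
    if alert:
        print(alert)
```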

  6. Fast and Accurate Approaches for Large-Scale, Automated Mapping of Food Diaries on Food Composition Tables

    Directory of Open Access Journals (Sweden)

    Marc Lamarine

    2018-05-01

    Full Text Available Aim of Study: The use of weighed food diaries in nutritional studies provides a powerful method to quantify food and nutrient intakes. Yet, mapping these records onto food composition tables (FCTs) is a challenging, time-consuming and error-prone process. Experts make this effort manually, and no automation has previously been proposed. Our study aimed to assess automated approaches to map food items onto FCTs. Methods: We used food diaries (~170,000 records pertaining to 4,200 unique food items) from the DiOGenes randomized clinical trial. We attempted to map these items onto six FCTs available from the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English translation. Top matching pairs were reviewed manually to derive performance metrics: precision (the percentage of correctly mapped items) and recall (the percentage of mapped items). Results: The simpler approach, fuzzy matching, provided very good performance. Under a relaxed threshold (score > 50%), this approach enabled remapping of 99.49% of the items with a precision of 88.75%. With a slightly more stringent threshold (score > 63%), the precision could be significantly improved to 96.81% while keeping a recall rate > 95% (i.e., only 5% of the queried items would not be mapped). The machine learning approach did not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We
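
    The fuzzy name-matching stage can be approximated with nothing more than the Python standard library; the authors' packages are in R, so the sketch below, with a made-up mini-FCT, only illustrates the score-threshold idea (the 0.63 cutoff echoes the abstract's 63% threshold).

```python
import difflib

# Tiny stand-in for a food composition table (FCT); entries are invented.
fct = ["bread, whole wheat", "milk, semi-skimmed", "apple, raw", "cheddar cheese"]

def map_item(diary_name, threshold=0.63):
    """Return (best FCT match or None if below threshold, similarity score)."""
    scores = [(difflib.SequenceMatcher(None, diary_name.lower(), f).ratio(), f)
              for f in fct]
    score, best = max(scores)
    return (best if score >= threshold else None), round(score, 2)

for item in ["wholewheat bread", "skimmed milk", "dragon fruit"]:
    print(item, "->", map_item(item))
```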

  7. Automated delineation and characterization of drumlins using a localized contour tree approach

    Science.gov (United States)

    Wang, Shujie; Wu, Qiusheng; Ward, Dylan

    2017-10-01

    Drumlins are ubiquitous landforms in previously glaciated regions, formed through a series of complex subglacial processes operating underneath paleo-ice sheets. Accurate delineation and characterization of drumlins are essential for understanding their formation mechanism as well as the flow behaviors and basal conditions of paleo-ice sheets. Automated mapping of drumlins is particularly important for examining their distribution patterns across large spatial scales. This paper presents an automated vector-based approach to mapping drumlins from high-resolution light detection and ranging (LiDAR) data. The rationale is to extract a set of concentric contours by building localized contour trees and establishing topological relationships. This automated method can overcome the shortcomings of previous manual and automated methods for mapping drumlins, for instance, the azimuthal biases introduced during the generation of shaded relief images. A case study was carried out over a portion of the New York Drumlin Field. Overall, 1181 drumlins were identified from the LiDAR-derived DEM across the study region, a number that had been underestimated in previous literature. The delineation results were visually and statistically compared to manual digitization results. The morphology of drumlins was characterized by quantifying the length, width, elongation ratio, height, area, and volume. Statistical and spatial analyses were conducted to examine the distribution pattern and spatial variability of drumlin size and form. The drumlins and their morphologic characteristics exhibit significant spatial clustering rather than randomly distributed patterns. The form of drumlins varies from ovoid to spindle shapes toward the downstream direction of paleo ice flows, along with decreases in width, area, and volume. This observation is in line with previous studies and may be explained by variations in sediment thickness and/or velocity increases of the ice flows.
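
    The essence of a localized contour tree is a nesting test between closed contours: a chain of strictly concentric contours marks a discrete hill such as a drumlin. The following schematic sketch applies shapely containment to pre-extracted contour polygons; real DEM contouring (e.g., via skimage) and the paper's full tree construction are omitted, and all coordinates are invented.

```python
from shapely.geometry import Polygon

# Pretend these closed contours were extracted from a LiDAR DEM at fixed
# elevation intervals (coordinates are invented).
contours = {
    100: Polygon([(0, 0), (10, 0), (10, 10), (0, 10)]),
    101: Polygon([(2, 2), (8, 2), (8, 8), (2, 8)]),
    102: Polygon([(4, 4), (6, 4), (6, 6), (4, 6)]),
}

# Link successive levels: a contour's parent is the contour below it that
# contains it. An unbroken chain of nested contours is a candidate drumlin.
levels = sorted(contours)
chain = [levels[0]]
for lo, hi in zip(levels, levels[1:]):
    if contours[lo].contains(contours[hi]):
        chain.append(hi)

print("nested contour levels:", chain)   # -> [100, 101, 102]
print("candidate drumlin" if len(chain) >= 3 else "no feature")
```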

  8. Two Automated Techniques for Carotid Lumen Diameter Measurement: Regional versus Boundary Approaches.

    Science.gov (United States)

    Araki, Tadashi; Kumar, P Krishna; Suri, Harman S; Ikeda, Nobutaka; Gupta, Ajay; Saba, Luca; Rajan, Jeny; Lavra, Francesco; Sharma, Aditya M; Shafique, Shoaib; Nicolaides, Andrew; Laird, John R; Suri, Jasjit S

    2016-07-01

    The degree of stenosis in the carotid artery can be predicted using the automated carotid lumen diameter (LD) measured from B-mode ultrasound images. Systolic velocity-based methods for measurement of LD are subjective. With the advancement of high-resolution imaging, image-based methods have started to emerge. However, they require robust image analysis for accurate LD measurement. This paper presents two different algorithms for automated segmentation of the lumen borders in carotid ultrasound images. Both algorithms are modeled as a two-stage process. Stage one consists of a global-based model using a scale-space framework for the extraction of the region of interest. This stage is common to both algorithms. Stage two is modeled using a local-based strategy that extracts the lumen interfaces. At this stage, algorithm-1 is modeled as a region-based strategy using a classification framework, whereas algorithm-2 is modeled as a boundary-based approach that uses the level set framework. Two databases (DB), a Japan DB (JDB) (202 patients, 404 images) and a Hong Kong DB (HKDB) (50 patients, 300 images), were used in this study. Two trained neuroradiologists performed manual LD tracings. The mean automated LD measured was 6.35 ± 0.95 mm for the JDB and 6.20 ± 1.35 mm for the HKDB. The precision-of-merit was 97.4% and 98.0% w.r.t. the two manual tracings for the JDB, and 99.7% and 97.9% w.r.t. the two manual tracings for the HKDB. Statistical tests such as ANOVA, Chi-squared, t-test, and Mann-Whitney tests were conducted to show the stability and reliability of the automated techniques.
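
    The reported precision-of-merit compares automated diameters against manual tracings. A common formulation, assumed here since the paper's exact definition may differ, is one minus the mean relative error, expressed as a percentage; the paired measurements below are invented.

```python
import numpy as np

# Invented paired measurements (mm): automated vs. one observer's tracing.
auto   = np.array([6.1, 5.8, 7.2, 6.5])
manual = np.array([6.0, 6.0, 7.0, 6.6])

# Precision-of-merit as 100 * (1 - mean relative error).
pom = 100.0 * (1.0 - np.mean(np.abs(auto - manual) / manual))
print(f"precision-of-merit: {pom:.1f}%")
```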

  9. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  10. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered.

  11. Damage Detection with Streamlined Structural Health Monitoring Data

    OpenAIRE

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-01-01

    The huge amounts of sensor data generated by large-scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelm the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compressio...

  12. Zephyr: A secure Internet process to streamline engineering

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, C.W.; Niven, W.A.; Cavitt, R.E. [and others]

    1998-05-12

    Lawrence Livermore National Laboratory (LLNL) is implementing an Internet-based process pilot called 'Zephyr' to streamline engineering and commerce using the Internet. Major benefits have accrued from using Zephyr in facilitating industrial collaboration, speeding the engineering development cycle, reducing procurement time, and lowering overall costs. Programs at LLNL are capitalizing on the efficiencies introduced since implementing Zephyr. Zephyr's pilot functionality is undergoing full integration with Business Systems, Finance, and Vendors to support major programs at the Laboratory.

  13. Streamlining the license renewal review process

    International Nuclear Information System (INIS)

    Dozier, J.; Lee, S.; Kuo, P.T.

    2001-01-01

    The staff of the NRC has been developing three regulatory guidance documents for license renewal: the Generic Aging Lessons Learned (GALL) report, the Standard Review Plan for License Renewal (SRP-LR), and the Regulatory Guide (RG) for Standard Format and Content for Applications to Renew Nuclear Power Plant Operating Licenses. These documents are designed to streamline the license renewal review process by providing clear guidance for license renewal applicants and the NRC staff in preparing and reviewing license renewal applications. The GALL report systematically catalogs aging effects on structures and components; identifies the relevant existing plant programs; and evaluates the existing programs against the attributes considered necessary for an aging management program to be acceptable for license renewal. The GALL report also provides guidance for the augmentation of existing plant programs for license renewal. The revised SRP-LR allows an applicant to reference the GALL report to preclude further NRC staff evaluation if the plant's existing programs meet the criteria described in the GALL report. During the review process, the NRC staff will focus primarily on existing programs that should be augmented or new programs developed specifically for license renewal. The Regulatory Guide is expected to endorse the Nuclear Energy Institute (NEI) guideline, NEI 95-10, Revision 2, entitled 'Industry Guideline for Implementing the Requirements of 10 CFR Part 54 - The License Renewal Rule', which provides guidance for preparing a license renewal application. This paper will provide an introduction to the GALL report, SRP-LR, Regulatory Guide, and NEI 95-10 to show how these documents are interrelated and how they will be used to streamline the license renewal review process. This topic will be of interest to domestic power utilities considering license renewal and international ICONE participants seeking state-of-the-art information about license renewal in the United States.

  14. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Connell, C.W.; Conley, S.F.; Hildebrand, R.D.; Cunningham, D.E.

    2010-01-01

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel or 'samplers.' These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data are transferred to the database and the electronic form is filed in managed records, thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.
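
    The workflow described, an electronic field form whose contents post directly to the database with no re-keying, can be sketched with the standard library. The table and field names below are invented, not Hanford's HEIS/HWIS schemas.

```python
import sqlite3
from datetime import date

# Stand-in for one electronic field form filled out by a sampler.
form = {"well_id": "299-W10-1", "measured_on": date.today().isoformat(),
        "water_level_m": 12.37, "sampler": "jdoe"}

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE water_levels
               (well_id TEXT, measured_on TEXT,
                water_level_m REAL, sampler TEXT)""")
# Named placeholders bind the form fields directly; no manual transcription.
con.execute("INSERT INTO water_levels VALUES (:well_id, :measured_on,"
            " :water_level_m, :sampler)", form)

print(con.execute("SELECT * FROM water_levels").fetchall())
```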

  15. Automated mango fruit assessment using fuzzy logic approach

    Science.gov (United States)

    Hasan, Suzanawati Abu; Kin, Teoh Yeong; Sauddin@Sa'duddin, Suraiya; Aziz, Azlan Abdul; Othman, Mahmod; Mansor, Ab Razak; Parnabas, Vincent

    2014-06-01

    In terms of value and volume of production, mango is the third most important fruit crop after pineapple and banana. Accurate size assessment of mango fruits during harvesting is vital to ensure that they are classified into the appropriate grade. However, the current practice in the mango industry is to grade the fruit manually using human graders. This method is inconsistent, inefficient and labor intensive. In this project, a new method of automated mango size and grade assessment is developed using an RGB fiber optic sensor and a fuzzy logic approach. Maximum, minimum and mean values are calculated from the RGB fiber optic sensor readings, and a decision-making scheme based on a minimum entropy formulation analyses the data and classifies the mango fruit. The proposed method is capable of differentiating three different grades of mango fruit automatically, with 77.78% overall accuracy compared to sorting by human graders. This method was found to be helpful for application in the current agricultural industry.
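
    A fuzzy classification of this kind reduces to membership functions over the sensed values plus a decision rule. Below is a toy sketch with triangular memberships over a single size measurement; the grade boundaries are invented, and the paper's actual system works on RGB sensor values with a minimum-entropy formulation rather than this simple argmax.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade_mango(length_cm):
    # Membership of each grade; boundary values are illustrative only.
    memberships = {
        "grade C": tri(length_cm, 6.0, 8.0, 10.0),
        "grade B": tri(length_cm, 9.0, 11.0, 13.0),
        "grade A": tri(length_cm, 12.0, 14.0, 16.0),
    }
    return max(memberships, key=memberships.get), memberships

print(grade_mango(11.5))  # fuzzy memberships, then argmax as the decision
```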

  16. A streamlined failure mode and effects analysis

    International Nuclear Information System (INIS)

    Ford, Eric C.; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-01-01

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
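
    The RPN used here is simply the product of the three FMEA factors, after which ranking and thresholding follow directly. A minimal sketch: two failure-mode names echo the abstract, but all scores are invented.

```python
# RPN = occurrence x severity x detectability (10-point scales are typical).
# Scores below are invented for illustration, not the paper's data.
modes = [("delay in film check",        7, 6, 5),
         ("missing pacemaker protocol", 4, 9, 6),
         ("wrong patient shift",        2, 9, 3)]

ranked = sorted(((o * s * d, name) for name, o, s, d in modes), reverse=True)
for rpn, name in ranked:
    flag = "intervene" if rpn > 150 else "monitor"   # paper's 150 cutoff
    print(f"{rpn:4d}  {name}  -> {flag}")
```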

  17. A streamlined failure mode and effects analysis.

    Science.gov (United States)

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

    Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  18. A streamlined failure mode and effects analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD 21287 (United States)

    2014-06-15

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  19. Streamlining Smart Meter Data Analytics

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    2015-01-01

    Today smart meters are increasingly used worldwide. Smart meters are advanced meters capable of measuring customer energy consumption at a fine-grained time interval, e.g., every 15 minutes. The data are very sizable and might come from different sources, along with other socio-economic metrics such as the geographic information of meters, information about users and their property, and others, which makes data management very complex. On the other hand, data mining and the emerging cloud computing technologies make the collection, management, and analysis of the so-called big data possible. This can improve energy management, e.g., help utilities improve the management of energy and services, and help customers save money. In this regard, the paper focuses on building an innovative software solution to streamline smart meter data analytics, aiming at dealing...
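
    As a flavor of the analytics involved, 15-minute readings are typically aggregated to coarser intervals before profiling. A minimal pandas sketch on synthetic data follows; it is not the paper's software, just the resampling idea.

```python
import numpy as np
import pandas as pd

# Synthetic 15-minute consumption readings (kWh) for two days.
idx = pd.date_range("2015-01-01", periods=192, freq="15min")
readings = pd.Series(np.random.default_rng(0).uniform(0.05, 0.4, len(idx)),
                     index=idx, name="kwh")

daily = readings.resample("D").sum()                           # daily totals
hourly_profile = readings.groupby(readings.index.hour).mean()  # load shape
print(daily.round(1))
print(hourly_profile.round(3).head())
```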

  20. The impact of groundwater velocity fields on streamlines in an aquifer system with a discontinuous aquitard (Inner Mongolia, China)

    Science.gov (United States)

    Wu, Qiang; Zhao, Yingwang; Xu, Hua

    2018-04-01

    Many numerical methods that simulate groundwater flow, particularly the continuous Galerkin finite element method, do not produce velocity information directly. Many algorithms have been proposed to improve the accuracy of velocity fields computed from hydraulic potentials. This report presents the differences in the streamlines generated from velocity fields obtained using different algorithms. The superconvergence method employed by FEFLOW, a popular commercial code, and some dual-mesh methods proposed in recent years are selected for comparison. Applications that use streamlines to depict hydrogeologic conditions are considered, and errors in streamlines are shown to lead to notable errors in boundary conditions, the locations of material interfaces, fluxes and conductivities. Furthermore, the effects of the procedures used in these two types of methods, including velocity integration and local conservation, are analyzed. The method of interpolating velocities across edges using fluxes is shown to be able to eliminate errors associated with refraction points that are not located along material interfaces and streamline ends at no-flow boundaries. Local conservation is shown to be a crucial property of velocity fields and can result in more accurate streamline densities. A case study involving both three-dimensional and two-dimensional cross-sectional models of a coal mine in Inner Mongolia, China, is used to support the conclusions presented.
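
    A streamline is the integral curve of the velocity field, so any tracing scheme rests on a numerical ODE step. The sketch below traces one path through an analytic 2-D field with fixed-step RK4; real simulators instead interpolate cell-wise velocities and enforce local conservation, as discussed above, and the test field here is invented.

```python
import numpy as np

def velocity(p):
    """Analytic divergence-free test field (a simple vortex)."""
    x, y = p
    return np.array([-y, x])

def rk4_step(p, h):
    """One fourth-order Runge-Kutta step along the velocity field."""
    k1 = velocity(p)
    k2 = velocity(p + 0.5 * h * k1)
    k3 = velocity(p + 0.5 * h * k2)
    k4 = velocity(p + h * k3)
    return p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

p = np.array([1.0, 0.0])
path = [p]
for _ in range(100):                 # trace the streamline forward
    p = rk4_step(p, 0.05)
    path.append(p)
print(np.round(path[-1], 3))         # stays near the unit circle
```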

  1. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  2. Self streamlining wind tunnel: Further low speed testing and final design studies for the transonic facility

    Science.gov (United States)

    Wolf, S. W. D.

    1978-01-01

    Work was continued with the low speed self streamlining wind tunnel (SSWT) using the NACA 0012-64 airfoil in an effort to explain the discrepancies between the NASA Langley low turbulence pressure tunnel (LTPT) and SSWT results obtained with the airfoil stalled. Conventional wind tunnel corrections were applied to straight wall SSWT airfoil data to illustrate the inadequacy of standard correction techniques in circumstances of high blockage. Also, one SSWT test was re-run at different airspeeds to investigate the effects of such changes (perhaps through changes in Reynolds number and freestream turbulence levels) on airfoil data and wall contours. Mechanical design analyses for the transonic self-streamlining wind tunnel (TSWT) were completed by the application of theoretical airfoil flow field data to the elastic beam and streamline analysis. The control system for the transonic facility, which will eventually allow on-line computer operation of the wind tunnel, was outlined.

  3. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed...
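
    The mechanical core of automated HAZOP is enumerating guideword-parameter deviations for each plant node before reasoning about causes and consequences. The enumeration step alone is a few lines; the node and its parameters below are invented, and the cause-reasoning stage is omitted.

```python
from itertools import product

guidewords = ["NO", "MORE", "LESS", "REVERSE", "AS WELL AS"]
# Hypothetical process node with its analysable parameters.
node = {"name": "feed pump P-101", "parameters": ["flow", "pressure"]}

# Every guideword/parameter pair is a candidate deviation to analyse.
for gw, param in product(guidewords, node["parameters"]):
    print(f"{node['name']}: {gw} {param}")
```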

  4. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
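
    A viability assessment of the kind described reduces, after cell detection, to classifying each cell by stain intensity and reporting the live fraction. A schematic sketch on made-up per-cell intensities follows; the actual system first segments cells from microfluidic image streams, and the threshold is an assumption.

```python
# Mean dye intensity per detected cell; in exclusion-dye assays the stained
# (darker) cells are the non-viable ones. All numbers are invented.
cell_intensities = [0.12, 0.81, 0.15, 0.09, 0.77, 0.11, 0.18]
STAIN_CUTOFF = 0.5   # illustrative threshold separating stained cells

dead = sum(i > STAIN_CUTOFF for i in cell_intensities)
viability = 100.0 * (len(cell_intensities) - dead) / len(cell_intensities)
print(f"viability: {viability:.1f}% ({dead} stained of {len(cell_intensities)})")
```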

  5. Surrogate Based Optimization of Aerodynamic Noise for Streamlined Shape of High Speed Trains

    Directory of Open Access Journals (Sweden)

    Zhenxu Sun

    2017-02-01

    Full Text Available Aerodynamic noise increases with the sixth power of the running speed. As the speed increases, aerodynamic noise becomes predominant and begins to be the main noise source at a certain high speed. As a result, aerodynamic noise has to be a focus when designing new high-speed trains. In order to perform the aerodynamic noise optimization, the equivalent continuous sound pressure level (SPL) has been used in the present paper, which takes all of the far-field observation probes into consideration. The Non-Linear Acoustics Solver (NLAS) approach has been utilized for the acoustic calculation. With the use of a Kriging surrogate model, a multi-objective optimization of the streamlined shape of high-speed trains has been performed, which takes the noise level in the far field and the drag of the whole train as the objectives. To efficiently construct the Kriging model, the cross-validation approach has been adopted. Optimization results reveal that both the equivalent continuous sound pressure level and the drag of the whole train are reduced to a certain extent.
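
    Kriging is Gaussian-process regression, so the surrogate-plus-cross-validation step can be sketched with scikit-learn. The design variables, the objective, and the data below are all stand-ins for the actual streamlined-shape parameterization and NLAS-computed noise levels.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (30, 2))                 # two shape design variables
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2         # stand-in for the SPL objective

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-6)
# Cross-validation checks the surrogate's accuracy before optimization.
print("CV R^2:", cross_val_score(gp, X, y, cv=5).mean())

gp.fit(X, y)
mean, std = gp.predict([[0.4, 0.6]], return_std=True)  # prediction + uncertainty
print(mean, std)
```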

  6. A Generic Deep-Learning-Based Approach for Automated Surface Inspection.

    Science.gov (United States)

    Ren, Ruoxu; Hung, Terence; Tan, Kay Chen

    2018-03-01

    Automated surface inspection (ASI) is a challenging task in industry, as collecting training data is usually costly and related methods are highly dataset-dependent. In this paper, a generic approach that requires little training data for ASI is proposed. First, this approach builds a classifier on the features of image patches, where the features are transferred from a pretrained deep learning network. Next, pixel-wise prediction is obtained by convolving the trained classifier over the input image. Experiments on three public data sets and one industrial data set were carried out, involving two tasks: 1) image classification and 2) defect segmentation. The results of the proposed algorithm are compared against several of the best benchmarks in the literature. In the classification tasks, the proposed method improves accuracy by 0.66%-25.50%. In the segmentation tasks, the proposed method reduces error escape rates by 6.00%-19.00% in three defect types and improves accuracies by 2.29%-9.86% in all seven defect types. In addition, the proposed method achieves a 0.0% error escape rate in the segmentation task of the industrial data.
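
    The patch-classification stage, features transferred from a pretrained network feeding a conventional classifier, can be sketched with torchvision and scikit-learn. The backbone, layer choice, patch size, and labels here are assumptions for illustration, not the paper's configuration.

```python
import torch
from torch import nn
import torchvision.models as models
from sklearn.linear_model import LogisticRegression

# Pretrained backbone with its classification head removed: the remaining
# stack emits a fixed-length feature vector per image patch.
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = nn.Identity()
backbone.eval()

def features(patches):            # patches: float tensor (N, 3, 224, 224)
    with torch.no_grad():
        return backbone(patches).numpy()

# Dummy patches/labels stand in for defect vs. defect-free training data.
X = features(torch.randn(8, 3, 224, 224))
y = [0, 1, 0, 1, 0, 1, 0, 1]
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(features(torch.randn(2, 3, 224, 224))))
```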

  7. Computer system architecture for laboratory automation

    International Nuclear Information System (INIS)

    Penney, B.K.

    1978-01-01

    This paper describes the various approaches that may be taken to provide computing resources for laboratory automation. Three distinct approaches are identified, the single dedicated small computer, shared use of a larger computer, and a distributed approach in which resources are provided by a number of computers, linked together, and working in some cooperative way. The significance of the microprocessor in laboratory automation is discussed, and it is shown that it is not simply a cheap replacement of the minicomputer. (Auth.)

  8. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    Science.gov (United States)

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%-111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
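
    The precision and accuracy figures quoted are simple to reproduce for any quality-control series; a minimal sketch on invented replicate data (the spike level and readings are not from the paper):

```python
import numpy as np

# Invented inter-run replicates of a QC standard spiked at 50 pg/mL.
runs = np.array([47.9, 52.1, 49.4, 51.0, 48.8])
nominal = 50.0

cv = 100.0 * runs.std(ddof=1) / runs.mean()          # inter-run CV
accuracy = 100.0 * runs.mean() / nominal             # percent recovery
print(f"CV = {cv:.1f}%, accuracy = {accuracy:.1f}%")
```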

  9. Streamline topology: Patterns in fluid flows and their bifurcations

    DEFF Research Database (Denmark)

    Brøns, Morten

    2007-01-01

    Using dynamical systems theory, we consider structures such as vortices and separation in the streamline patterns of fluid flows. Bifurcation of patterns under variation of external parameters is studied using simplifying normal form transformations. Flows away from boundaries, flows close to fixed walls, and axisymmetric flows are analyzed in detail. We show how to apply the ideas from the theory to analyze numerical simulations of the vortex breakdown in a closed cylindrical container.

  10. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 408: Bomblet Target Area, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    NSTec Environmental Management

    2006-01-01

    This Streamlined Approach for Environmental Restoration Plan provides the details for the closure of Corrective Action Unit (CAU) 408, Bomblet Target Area. CAU 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order of 1996. One Corrective Action Site (CAS) is included in CAU 408: CAS TA-55-002-TAB2, Bomblet Target Areas. Based on historical documentation, personnel interviews, process knowledge, site visits, aerial photography, multispectral data, preliminary geophysical surveys, and the results of the data quality objectives process (Section 3.0), clean closure will be implemented for CAU 408. CAU 408 closure activities will consist of identification and clearance of bomblet target areas, identification and removal of depleted uranium (DU) fragments on South Antelope Lake, and collection of verification samples. Any soil containing contaminants at concentrations above the action levels will be excavated and transported to an appropriate disposal facility. Based on existing information, contaminants of potential concern at CAU 408 include explosives. In addition, bomblets containing DU were tested at South Antelope Lake. None of these contaminants is expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results. The corrective action investigation and closure activities have been planned to include data collection and hold points throughout the process. Hold points are designed to allow decision makers to review the existing data and decide which of the available options are most suitable. Hold points include the review of radiological, geophysical, and analytical data and field observations.

  11. Streamlining the Online Course Development Process by Using Project Management Tools

    Science.gov (United States)

    Abdous, M'hammed; He, Wu

    2008-01-01

    Managing the design and production of online courses is challenging. Insufficient instructional design and inefficient management often lead to issues such as poor course quality and course delivery delays. In an effort to facilitate, streamline, and improve the overall design and production of online courses, this article discusses how we…

  12. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 107: Low Impact Soil Sites, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    2009-01-01

    This Streamlined Approach for Environmental Restoration Plan covers activities associated with Corrective Action Unit (CAU) 107 of the Federal Facility Agreement and Consent Order (1996, as amended February 2008). CAU 107 consists of the following Corrective Action Sites (CASs) located in Areas 1, 2, 3, 4, 5, 9, 10, and 18 of the Nevada Test Site: CAS 01-23-02, Atmospheric Test Site - High Alt; CAS 02-23-02, Contaminated Areas (2); CAS 02-23-03, Contaminated Berm; CAS 02-23-10, Gourd-Amber Contamination Area; CAS 02-23-11, Sappho Contamination Area; CAS 02-23-12, Scuttle Contamination Area; CAS 03-23-24, Seaweed B Contamination Area; CAS 03-23-27, Adze Contamination Area; CAS 03-23-28, Manzanas Contamination Area; CAS 03-23-29, Truchas-Chamisal Contamination Area; CAS 04-23-02, Atmospheric Test Site T4-a; CAS 05-23-06, Atmospheric Test Site; CAS 09-23-06, Mound of Contaminated Soil; CAS 10-23-04, Atmospheric Test Site M-10; and CAS 18-23-02, U-18d Crater (Sulky). Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, engineering drawings, field screening, analytical results, and the results of the data quality objectives process (Section 3.0), closure in place with administrative controls or no further action will be implemented for CAU 107.

  13. Streamlining Compliance Validation Through Automation Processes

    Science.gov (United States)

    2014-03-01

    …Of course, a common standard for DoD security personnel to write and share compliance validation content would prevent duplicate work and aid in…process and consume much of the SCAP content available. Finally, it is free and easy to install as part of the Apache/MySQL/PHP (AMP) [37…

  14. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  15. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  16. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.
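
    The core of such a multiplicative (variance-inflating) diagnosis can be illustrated in a few lines. This is a minimal sketch of the underlying idea, not the authors' published algorithm; the data and function names are invented for illustration.

```python
import numpy as np

def variance_ratio_contributions(X_ref, X_fault):
    """Rank process variables by how much their variability inflates
    between fault-free reference data and data recorded under degraded
    performance. Rows are samples, columns are process variables."""
    var_ref = X_ref.var(axis=0, ddof=1)
    var_fault = X_fault.var(axis=0, ddof=1)
    ratio = var_fault / var_ref            # >> 1 flags a multiplicative fault
    return np.argsort(ratio)[::-1], ratio  # most-affected variables first

# Example: variable 2 carries a multiplicative fault (variance x9).
rng = np.random.default_rng(0)
X_ref = rng.normal(size=(500, 4))
X_fault = rng.normal(size=(500, 4))
X_fault[:, 2] *= 3.0
order, ratio = variance_ratio_contributions(X_ref, X_fault)
print(order[0], ratio.round(2))            # -> 2, with ratio[2] near 9
```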

  17. The role of streamline curvature in sand dune dynamics: evidence from field and wind tunnel measurements

    Science.gov (United States)

    Wiggs, Giles F. S.; Livingstone, Ian; Warren, Andrew

    1996-09-01

    Field measurements on an unvegetated, 10 m high barchan dune in Oman are compared with measurements over a 1:200 scale fixed model in a wind tunnel. Both the field and wind tunnel data demonstrate similar patterns of wind and shear velocity over the dune, confirming significant flow deceleration upwind of and at the toe of the dune, acceleration of flow up the windward slope, and deceleration between the crest and brink. This pattern, including the widely reported upwind reduction in shear velocity, reflects observations of previous studies. Such a reduction in shear velocity upwind of the dune should result in a reduction in sand transport and subsequent sand deposition. This is not observed in the field. Wind tunnel modelling using a near-surface pulse-wire probe suggests that the field method of shear velocity derivation is inadequate. The wind tunnel results exhibit no reduction in shear velocity upwind of or at the toe of the dune. Evidence provided by Reynolds stress profiles and turbulence intensities measured in the wind tunnel suggest that this maintenance of upwind shear stress may be a result of concave (unstable) streamline curvature. These additional surface stresses are not recorded by the techniques used in the field measurements. Using the occurrence of streamline curvature as a starting point, a new 2-D model of dune dynamics is deduced. This model relies on the establishment of an equilibrium between windward slope morphology, surface stresses induced by streamline curvature, and streamwise acceleration. Adopting the criteria that concave streamline curvature and streamwise acceleration both increase surface shear stress, whereas convex streamline curvature and deceleration have the opposite effect, the relationships between form and process are investigated in each of three morphologically distinct zones: the upwind interdune and concave toe region of the dune, the convex portion of the windward slope, and the crest-brink region…

  18. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management…

  19. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.

    2017-06-23

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40%–55% smaller force coefficients than those obtained in the formation of stable cavity wakes.

  20. Stable–streamlined and helical cavities following the impact of Leidenfrost spheres

    KAUST Repository

    Mansoor, Mohammad M.; Vakarelski, Ivan Uriev; Marston, J. O.; Truscott, T. T.; Thoroddsen, Sigurdur T

    2017-01-01

    We report results from an experimental study on the formation of stable–streamlined and helical cavity wakes following the free-surface impact of Leidenfrost spheres. Similar to the observations of Mansoor et al. (J. Fluid Mech., vol. 743, 2014, pp. 295–326), we show that acoustic ripples form along the interface of elongated cavities entrained in the presence of wall effects as soon as the primary cavity pinch-off takes place. The crests of these ripples can act as favourable points for closure, producing multiple acoustic pinch-offs, which are found to occur in an acoustic pinch-off cascade. We show that these ripples pacify with time in the absence of physical contact between the sphere and the liquid, leading to extremely smooth cavity wake profiles. More importantly, the downward-facing jet at the apex of the cavity is continually suppressed due to a skin-friction drag effect at the colliding cavity-wall junction, which ultimately produces a stable–streamlined cavity wake. This streamlined configuration is found to experience drag coefficients an order of magnitude lower than those acting on room-temperature spheres. A striking observation is the formation of helical cavities which occur for impact Reynolds numbers and are characterized by multiple interfacial ridges, stemming from and rotating synchronously about an evident contact line around the sphere equator. The contact line is shown to result from the degeneration of Kelvin–Helmholtz billows into turbulence which are observed forming along the liquid–vapour interface around the bottom hemisphere of the sphere. Using sphere trajectory measurements, we show that this helical cavity wake configuration has 40%–55% smaller force coefficients than those obtained in the formation of stable cavity wakes.

  1. Streamlining Collaboration for the Gravitational-wave Astronomy Community

    Science.gov (United States)

    Koranda, S.

    2016-12-01

    In the morning hours of September 14, 2015 the Laser Interferometer Gravitational-wave Observatory (LIGO) directly detected gravitational waves from inspiraling and coalescing black holes, confirming a major prediction of Albert Einstein's general theory of relativity and beginning the era of gravitational-wave astronomy. With the LIGO detectors in the United States, the Virgo and GEO detectors in Europe, and the KAGRA detector in Japan, the gravitational-wave astronomy community is opening a new window on our Universe. Realizing the full science potential of LIGO and the other interferometers requires global collaboration, not only within the gravitational-wave astronomy community but also with the astronomers and astrophysicists across multiple disciplines working to realize and leverage the power of multi-messenger astronomy. Enabling thousands of researchers from around the world and across multiple projects to efficiently collaborate, share, and analyze data, and providing streamlined access to services, computing, and tools, requires new and scalable approaches to identity and access management (IAM). We will discuss LIGO's IAM journey that began in 2007 and how today LIGO leverages internal identity federations like InCommon and eduGAIN to provide scalable and managed access for the gravitational-wave astronomy community. We will discuss the steps both large and small research organizations and projects take as their IAM infrastructure matures from ad-hoc silos of independent services to fully integrated and federated services that streamline collaboration, so that scientists can focus on research and not on managing passwords.

  2. The streamline upwind Petrov-Galerkin stabilising method for the numerical solution of highly advective problems

    Directory of Open Access Journals (Sweden)

    Carlos Humberto Galeano Urueña

    2009-05-01

    This article describes the streamline upwind Petrov-Galerkin (SUPG) method as a stabilisation technique for resolving the diffusion-advection-reaction equation by finite elements. The first part of this article gives a short analysis of the importance of this type of differential equation in modelling physical phenomena in multiple fields. A one-dimensional description of the SUPG method is then given, and this basis is extended to two and three dimensions. The outcome of a strongly advective experiment of high numerical complexity is presented. The results show how the implemented version of the SUPG technique allowed stabilised approximations in space, even for high Peclet numbers. Additional graphs of the numerical experiments presented here can be downloaded from www.gnum.unal.edu.co.
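
    For reference, the textbook one-dimensional form of the SUPG-stabilised problem sketched above (our transcription of the standard formulation, not copied from the article): for steady advection-diffusion a u' − ε u'' = f, the Galerkin weak form is augmented with a residual term weighted along the streamline direction,

```latex
\int_\Omega \left( \epsilon\, u_h' v_h' + a\, u_h' v_h \right) dx
  + \sum_{e} \int_{\Omega_e} \tau \,(a\, v_h')\left( a\, u_h' - \epsilon\, u_h'' - f \right) dx
  = \int_\Omega f\, v_h \, dx ,
\qquad
\tau = \frac{h}{2|a|}\left( \coth\alpha - \frac{1}{\alpha} \right) ,
\quad
\alpha = \frac{|a|\, h}{2\epsilon} ,
```

    where h is the element size and α the element Peclet number. Because τ → h/(2|a|) as α → ∞, the added diffusion acts only along streamlines, which is what keeps the approximation stable at the high Peclet numbers mentioned above.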

  3. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks.If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  4. A Streamlined Artificial Variable Free Version of Simplex Method

    OpenAIRE

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

    This paper proposes a streamlined form of the simplex method which provides some great benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints; it could start with any feasible or infeasible basis of an LP. This method follows the same pivoting sequence as simplex phase 1, without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new ...

  5. An automated and fast approach to detect single-trial visual evoked potentials with application to brain-computer interface.

    Science.gov (United States)

    Tu, Yiheng; Hung, Yeung Sam; Hu, Li; Huang, Gan; Hu, Yong; Zhang, Zhiguo

    2014-12-01

    This study aims (1) to develop an automated and fast approach for detecting visual evoked potentials (VEPs) in single trials and (2) to apply the single-trial VEP detection approach in designing a real-time and high-performance brain-computer interface (BCI) system. The single-trial VEP detection approach uses common spatial pattern (CSP) as a spatial filter and wavelet filtering (WF) as a temporal-spectral filter to jointly enhance the signal-to-noise ratio (SNR) of single-trial VEPs. The performance of the joint spatial-temporal-spectral filtering approach was assessed in a four-command VEP-based BCI system. The offline classification accuracy of the BCI system was significantly improved from 67.6±12.5% (raw data) to 97.3±2.1% (data filtered by CSP and WF). The proposed approach was successfully implemented in an online BCI system, where subjects could make 20 decisions in one minute with a classification accuracy of 90%. The proposed single-trial detection approach is able to obtain robust and reliable VEP waveforms in an automatic and fast way, and it is applicable in VEP-based online BCI systems. This approach provides a real-time and automated solution for single-trial detection of evoked potentials or event-related potentials (EPs/ERPs) in various paradigms, which could benefit many applications such as BCI and intraoperative monitoring. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
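
    The CSP stage lends itself to a compact sketch. The following is a generic CSP implementation via a generalized eigendecomposition, under assumed input shapes; it is not the authors' code, and the class split (VEP epochs versus background EEG) is our illustrative choice.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_a, epochs_b, n_filters=4):
    """Common spatial patterns. epochs_* have shape (trials, channels,
    samples); returns channel weightings that maximise the variance of
    class A (e.g. VEP epochs) relative to class B (e.g. background EEG)."""
    cov_a = np.mean([np.cov(e) for e in epochs_a], axis=0)
    cov_b = np.mean([np.cov(e) for e in epochs_b], axis=0)
    vals, vecs = eigh(cov_a, cov_a + cov_b)  # generalized eigenproblem
    return vecs[:, np.argsort(vals)[::-1][:n_filters]]

# A (channels, samples) trial is then projected with filters.T @ trial
# before the wavelet (temporal-spectral) filtering stage.
```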

  6. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 2: Approaches based on impregnated membranes and porous supports.

    Science.gov (United States)

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-11

    A critical overview on automation of modern liquid phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of drop, plug, film, or microflow have been surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. Simultaneously they allow to enlarge its contact surface and to reduce the risk of loss by incident flow or by components of surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve the procedure robustness and reproducibility as well as it "opens the door" to the new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in the achievement of better reliability, rapidness, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase and the different strategies of their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of LPME techniques from both parts as a promising area in the field of sample pretreatment is given. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Automated Subsystem Control for Life Support System (ASCLSS)

    Science.gov (United States)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates real-time process control and places accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), moving the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  8. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUI testing. Some Python programming experience is assumed.

  9. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    An approach to the automated management of load flow in engineering networks considering functional reliability is proposed in this article. The improvement of the concept of operational and strategic management of load flow in engineering networks is considered. The verbal statement of the problem for the thesis research is defined, namely, the problem of developing information technology for the exact calculation of the functional reliability of the network, or the risk of short delivery of the purpose-oriented product to consumers.

  10. Shuttle Repair Tools Automate Vehicle Maintenance

    Science.gov (United States)

    2013-01-01

    Successfully building, flying, and maintaining the space shuttles was an immensely complex job that required a high level of detailed, precise engineering. After each shuttle landed, it entered a maintenance, repair, and overhaul (MRO) phase. Each system was thoroughly checked and tested, and worn or damaged parts replaced, before the shuttle was rolled out for its next mission. During the MRO period, workers needed to record exactly what needed replacing and why, as well as follow precise guidelines and procedures in making their repairs. That meant traceability, and with it lots of paperwork. In 2007, the number of reports generated during electrical system repairs was getting out of hand, placing among the top three systems in terms of paperwork volume. Repair specialists at Kennedy Space Center were unhappy spending so much time at a desk and so little time actually working on the shuttle. "Engineers weren't spending their time doing technical work," says Joseph Schuh, an electrical engineer at Kennedy. "Instead, they were busy with repetitive, time-consuming processes that, while important in their own right, provided a low return on time invested." The strain of such inefficiency was bad enough that slow electrical repairs jeopardized rollout on several occasions. Knowing there had to be a way to streamline operations, Kennedy asked Martin Belson, a project manager with 30 years' experience as an aerospace contractor, to co-lead a team in developing software that would reduce the effort required to document shuttle repairs. The result was System Maintenance Automated Repair Tasks (SMART) software. SMART is a tool for aggregating and applying information on every aspect of repairs, from procedures and instructions to a vehicle's troubleshooting history. Drawing on that data, SMART largely automates the processes of generating repair instructions and post-repair paperwork. In the case of the space shuttle, this meant that SMART had 30 years' worth of operations…

  11. A Semi-automated Approach to Improve the Efficiency of Medical Imaging Segmentation for Haptic Rendering.

    Science.gov (United States)

    Banerjee, Pat; Hu, Mengqi; Kannan, Rahul; Krishnaswamy, Srinivasan

    2017-08-01

    The Sensimmer platform represents our ongoing research on simultaneous haptics and graphics rendering of 3D models. For simulation of medical and surgical procedures using Sensimmer, 3D models must be obtained from medical imaging data, such as magnetic resonance imaging (MRI) or computed tomography (CT). Image segmentation techniques are used to determine the anatomies of interest from the images. 3D models are obtained from segmentation and their triangle reduction is required for graphics and haptics rendering. This paper focuses on creating 3D models by automating the segmentation of CT images based on the pixel contrast for integrating the interface between Sensimmer and medical imaging devices, using the volumetric approach, Hough transform method, and manual centering method. Hence, automating the process has reduced the segmentation time by 56.35% while maintaining the same accuracy of the output at ±2 voxels.
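
    The Hough-transform step described above can be sketched with OpenCV. This is a hedged illustration of detecting a roughly circular anatomy in one 8-bit CT slice and thresholding on pixel contrast inside it; the radii, thresholds, and blur size are invented placeholders, not the Sensimmer pipeline's actual parameters.

```python
import cv2
import numpy as np

def segment_circular_anatomy(slice_u8, min_r=10, max_r=60):
    """Locate a circular structure in an 8-bit CT slice via the Hough
    transform, then keep high-contrast pixels inside the detected circle."""
    blurred = cv2.GaussianBlur(slice_u8, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=100, param1=120, param2=40,
                               minRadius=min_r, maxRadius=max_r)
    if circles is None:
        return None                        # nothing circular found
    x, y, r = np.round(circles[0, 0]).astype(int)
    mask = np.zeros_like(slice_u8)
    cv2.circle(mask, (x, y), r, 255, thickness=-1)
    _, seg = cv2.threshold(slice_u8, 0, 255,
                           cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return cv2.bitwise_and(seg, mask)      # segmented region of interest
```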

  12. Improving medical stores management through automation and effective communication.

    Science.gov (United States)

    Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Medical stores management in hospitals is a tedious and time-consuming chore, with limited resources tasked for the purpose and poor penetration of information technology. The process of automation is slow paced due to various inherent factors and is being challenged by increasing inventory loads and escalating budgets for the procurement of drugs. We carried out an in-depth case study at the medical stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of the QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to the depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital, with vendor database management and reaching out to clinicians, is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of the stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management.

  13. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 538: Spill Sites, Nevada Test Site, Nevada, Rev. No.: 0

    Energy Technology Data Exchange (ETDEWEB)

    Alfred Wickline

    2006-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions necessary for the closure of Corrective Action Unit (CAU) 538: Spill Sites, Nevada Test Site, Nevada. It has been developed in accordance with the Federal Facility Agreement and Consent Order (FFACO) (1996) that was agreed to by the State of Nevada, the U.S. Department of Energy (DOE), and the U.S. Department of Defense. A SAFER may be performed when the following criteria are met: (1) Conceptual corrective actions are clearly identified (although some degree of investigation may be necessary to select a specific corrective action before completion of the Corrective Action Investigation [CAI]). (2) Uncertainty of the nature, extent, and corrective action must be limited to an acceptable level of risk. (3) The SAFER Plan includes decision points and criteria for making data quality objective (DQO) decisions. The purpose of the investigation will be to document and verify the adequacy of existing information; to affirm the decision for either clean closure, closure in place, or no further action; and to provide sufficient data to implement the corrective action. The actual corrective action selected will be based on characterization activities implemented under this SAFER Plan. This SAFER Plan identifies decision points developed in cooperation with the Nevada Division of Environmental Protection (NDEP) and where DOE will reach consensus with NDEP before beginning the next phase of work.

  14. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    Automation of the technology and practice of the environmental laboratory has not been as rapid or complete as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analysis is not great. With the impending remediation of hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, with the future only guaranteeing greater demands. The Contaminant Analysis Automation Program has been initiated by the US government to address these current and future characterization needs by applying a new robotic paradigm to analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to the nation's environmental problems on-site.

  15. An Integrated Approach to Characterizing Bypassed Oil in Heterogeneous and Fractured Reservoirs Using Partitioning Tracers

    Energy Technology Data Exchange (ETDEWEB)

    Akhil Datta-Gupta

    2006-12-31

    We explore the use of efficient streamline-based simulation approaches for modeling partitioning interwell tracer tests in hydrocarbon reservoirs. Specifically, we utilize the unique features of streamline models to develop an efficient approach for interpretation and history matching of field tracer response. A critical aspect here is the underdetermined and highly ill-posed nature of the associated inverse problems. We have investigated the relative merits of the traditional history matching ('amplitude inversion') and a novel travel time inversion in terms of robustness of the method and convergence behavior of the solution. We show that the traditional amplitude inversion is orders of magnitude more non-linear and the solution here is likely to get trapped in local minimum, leading to inadequate history match. The proposed travel time inversion is shown to be extremely efficient and robust for practical field applications. The streamline approach is generalized to model water injection in naturally fractured reservoirs through the use of a dual media approach. The fractures and matrix are treated as separate continua that are connected through a transfer function, as in conventional finite difference simulators for modeling fractured systems. A detailed comparison with a commercial finite difference simulator shows very good agreement. Furthermore, an examination of the scaling behavior of the computation time indicates that the streamline approach is likely to result in significant savings for large-scale field applications. We also propose a novel approach to history matching finite-difference models that combines the advantage of the streamline models with the versatility of finite-difference simulation. In our approach, we utilize the streamline-derived sensitivities to facilitate history matching during finite-difference simulation. The use of finite-difference model allows us to account for detailed process physics and compressibility effects
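
    To make the contrast between the two inversion styles concrete, here is a minimal sketch of the competing objective functions (our illustration; the report's exact travel-time definition, such as peak arrival versus first arrival, may differ):

```python
import numpy as np

def amplitude_misfit(c_obs, c_sim):
    """Traditional history matching: least squares over the full tracer
    response -- strongly non-linear in the reservoir parameters."""
    return float(np.sum((c_obs - c_sim) ** 2))

def travel_time_misfit(t, c_obs, c_sim):
    """Travel-time inversion: match only the arrival time of the tracer
    peak, a quasi-linear and therefore far more robust objective."""
    return float((t[np.argmax(c_obs)] - t[np.argmax(c_sim)]) ** 2)

# t, c_obs, c_sim are the sample times and the observed/simulated
# breakthrough curves for one producing well (hypothetical arrays).
```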

  16. Investigating the effects of streamline-based fiber tractography on matrix scaling in brain connective network.

    Science.gov (United States)

    Jan, Hengtai; Chao, Yi-Ping; Cho, Kuan-Hung; Kuo, Li-Wei

    2013-01-01

    Investigating the brain connective network using modern graph theory has been widely applied in cognitive and clinical neuroscience research. In this study, we aimed to investigate the effects of streamline-based fiber tractography on the change of network properties and established a systematic framework for understanding how an adequate network matrix scaling can be determined. The network properties, including degree, efficiency, and betweenness centrality, show a similar tendency in both left and right hemispheres. By employing a curve-fitting process with an exponential law and measuring the residuals, the association between changes of network properties and the threshold on track numbers is found, and an adequate range for investigating the lateralization of the brain network is suggested. The proposed approach can be further applied in clinical applications to improve diagnostic sensitivity using network analysis with graph theory.
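
    The thresholding framework described above can be reproduced in outline with standard tools. A minimal sketch, assuming a streamline-count connectivity matrix; the metrics, the exponential fit, and all numbers are illustrative, not the authors' exact protocol.

```python
import numpy as np
import networkx as nx
from scipy.optimize import curve_fit

def properties_vs_threshold(track_counts, thresholds):
    """Binarise a streamline-count matrix at each track-number threshold
    and record (mean degree, global efficiency, mean betweenness)."""
    rows = []
    for th in thresholds:
        G = nx.from_numpy_array((track_counts >= th).astype(int))
        rows.append((np.mean([d for _, d in G.degree()]),
                     nx.global_efficiency(G),
                     np.mean(list(nx.betweenness_centrality(G).values()))))
    return np.array(rows)

# Toy symmetric matrix standing in for a tractography connectome.
rng = np.random.default_rng(1)
upper = np.triu(rng.integers(0, 100, size=(20, 20)), 1)
counts = upper + upper.T
ths = np.arange(1, 80, 5)
props = properties_vs_threshold(counts, ths)

# Exponential-law fit, as in the paper, to find where metrics level off.
decay = lambda x, a, b, c: a * np.exp(-b * x) + c
params, _ = curve_fit(decay, ths, props[:, 1], p0=(1.0, 0.05, 0.1))
```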

  17. Scalable Device for Automated Microbial Electroporation in a Digital Microfluidic Platform.

    Science.gov (United States)

    Madison, Andrew C; Royal, Matthew W; Vigneault, Frederic; Chen, Liji; Griffin, Peter B; Horowitz, Mark; Church, George M; Fair, Richard B

    2017-09-15

    Electrowetting-on-dielectric (EWD) digital microfluidic laboratory-on-a-chip platforms demonstrate excellent performance in automating labor-intensive protocols. When coupled with an on-chip electroporation capability, these systems hold promise for streamlining cumbersome processes such as multiplex automated genome engineering (MAGE). We integrated a single Ti:Au electroporation electrode into an otherwise standard parallel-plate EWD geometry to enable high-efficiency transformation of Escherichia coli with reporter plasmid DNA in a 200 nL droplet. Test devices exhibited robust operation, with more than 10 transformation experiments performed per device without cross-contamination or failure. Despite the intrinsic electric-field nonuniformity present in the EP/EWD device, the peak on-chip transformation efficiency was measured to be 8.6 ± 1.0 × 10⁸ cfu·μg⁻¹ for an average applied electric field strength of 2.25 ± 0.50 kV·mm⁻¹. Cell survival and transformation fractions at this electroporation pulse strength were found to be 1.5 ± 0.3% and 2.3 ± 0.1%, respectively. Our work expands the EWD toolkit to include on-chip microbial electroporation and opens the possibility of scaling advanced genome engineering methods, like MAGE, into the submicroliter regime.

  18. Automated Mobility Transitions: Governing Processes in the UK

    Directory of Open Access Journals (Sweden)

    Debbie Hopkins

    2018-03-01

    Contemporary systems of mobility are undergoing a transition towards automation. In the UK, this transition is being led by (often new) partnerships between incumbent manufacturers and new entrants, in collaboration with national governments, local/regional councils, and research institutions. This paper first offers a framework for analyzing the governance of the transition, adapting ideas from the Transition Management (TM) perspective, and then applies the framework to ongoing automated vehicle transition dynamics in the UK. The empirical analysis suggests that the UK has adopted a reasonably comprehensive approach to the governing of automated vehicle innovation but that this approach cannot be characterized as sufficiently inclusive, democratic, diverse and open. The lack of inclusivity, democracy, diversity and openness is symptomatic of the post-political character of how the UK's automated mobility transition is being governed. The paper ends with a call for a reconfiguration of the automated vehicle transition in the UK and beyond, so that much more space is created for dissent and for reflexive and comprehensive big-picture thinking on (automated) mobility futures.

  19. Automated subsystems control development. [for life support systems of space station

    Science.gov (United States)

    Block, R. F.; Heppner, D. B.; Samonski, F. H., Jr.; Lance, N., Jr.

    1985-01-01

    NASA has the objective to launch a Space Station in the 1990s. It has been found that the success of the Space Station engineering development, the achievement of initial operational capability (IOC), and the operation of a productive Space Station will depend heavily on the implementation of an effective automation and control approach. For the development of technology needed to implement the required automation and control function, a contract entitled 'Automated Subsystems Control for Life Support Systems' (ASCLSS) was awarded to two American companies. The present paper provides a description of the ASCLSS program. Attention is given to an automation and control architecture study, a generic automation and control approach for hardware demonstration, a standard software approach, application of Air Revitalization Group (ARG) process simulators, and a generic man-machine interface.

  20. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing, and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  1. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers, as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. The paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.
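
    The flavour of model-based test generation, stripped to its essence: enumerate every transition of the behavioural model and check the implementation against it. The sketch below is a generic illustration only; it is not the UNICOS-to-BIP translation, and all states and events are invented.

```python
# Behavioural model: (state, event) -> expected next state.
MODEL = {
    ("off", "start"): "starting",
    ("starting", "ready"): "on",
    ("on", "stop"): "off",
}

def plc_under_test(state, event):
    """Stand-in for the PLC implementation under test (hypothetical)."""
    return MODEL.get((state, event), state)

def test_all_transitions():
    """Exhaustively exercise every modelled transition."""
    for (state, event), expected in MODEL.items():
        actual = plc_under_test(state, event)
        assert actual == expected, f"{state} --{event}--> {actual}, expected {expected}"

test_all_transitions()
```

    In the real tool chain, the model is extracted automatically from the UNICOS specification, and the checking is delegated to the BIP framework rather than hand-written assertions.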

  2. Reactor pressure vessel stud management automation strategies

    International Nuclear Information System (INIS)

    Biach, W.L.; Hill, R.; Hung, K.

    1992-01-01

    The adoption of hydraulic tensioner technology as the standard for bolting and unbolting the reactor pressure vessel (RPV) head 35 yr ago represented an incredible commitment to new technology, but the existing technology was so primitive as to be clearly unacceptable. Today, a variety of approaches for improvement make the decision more difficult. Automation in existing installations must meet complex physical, logistic, and financial parameters while addressing the demands of reduced exposure, reduced critical path, and extended plant life. There are two generic approaches to providing automated RPV stud engagement and disengagement: the multiple stud tensioner and automated individual tools. A variation of the latter would include the handling system. Each has its benefits and liabilities

  3. Synthesis of tracers using automated radiochemistry and robotics

    International Nuclear Information System (INIS)

    Dannals, R.F.

    1992-07-01

    Synthesis of high specific activity radiotracers labeled with short-lived positron-emitting radionuclides for positron emission tomography (PET) often requires handling large initial quantities of radioactivity. High specific activities are required when preparing tracers for use in PET studies of neuroreceptors. A fully automated approach for tracer synthesis is highly desirable. This proposal involves the development of a system for the Synthesis of Tracers using Automated Radiochemistry and Robotics (STARR) for this purpose. While the long range objective of the proposed research is the development of a totally automated radiochemistry system for the production of major high specific activity 11 C-radiotracers for use in PET, the specific short range objectives are the automation of 11 C-methyl iodide ( 11 CH 3 I) production via an integrated approach using both radiochemistry modular labstations and robotics, and the extension of this automated capability to the production of several radiotracers for PET (initially, 11 C-methionine, 3-N-[ 11 C-methyl]spiperone, and [ 11 C]-carfentanil)

  4. Guessing right for the next war: streamlining, pooling, and right-timing force design decisions for an environment of uncertainty

    Science.gov (United States)

    2017-05-25

    …key ingredients for not only how the Army fought World War II, but also how it continues to organize today. In essence, streamlining pares down every…Germans. The Battle of Mortain reflected the US Army in World War II at its best. It defined US Army success in the European theater of operations…continues to organize today. In essence, streamlining pared down every unit to its essentials based around a critical capability it provided to…

  5. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 107: Low Impact Soil Sites, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    2008-01-01

    This Streamlined Approach for Environmental Restoration Plan covers activities associated with Corrective Action Unit (CAU) 107 of the Federal Facility Agreement and Consent Order (FFACO, 1996 (as amended February 2008)). CAU 107 consists of the following Corrective Action Sites (CASs) located in Areas 1, 2, 3, 4, 5, 9, 10, and 18 of the Nevada Test Site: (1) CAS 01-23-02, Atmospheric Test Site - High Alt; (2) CAS 02-23-02, Contaminated Areas (2); (3) CAS 02-23-03, Contaminated Berm; (4) CAS 02-23-10, Gourd-Amber Contamination Area; (5) CAS 02-23-11, Sappho Contamination Area; (6) CAS 02-23-12, Scuttle Contamination Area; (7) CAS 03-23-24, Seaweed B Contamination Area; (8) CAS 03-23-27, Adze Contamination Area; (9) CAS 03-23-28, Manzanas Contamination Area; (10) CAS 03-23-29, Truchas-Chamisal Contamination Area; (11) CAS 04-23-02, Atmospheric Test Site T4-a; (12) CAS 05-23-06, Atmospheric Test Site; (13) CAS 09-23-06, Mound of Contaminated Soil; (14) CAS 10-23-04, Atmospheric Test Site M-10; and (15) CAS 18-23-02, U-18d Crater (Sulky). Based on historical documentation, personnel interviews, site process knowledge, site visits, photographs, engineering drawings, field screening, analytical results, and the results of the data quality objectives process (Section 3.0), closure in place with administrative controls or no further action will be implemented for CAU 107. CAU 107 closure activities will consist of verifying that the current postings required under Title 10 Code of Federal Regulations (CFR) Part 835 are in place and implementing use restrictions (URs) at two sites, CAS 03-23-29 and CAS 18-23-02. The current radiological postings combined with the URs are adequate administrative controls to limit site access and worker dose.

  6. The Zig-zag Instability of Streamlined Bodies

    Science.gov (United States)

    Guillet, Thibault; Coux, Martin; Quere, David; Clanet, Christophe

    2017-11-01

    When a floating bluff body, like a sphere, impacts water with a vertical velocity, its trajectory is straight and the depth of its dive increases with its initial velocity. Even though we observe the same phenomenon at low impact speed for axisymmetric streamlined bodies, the trajectory is found to deviate from the vertical when the velocity exceeds a critical value. This instability results from a competition between the destabilizing torque of the lift and the stabilizing torque of the buoyancy (Archimedes) force. Balancing these torques yields a prediction of the critical velocity above which the instability appears. This theoretical value is found to depend on the position of the center of gravity of the projectile and predicts, in full agreement, the behaviour observed in our different experiments. Project funded by DGA.
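
    One way to read the torque balance quantitatively is through a scaling argument (our sketch; the paper's exact expressions are not reproduced here):

```latex
\tau_{\mathrm{lift}} \sim \rho\, U^{2} S\, \ell_{L} ,
\qquad
\tau_{\mathrm{buoy}} \sim \rho\, g V \ell_{B} ,
\qquad\Longrightarrow\qquad
U_{c} \sim \sqrt{\frac{g\, V \ell_{B}}{S\, \ell_{L}}} ,
```

    where S is a reference area, V the immersed volume, and ℓ_L, ℓ_B the lever arms of lift and buoyancy about the center of gravity. Shifting the center of gravity changes the lever arms, which is consistent with the reported dependence of the critical velocity on its position.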

  7. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize into more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on this. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second-best alternative is to use various computerized systems capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company should run, or (2) a selective linking of systems on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting the systems. 2 figs

  8. Streamlining digital signal processing a tricks of the trade guidebook

    CERN Document Server

    2012-01-01

    Streamlining Digital Signal Processing, Second Edition, presents recent advances in DSP that simplify or increase the computational speed of common signal processing operations and provides practical, real-world tips and tricks not covered in conventional DSP textbooks. It offers new implementations of digital filter design, spectrum analysis, signal generation, high-speed function approximation, and various other DSP functions. It provides great tips, tricks of the trade, secrets, practical shortcuts, and clever engineering solutions from seasoned signal processing professionals, along with an assortment…

  9. The Automated Aircraft Rework System (AARS): A system integration approach

    Science.gov (United States)

    Benoit, Michael J.

    1994-01-01

    The Mercer Engineering Research Center (MERC), under contract to the United States Air Force (USAF) since 1989, has been actively involved in providing the Warner Robins Air Logistics Center (WR-ALC) with a robotic workcell designed to perform automated defastening and hole location/transfer rework operations on F-15 wings. This paper describes the activities required to develop and implement this workcell, known as the Automated Aircraft Rework System (AARS). AARS is scheduled to be completely installed and in operation at WR-ALC by September 1994.

  10. Physics of automated driving in framework of three-phase traffic theory.

    Science.gov (United States)

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
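
    For contrast with the three-phase approach, a minimal sketch of the classical fixed-headway ACC law referred to above (gains and desired headway are illustrative placeholders):

```python
def classical_acc_accel(gap, v, v_lead, tau_d=1.5, k1=0.3, k2=0.6):
    """Classical adaptive cruise control: drive the space gap toward the
    fixed desired value v * tau_d and damp the speed difference.
    gap    -- space gap to the preceding vehicle [m]
    v      -- speed of the automated vehicle [m/s]
    v_lead -- speed of the preceding vehicle [m/s]"""
    return k1 * (gap - v * tau_d) + k2 * (v_lead - v)

# In the three-phase alternative there is no fixed tau_d: over a whole
# range of gaps the controller does not force the vehicle toward one
# "optimal" headway, which is what removes string instability.
```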

  11. Physics of automated driving in framework of three-phase traffic theory

    Science.gov (United States)

    Kerner, Boris S.

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  12. Automated approach to nuclear facility safeguards effectiveness evaluation

    International Nuclear Information System (INIS)

    1977-01-01

    Concern over the security of nuclear facilities has generated a need for a reliable, time efficient, and easily applied method of evaluating the effectiveness of safeguards systems. Such an evaluation technique could be used (1) by the Nuclear Regulatory Commission to evaluate a licensee's proposal, (2) to assess the security status of a system, or (3) to design and/or upgrade nuclear facilities. The technique should be capable of starting with basic information, such as the facility layout and performance parameters for physical protection components, and analyzing that information so that a reliable overall facility evaluation is obtained. Responding to this expressed need, an automated approach to facility safeguards effectiveness evaluation has been developed. This procedure consists of a collection of functional modules for facility characterization, critical path generation, and path evaluation combined into a continuous stream of operations. The technique has been implemented on an interactive computer-timesharing system and makes use of computer graphics for the handling and presentation of information. Using this technique a thorough facility evaluation can be made by systematically varying parameters that characterize the physical protection components of a facility according to changes in perceived adversary attributes and strategy, environmental conditions, and site status
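
    A toy version of the critical-path-generation and path-evaluation modules (our sketch; the topology, module boundaries, and detection probabilities are invented for illustration): model the facility as a graph whose edges carry detection probabilities, and score adversary paths by their cumulative probability of non-detection.

```python
import math
import networkx as nx

# Hypothetical facility graph; p_detect is the chance an adversary is
# detected while traversing that segment.
G = nx.DiGraph()
G.add_edge("offsite", "fence", p_detect=0.30)
G.add_edge("fence", "building", p_detect=0.60)
G.add_edge("offsite", "gate", p_detect=0.80)
G.add_edge("gate", "building", p_detect=0.20)
G.add_edge("building", "vault", p_detect=0.90)

# Non-detection probabilities multiply along a path, so minimising
# sum(-log(1 - p)) finds the path most likely to evade detection.
for u, v, d in G.edges(data=True):
    d["w"] = -math.log(1.0 - d["p_detect"])

worst_path = nx.shortest_path(G, "offsite", "vault", weight="w")
p_evade = math.exp(-nx.shortest_path_length(G, "offsite", "vault", weight="w"))
print(worst_path, round(1.0 - p_evade, 3))  # weakest route, detection prob.
```

    Upgrading a facility then amounts to re-running the evaluation after raising the detection probabilities on the edges a proposed safeguard would cover.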

  13. Filaments in curved streamlines: rapid formation of Staphylococcus aureus biofilm streamers

    International Nuclear Information System (INIS)

    Kevin Kim, Minyoung; Drescher, Knut; Shun Pak, On; Stone, Howard A; Bassler, Bonnie L

    2014-01-01

    Biofilms are surface-associated conglomerates of bacteria that are highly resistant to antibiotics. These bacterial communities can cause chronic infections in humans by colonizing, for example, medical implants, heart valves, or lungs. Staphylococcus aureus, a notorious human pathogen, causes some of the most common biofilm-related infections. Despite the clinical importance of S. aureus biofilms, it remains mostly unknown how physical effects, in particular flow, and surface structure influence biofilm dynamics. Here we use model microfluidic systems to investigate how environmental factors, such as surface geometry, surface chemistry, and fluid flow affect biofilm development of S. aureus. We discovered that S. aureus rapidly forms flow-induced, filamentous biofilm streamers, and furthermore if surfaces are coated with human blood plasma, streamers appear within minutes and clog the channels more rapidly than if the channels are uncoated. To understand how biofilm streamer filaments reorient in flows with curved streamlines to bridge the distances between corners, we developed a mathematical model based on resistive force theory of slender filaments. Understanding physical aspects of biofilm formation of S. aureus may lead to new approaches for interrupting biofilm formation of this pathogen. (paper)
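
    The resistive-force-theory ingredient mentioned above takes, in its standard slender-body form (a textbook closure; the paper's specific coefficients may differ), the hydrodynamic force per unit length on a filament moving with local velocity u(s) relative to the fluid as

```latex
\mathbf{f}(s) = -\left[ \xi_{\parallel}\, \hat{\mathbf{t}}\hat{\mathbf{t}}
  + \xi_{\perp} \left( \mathbf{I} - \hat{\mathbf{t}}\hat{\mathbf{t}} \right) \right] \cdot \mathbf{u}(s) ,
\qquad
\xi_{\perp} \approx 2\, \xi_{\parallel} ,
```

    where t̂ is the local tangent and ξ∥, ξ⊥ are the drag coefficients per unit length. The anisotropy ξ⊥ > ξ∥ is what allows an anchored filament to reorient across curved streamlines instead of simply aligning with the local flow.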

  14. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 408: Bomblet Target Area Tonopah Test Range (TTR), Nevada, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Mark Krauss

    2010-03-01

    This Streamlined Approach for Environmental Restoration Plan addresses the actions needed to achieve closure of Corrective Action Unit (CAU) 408, Bomblet Target Area (TTR). Corrective Action Unit 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. Corrective Action Unit 408 comprises Corrective Action Site TA-55-002-TAB2, Bomblet Target Areas. Clean closure of CAU 408 will be accomplished by removal of munitions and explosives of concern within seven target areas and potential disposal pits. The target areas were used to perform submunitions related tests for the U.S. Department of Energy (DOE). The scope of CAU 408 is limited to submunitions released from DOE activities. However, it is recognized that the presence of other types of unexploded ordnance and munitions may be present within the target areas due to the activities of other government organizations. The CAU 408 closure activities consist of: • Clearing bomblet target areas within the study area. • Identifying and remediating disposal pits. • Collecting verification samples. • Performing radiological screening of soil. • Removing soil containing contaminants at concentrations above the action levels. Based on existing information, contaminants of potential concern at CAU 408 include unexploded submunitions, explosives, Resource Conservation Recovery Act metals, and depleted uranium. Contaminants are not expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results.

  15. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 408: Bomblet Target Area Tonopah Test Range (TTR), Nevada, Revision 1

    International Nuclear Information System (INIS)

    Krauss, Mark

    2010-01-01

    This Streamlined Approach for Environmental Restoration Plan addresses the actions needed to achieve closure of Corrective Action Unit (CAU) 408, Bomblet Target Area (TTR). Corrective Action Unit 408 is located at the Tonopah Test Range and is currently listed in Appendix III of the Federal Facility Agreement and Consent Order. Corrective Action Unit 408 comprises Corrective Action Site TA-55-002-TAB2, Bomblet Target Areas. Clean closure of CAU 408 will be accomplished by removal of munitions and explosives of concern within seven target areas and potential disposal pits. The target areas were used to perform submunitions related tests for the U.S. Department of Energy (DOE). The scope of CAU 408 is limited to submunitions released from DOE activities. However, it is recognized that the presence of other types of unexploded ordnance and munitions may be present within the target areas due to the activities of other government organizations. The CAU 408 closure activities consist of: (1) Clearing bomblet target areas within the study area. (2) Identifying and remediating disposal pits. (3) Collecting verification samples. (4) Performing radiological screening of soil. (5) Removing soil containing contaminants at concentrations above the action levels. Based on existing information, contaminants of potential concern at CAU 408 include unexploded submunitions, explosives, Resource Conservation Recovery Act metals, and depleted uranium. Contaminants are not expected to be present in the soil at concentrations above the action levels; however, this will be determined by radiological surveys and verification sample results.

  16. Achieving Lights-Out Operation of SMAP Using Ground Data System Automation

    Science.gov (United States)

    Sanders, Antonio

    2013-01-01

    The approach used in the SMAP ground data system to provide reliable, automated capabilities to conduct unattended operations has been presented. The impacts of automation on the ground data system architecture were discussed, including the three major automation patterns identified for SMAP and how these patterns address the operations use cases. The architecture and approaches used by SMAP will set the baseline for future JPL Earth Science missions.

  17. Automated Parallel Capillary Electrophoretic System

    Science.gov (United States)

    Li, Qingbo; Kane, Thomas E.; Liu, Changsheng; Sonnenschein, Bernard; Sharer, Michael V.; Kernan, John R.

    2000-02-22

    An automated electrophoretic system is disclosed. The system employs a capillary cartridge having a plurality of capillary tubes. The cartridge has a first array of capillary ends projecting from one side of a plate. The first array of capillary ends is spaced apart in substantially the same manner as the wells of a microtitre tray of standard size, which allows one to simultaneously perform capillary electrophoresis on samples present in each of the wells of the tray. The system includes a stacked, dual-carousel arrangement to eliminate cross-contamination resulting from reuse of the same buffer tray on consecutive executions of electrophoresis. The system also has a gel delivery module, containing either a gel syringe with a stepper motor or a high-pressure chamber with a pump, to quickly and uniformly deliver gel through the capillary tubes. The system further includes a multi-wavelength beam generator that produces a laser beam with a wide range of wavelengths. An off-line capillary reconditioner thoroughly cleans a capillary cartridge to enable simultaneous execution of electrophoresis with another capillary cartridge. The streamlined nature of the off-line capillary reconditioner offers the advantage of increased system throughput with a minimal increase in system cost.

  18. Quantum mechanical streamlines. I - Square potential barrier

    Science.gov (United States)

    Hirschfelder, J. O.; Christoph, A. C.; Palke, W. E.

    1974-01-01

    Exact numerical calculations are made for scattering of quantum mechanical particles hitting a square two-dimensional potential barrier (an exact analog of the Goos-Haenchen optical experiments). Quantum mechanical streamlines are plotted and found to be smooth and continuous, to have continuous first derivatives even through the classically forbidden region, and to form quantized vortices around each of the nodal points. A comparison is made between the present numerical calculations and the stationary wave approximation, and good agreement is found for both the Goos-Haenchen shifts and the reflection coefficients. The time-independent Schroedinger equation for real wavefunctions is reduced to solving a nonlinear first-order partial differential equation, leading to a generalization of the Prager-Hirschfelder perturbation scheme. Implications of the hydrodynamical formulation of quantum mechanics are discussed, and cases are cited where quantum and classical mechanical motions are identical.
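
    For orientation, the streamlines referred to here are those of the standard hydrodynamical (Madelung) formulation; the following summarizes the generally accepted definitions rather than equations quoted from the paper:

    ```latex
    % Polar form \psi = R\,e^{iS/\hbar}; probability current and flow velocity:
    \mathbf{j} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\psi^{*}\nabla\psi\right)
               = \frac{R^{2}\,\nabla S}{m},
    \qquad
    \mathbf{v} = \frac{\mathbf{j}}{R^{2}} = \frac{\nabla S}{m}.
    % Streamlines are the integral curves of \mathbf{v}; around a nodal point
    % (\psi = 0) the circulation is quantized, which is the origin of the
    % quantized vortices reported above:
    \oint_{C} \mathbf{v}\cdot d\boldsymbol{\ell} = \frac{nh}{m}, \qquad n \in \mathbb{Z}.
    ```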

  19. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1998-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of non-linear coordinate c...
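
    The truncated abstract describes a standard setup that can be sketched as follows; the notation is assumed for illustration, not quoted from the paper:

    ```latex
    % Stream function formulation of two-dimensional incompressible flow:
    u = \frac{\partial \psi}{\partial y}, \qquad v = -\frac{\partial \psi}{\partial x}.
    % Local Taylor expansion about the point of interest (the origin); the
    % coefficients a_{ij} act as the bifurcation parameters:
    \psi(x,y) = \sum_{i+j \le N} a_{ij}\, x^{i} y^{j}.
    % Critical points satisfy u = v = 0; a critical point is degenerate when
    % the Jacobian of the velocity field is singular there:
    \det \begin{pmatrix} \partial_x u & \partial_y u \\ \partial_x v & \partial_y v \end{pmatrix} = 0.
    ```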

  20. Streamline topologies near simple degenerate critical points in two-dimensional flow away from boundaries

    DEFF Research Database (Denmark)

    Brøns, Morten; Hartnack, Johan Nicolai

    1999-01-01

    Streamline patterns and their bifurcations in two-dimensional incompressible flow are investigated from a topological point of view. The velocity field is expanded at a point in the fluid, and the expansion coefficients are considered as bifurcation parameters. A series of nonlinear coordinate ch...

  1. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, independently of the organizational unit in which they occur) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, connecting control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). The approach builds on the recently standardized process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  2. 77 FR 50691 - Request for Information (RFI): Guidance on Data Streamlining and Reducing Undue Reporting Burden...

    Science.gov (United States)

    2012-08-22

    .... Attention: HIV Data Streamlining. FOR FURTHER INFORMATION CONTACT: Andrew D. Forsyth Ph.D. or Vera... of HIV/AIDS programs that vary in their specifications (e.g., numerators, denominators, time frames...

  3. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    Science.gov (United States)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and interpretation have been found to be highly dependent on experts. An automated monitoring method is therefore required to reduce the cost and time consumed in the interpretation of AE signals. This paper investigates the application of two of the most common machine learning approaches, namely the artificial neural network (ANN) and the support vector machine (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study of the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small data sets.
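
    As a rough illustration of this kind of ANN-versus-SVM comparison, here is a minimal scikit-learn sketch on synthetic stand-in data; the feature set, class structure, and model settings are assumptions, not the authors' configuration:

    ```python
    # Minimal sketch (not the authors' code): comparing an ANN and an SVM on
    # acoustic-emission parameter features, with synthetic stand-in data.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    # Hypothetical AE parameters (e.g., amplitude, RMS, counts, energy) for
    # three valve conditions; real data would come from the AE acquisition system.
    X = rng.normal(size=(300, 4)) + np.repeat(np.arange(3), 100)[:, None]
    y = np.repeat(np.arange(3), 100)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    scaler = StandardScaler().fit(X_tr)

    for name, model in [("ANN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)),
                        ("SVM", SVC(kernel="rbf", C=1.0))]:
        model.fit(scaler.transform(X_tr), y_tr)
        acc = accuracy_score(y_te, model.predict(scaler.transform(X_te)))
        print(f"{name} accuracy: {acc:.3f}")
    ```

    In the study itself the features are measured AE parameters and the labels are valve conditions; only the comparison pattern carries over here.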

  4. Fast and accurate approaches for large-scale, automated mapping of food diaries on food composition tables

    DEFF Research Database (Denmark)

    Lamarine, Marc; Hager, Jörg; Saris, Wim H M

    2018-01-01

    the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English...... not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages...... and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We demonstrate that both high precision and recall can be achieved. Our solutions can be used with any FCT and do not require any programming background...
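
    The published implementations are R packages, per the abstract; purely to illustrate the fuzzy name-matching step, here is a minimal Python sketch with hypothetical diary and food composition table (FCT) names:

    ```python
    # Sketch of the fuzzy name-matching step only (the published packages
    # are in R); the diary item and FCT entries below are hypothetical.
    from difflib import SequenceMatcher

    fct_entries = ["whole milk", "skimmed milk", "wheat bread", "rye bread"]

    def best_match(diary_item, candidates):
        """Return the FCT entry most similar to a food diary item, with its score."""
        scored = [(c, SequenceMatcher(None, diary_item.lower(), c.lower()).ratio())
                  for c in candidates]
        return max(scored, key=lambda pair: pair[1])

    print(best_match("milk, whole", fct_entries))   # ('whole milk', ...)
    ```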

  5. Investing in the Future: Automation Marketplace 2009

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    In a year where the general economy presented enormous challenges, libraries continued to make investments in automation, especially in products that help improve what and how they deliver to their end users. Access to electronic content remains a key driver. In response to anticipated needs for new approaches to library automation, many companies…

  6. Streamlining the process: A strategy for making NEPA work better and cost less

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, R.P.; Hansen, J.D. [Hansen Environmental Consultants, Englewood, CO (United States); Wolff, T.A. [Sandia National Labs., Albuquerque, NM (United States)

    1998-05-01

    When the National Environmental Policy Act (NEPA) was enacted in 1969, neither Congress nor the Federal Agencies affected anticipated that implementation of the NEPA process would result in the intolerable delays, inefficiencies, duplication of effort, commitments of excessive financial and personnel resources, and bureaucratic gridlock that have become institutionalized. The 1975 Council on Environmental Quality (CEQ) regulations, which were intended to make the NEPA process more efficient and more useful to decision makers and the public, have either been largely ignored or unintentionally subverted. Agency policy mandates, like those of former Secretary of Energy Hazel R. O'Leary, to "make NEPA work better and cost less" have, so far, been disappointingly ineffectual. Federal Agencies have reached the point where almost every constituent of the NEPA process must be subjected to crisis management. This paper focuses on a ten-point strategy for streamlining the NEPA process in order to achieve the Act's objectives while easing the considerable burden on agencies, the public, and the judicial system. How the ten points are timed and implemented is critical to any successful streamlining.

  7. Streamlined library programming how to improve services and cut costs

    CERN Document Server

    Porter-Reynolds, Daisy

    2014-01-01

    In their roles as community centers, public libraries offer many innovative and appealing programs; but under current budget cuts, library resources are stretched thin. With slashed budgets and limited staff hours, what can libraries do to best serve their publics? This how-to guide provides strategies for streamlining library programming in public libraries while simultaneously maintaining-or even improving-quality delivery. The wide variety of principles and techniques described can be applied on a selective basis to libraries of all sizes. Based upon the author's own extensive experience as

  8. A Noble Approach of Process Automation in Galvanized Nut, Bolt Manufacturing Industry

    Directory of Open Access Journals (Sweden)

    Akash Samanta

    2012-05-01

    Full Text Available "Corrosion costs money": the Battelle Columbus Institute estimates that corrosion costs Americans more than $220 billion annually, about 4.3% of the gross national product [1]. Nowadays, owing to increasing pollution, the rate of corrosion is also rising day by day, particularly in India; to protect steel structures, galvanizing is the best and simplest solution. For this reason galvanizing industries have been growing since the mid-1700s. Galvanizing is a controlled metallurgical combination of zinc and steel that can provide corrosion resistance in a wide variety of environments. In fact, the corrosion resistance of galvanized metal can be some 70 to 80 times greater than that of the base metal material. Keeping in mind the importance of this industry, a noble approach of process automation in a galvanized nut-bolt manufacturing plant is presented here, as nuts and bolts are the prime ingredients of any structure. In this paper the main objectives of any industry, such as survival, profit maximization, profit satisficing, and sales growth, are addressed. Furthermore, environmental aspects, i.e., pollution control and energy saving, are also considered. The whole automation process is implemented using a programmable logic controller (PLC), which has a number of advantages: it is faster, more reliable, requires less maintenance, and is reprogrammable. The whole system has been designed and tested using a GE Fanuc PLC.

  9. West Virginia Peer Exchange : Streamlining Highway Safety Improvement Program Project Delivery - An RSPCB Peer Exchange

    Science.gov (United States)

    2014-09-01

    The West Virginia Division of Highways (WV DOH) hosted a Peer Exchange to share information and experiences for streamlining Highway Safety Improvement Program (HSIP) project delivery. The event was held September 23 to 24, 2014 in Charleston, West V...

  10. A Federated Enterprise Architecture and MBSE Modeling Framework for Integrating Design Automation into a Global PLM Approach

    OpenAIRE

    Vosgien , Thomas; Rigger , Eugen; Schwarz , Martin; Shea , Kristina

    2017-01-01

    Part 1: PLM Maturity, Implementation and Adoption; International audience; PLM and Design Automation (DA) are two interdependent and necessary approaches to increase the performance and efficiency of product development processes. Often, DA systems’ usability suffers due to a lack of integration in industrial business environments stemming from the independent consideration of PLM and DA. This article proposes a methodological and modeling framework for developing and deploying DA solutions w...

  11. Automation of P-3 Simulations to Improve Operator Workload

    Science.gov (United States)

    2012-09-01

    ...this thesis and because they each have a unique approach to solving the problem of entity behavior automation. A. DISCOVERY MACHINE: The United States...from the operators and can be automated in JSAF using the mental simulation approach. Two trips were conducted to visit the Naval Warfare...

  12. A streamlined DNA tool for global identification of heavily exploited coastal shark species (genus Rhizoprionodon).

    Directory of Open Access Journals (Sweden)

    Danillo Pinhal

    Full Text Available Obtaining accurate species-specific landings data is an essential step toward achieving sustainable shark fisheries. Globally distributed sharpnose sharks (genus Rhizoprionodon) exhibit life-history characteristics (rapid growth, early maturity, annual reproduction) that suggest they could be fished in a sustainable manner, assuming an investment in monitoring, assessment and careful management. However, obtaining species-specific landings data for sharpnose sharks is problematic because they are morphologically very similar to one another. Moreover, sharpnose sharks may also be confused with other small sharks (either small species or juveniles of large species) once they are processed (i.e., the head and fins are removed). Here we present a highly streamlined molecular genetics approach based on seven species-specific PCR primers in a multiplex format that can simultaneously discriminate body parts from the seven described sharpnose shark species commonly occurring in coastal fisheries worldwide. The species-specific primers are based on nucleotide sequence differences among species in the nuclear ribosomal internal transcribed spacer 2 locus (ITS2). This approach also distinguishes sharpnose sharks from a wide range of other sharks (52 species) and can therefore assist in the regulation of coastal shark fisheries around the world.

  13. Use of Vortex Generators to Reduce Distortion for Mach 1.6 Streamline-Traced Supersonic Inlets

    Science.gov (United States)

    Baydar, Ezgihan; Lu, Frank; Slater, John W.; Trefny, Chuck

    2016-01-01

    The objective is to reduce the total pressure distortion at the engine-fan face due to low-momentum flow caused by the interaction of an external terminal shock with the turbulent boundary layer along a streamline-traced external-compression (STEX) inlet for Mach 1.6.

  14. Automated, high accuracy classification of Parkinsonian disorders: a pattern recognition approach.

    Directory of Open Access Journals (Sweden)

    Andre F Marquand

    Full Text Available Progressive supranuclear palsy (PSP), multiple system atrophy (MSA) and idiopathic Parkinson's disease (IPD) can be clinically indistinguishable, especially in the early stages, despite distinct patterns of molecular pathology. Structural neuroimaging holds promise for providing objective biomarkers for discriminating these diseases at the single subject level but all studies to date have reported incomplete separation of disease groups. In this study, we employed multi-class pattern recognition to assess the value of anatomical patterns derived from a widely available structural neuroimaging sequence for automated classification of these disorders. To achieve this, 17 patients with PSP, 14 with IPD and 19 with MSA were scanned using structural MRI along with 19 healthy controls (HCs). An advanced probabilistic pattern recognition approach was employed to evaluate the diagnostic value of several pre-defined anatomical patterns for discriminating the disorders, including: (i) a subcortical motor network; (ii) each of its component regions and (iii) the whole brain. All disease groups could be discriminated simultaneously with high accuracy using the subcortical motor network. The region providing the most accurate predictions overall was the midbrain/brainstem, which discriminated all disease groups from one another and from HCs. The subcortical network also produced more accurate predictions than the whole brain and all of its constituent regions. PSP was accurately predicted from the midbrain/brainstem, cerebellum and all basal ganglia compartments; MSA from the midbrain/brainstem and cerebellum and IPD from the midbrain/brainstem only. This study demonstrates that automated analysis of structural MRI can accurately predict diagnosis in individual patients with Parkinsonian disorders, and identifies distinct patterns of regional atrophy particularly useful for this process.

  15. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 412: Clean Slate I Plutonium Dispersion (TTR), Tonopah Test Range, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick K.

    2015-04-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 412. CAU 412 is located on the Tonopah Test Range and consists of a single corrective action site (CAS), TA-23-01CS, Pu Contaminated Soil. There is sufficient information and historical documentation from previous investigations and the 1997 interim corrective action to recommend closure of CAU 412 using the SAFER process. Based on existing data, the presumed corrective action for CAU 412 is clean closure. However, additional data will be obtained during a field investigation to document and verify the adequacy of existing information and determine whether the CAU 412 closure objectives have been achieved. This SAFER Plan provides the methodology to gather the necessary information for closing the CAU. The following summarizes the SAFER activities that will support the closure of CAU 412: • Collect environmental samples from designated target populations to confirm or disprove the presence of contaminants of concern (COCs) as necessary to supplement existing information. • If no COCs are present, establish clean closure as the corrective action. • If COCs are present, the extent of contamination will be defined and further corrective actions will be evaluated with the stakeholders (NDEP, USAF). • Confirm the preferred closure option is sufficient to protect human health and the environment.

  16. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 425: Area 9 Main Lake Construction Debris Disposal Area, Tonopah Test Range, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    K. B. Campbell

    2002-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 425, Area 9 Main Lake Construction Debris Disposal Area. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO, 1996). This site will be cleaned up under the SAFER process since the volume of waste exceeds the 23 cubic meter (m³) (30 cubic yard [yd³]) limit established for housekeeping sites. CAU 425 is located on the Tonopah Test Range (TTR) and consists of one Corrective Action Site (CAS), 09-08-001-TA09, Construction Debris Disposal Area (Figure 1). CAS 09-08-001-TA09 is an area that was used to collect debris from various projects in and around Area 9. The site is located approximately 81 meters (m) (265 feet [ft]) north of Edwards Freeway, northeast of Main Lake on the TTR. The site is composed of concrete slabs with metal infrastructure, metal rebar, wooden telephone poles, and concrete rubble from the Hard Target and early Tornado Rocket sled tests. Other items such as wood scraps, plastic pipes, soil, and miscellaneous nonhazardous items have also been identified in the debris pile. It is estimated that this site contains approximately 2,280 m³ (3,000 yd³) of construction-related debris.

  17. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  18. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 326: Areas 6 and 27 Release Sites, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    A. T. Urbon

    2001-09-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 326, Areas 6 and 27 Release Sites. This CAU is currently listed in the January 2001, Appendix III of the Federal Facilities Agreement and Consent Order (FFACO) (FFACO, 1996). CAU 326 is located on the Nevada Test Site (NTS) and consists of the following four Corrective Action Sites (CASs) (Figure 1): CAS 06-25-01 - A rupture in an underground pipe that carried heating oil (diesel) from the underground heating oil tank (Tank 6-CP-1) located to the west of Building CP-70 to the boiler in Building CP-1 in the Area 6 Control Point (CP) compound. CAS 06-25-02 - A heating oil spill resulting from overfilling an underground heating oil tank (Tank 6-DAF-5) located at the Area 6 Device Assembly Facility (DAF). CAS 06-25-04 - A release of waste oil that occurred while removing used oil from Tank 6-619-4. Tank 6-619-4 is located northwest of Building 6-619 at the Area 6 Gas Station. CAS 27-25-01 - An excavation that was created in an attempt to remove impacted, stained soil from the Site Maintenance Yard in Area 27. Approximately 53.5 cubic meters (m³) (70 cubic yards [yd³]) of soil impacted by total petroleum hydrocarbons (TPH) and polychlorinated biphenyls (PCBs) was excavated before the excavation activities were halted. The excavation activities were stopped because the volume of impacted soil exceeded estimated quantities and budget.

  19. MLPAinter for MLPA interpretation: an integrated approach for the analysis, visualisation and data management of Multiplex Ligation-dependent Probe Amplification

    Directory of Open Access Journals (Sweden)

    Morreau Hans

    2010-01-01

    Full Text Available Abstract Background: Multiplex Ligation-Dependent Probe Amplification (MLPA) is an application that can be used for the detection of multiple chromosomal aberrations in a single experiment. In one reaction, up to 50 different genomic sequences can be analysed. For a reliable work-flow, tools are needed for administrative support, data management, normalisation, visualisation, reporting and interpretation. Results: Here, we developed a data management system, MLPAInter for MLPA interpretation, that is Windows executable and has a stand-alone database for monitoring and interpreting the MLPA data stream that is generated from the experimental setup to analysis, quality control and visualisation. A statistical approach is applied for the normalisation and analysis of large series of MLPA traces, making use of multiple control samples and internal controls. Conclusions: MLPAInter visualises MLPA data in plots with information about sample replicates, normalisation settings, and sample characteristics. This integrated approach helps in the automated handling of large series of MLPA data and guarantees a quick and streamlined dataflow from the beginning of an experiment to an authorised report.
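
    As a rough sketch of the normalisation idea described (a minimal rendition of common MLPA practice, not MLPAInter's actual algorithm; the probe values are invented):

    ```python
    # Minimal sketch of the usual MLPA normalisation idea: intra-sample
    # normalisation against reference probes, then comparison with control
    # samples to yield a dosage quotient per probe.
    import numpy as np

    def dosage_quotients(sample, controls, ref_idx):
        """sample: 1-D probe peak areas; controls: 2-D (n_controls x n_probes)."""
        s = sample / sample[ref_idx].sum()                        # intra-sample step
        c = controls / controls[:, ref_idx].sum(axis=1, keepdims=True)
        return s / c.mean(axis=0)     # ~1.0 normal, ~0.5 deletion, ~1.5 duplication

    sample = np.array([120.0, 60.0, 115.0, 118.0])                # probe 1 halved
    controls = np.array([[118.0, 117.0, 116.0, 119.0],
                         [121.0, 120.0, 122.0, 118.0]])
    print(dosage_quotients(sample, controls, ref_idx=[0, 2, 3]))  # probe 1 ≈ 0.5
    ```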

  20. Computer automation and artificial intelligence

    International Nuclear Information System (INIS)

    Hasnain, S.B.

    1992-01-01

    Rapid advances in computing resulting from the microchip revolution have increased its application manifold, particularly for computer automation. Yet the level of automation available has limited its application to more complex and dynamic systems which require intelligent computer control. In this paper a review of artificial intelligence techniques used to augment automation is presented. The sequential processing approach usually adopted in artificial intelligence has succeeded in emulating the symbolic processing part of intelligence, but the processing power required to capture the more elusive aspects of intelligence leads towards parallel processing. An overview of parallel processing, with emphasis on the transputer, is also provided. A fuzzy knowledge-based controller for drug delivery in muscle relaxant anesthesia, implemented on a transputer, is described. 4 figs. (author)

  1. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented, as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques, and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed, as well as the implications of applying automated techniques to biomedical research problems. (author)

  2. Automated element identification for EDS spectra evaluation using quantification and integrated spectra simulation approaches

    International Nuclear Information System (INIS)

    Eggert, F

    2010-01-01

    This work describes the first truly automated solution for the qualitative evaluation of EDS spectra in X-ray microanalysis. It uses a combination of integrated standardless quantitative evaluation, computation of analytical errors into a final uncertainty, and parts of recently developed simulation approaches. Multiple spectrum reconstruction assessments and peak searches of the residual spectrum are powerful enough to solve the qualitative analytical question automatically for totally unknown specimens. The integrated quantitative assessment improves the confidence of the qualitative analysis. The qualitative element analysis thereby becomes part of an integrated quantitative spectrum evaluation, in which the quantitative results are used to iteratively refine element decisions, spectrum deconvolution, and simulation steps.
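
    The iterate-until-explained loop described above might look conceptually like the following; the line table, Gaussian peak model, and thresholds are drastic simplifications assumed for illustration:

    ```python
    # Conceptual sketch only; real EDS software uses full line databases and
    # far richer peak/background models.
    import numpy as np
    from scipy.signal import find_peaks

    LINES = {"Fe Ka": 6.40, "Cu Ka": 8.05, "Si Ka": 1.74}  # keV, tiny stand-in table

    def identify_elements(energy, counts, width=0.08, max_iter=10):
        """Iteratively attribute the strongest residual peak to a known line."""
        residual = counts.astype(float).copy()
        found = []
        for _ in range(max_iter):
            peaks, _ = find_peaks(residual, prominence=0.05 * counts.max())
            if peaks.size == 0:
                break
            e = energy[peaks[np.argmax(residual[peaks])]]        # strongest residual peak
            label = min(LINES, key=lambda k: abs(LINES[k] - e))  # nearest tabulated line
            if abs(LINES[label] - e) > 3 * width:
                break                                            # nothing plausible left
            found.append(label)
            # crude "reconstruction" step: subtract a Gaussian at the matched line
            residual -= residual.max() * np.exp(-0.5 * ((energy - LINES[label]) / width) ** 2)
            residual = np.clip(residual, 0.0, None)
        return found

    energy = np.linspace(0.5, 10.0, 2000)
    counts = (900 * np.exp(-0.5 * ((energy - 6.40) / 0.08) ** 2)
              + 400 * np.exp(-0.5 * ((energy - 1.74) / 0.08) ** 2) + 20)
    print(identify_elements(energy, counts))                     # ['Fe Ka', 'Si Ka']
    ```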

  3. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. The implications of e-automated processes can extend

  4. Topology of streamlines and vorticity contours for two - dimensional flows

    DEFF Research Database (Denmark)

    Andersen, Morten

    on the vortex filament by the localised induction approximation the stream function is slightly modified and an extra parameter is introduced. In this setting two new flow topologies arise, but not more than two critical points occur for any combination of the parameters. The analysis of the closed form show...... by a point vortex above a wall in inviscid fluid. There is no reason to a priori expect equivalent results of the three vortex definitions. However, the study is mainly motivated by the findings of Kudela & Malecha (Fluid Dyn. Res. 41, 2009) who find good agreement between the vorticity and streamlines...

  5. Less is More : Better Compliance and Increased Revenues by Streamlining Business Registration in Uganda

    OpenAIRE

    Sander, Cerstin

    2003-01-01

    A pilot of a streamlined business registration system in Entebbe, Uganda, reduced compliance costs for enterprises by 75 percent, raised registration numbers and fee revenue by 40 percent and reduced the cost of administering the system. It also reduced opportunities for corruption, improved relations between businesses and the local authorities and resulted in better compliance.

  6. Using process-oriented interfaces for solving the automation paradox in highly automated navy vessels

    NARCIS (Netherlands)

    Diggelen, J. van; Post, W.; Rakhorst, M.; Plasmeijer, R.; Staal, W. van

    2014-01-01

    This paper describes a coherent engineering method for developing high level human machine interaction within a highly automated environment consisting of sensors, actuators, automatic situation assessors and planning devices. Our approach combines ideas from cognitive work analysis, cognitive

  7. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    -erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 mu L scale. The derived kinetic parameters were then verified in a second round of experiments where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original......Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate however, are frequently complex and establishing accurate values of kinetic parameters normally requires a large number of experiments....... These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents ail automated microscale approach to the rapid and cost effective generation of reliable kinetic models useful for bioconversion process...
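
    Although the abstract is truncated, the core task of estimating kinetic parameters from microwell data can be illustrated. The sketch below assumes a simple Michaelis-Menten rate law and invented measurements; the paper's bioconversion models are more involved:

    ```python
    # Illustrative kinetic-parameter estimation (assumed model and data).
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        return vmax * s / (km + s)

    s = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])   # substrate conc. (mM)
    v = np.array([0.9, 1.6, 2.6, 3.9, 4.6, 5.0, 5.3])      # observed rates (mM/min)

    (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=(5.0, 2.0))
    print(f"Vmax = {vmax:.2f} mM/min, Km = {km:.2f} mM")
    ```

    Model-driven experimental design, as described in the abstract, would then choose the next substrate concentrations so as to shrink the uncertainty on these fitted parameters.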

  8. An automated approach to the design of decision tree classifiers

    Science.gov (United States)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
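
    The 'by-product' algorithm mentioned above reduces to a product of per-node accuracies; a minimal sketch, with invented numbers:

    ```python
    # With statistically independent node decision rules, P(correct) along a
    # root-to-leaf path is the product of the per-node accuracies.
    from math import prod

    def path_accuracy(node_accuracies):
        """Global P(correct) along one decision path, assuming independence."""
        return prod(node_accuracies)

    print(path_accuracy([0.98, 0.95, 0.90]))        # ≈ 0.838

    # For a whole tree, weight each leaf's path accuracy by its class prior:
    paths = {"water": ([0.98, 0.95], 0.2), "forest": ([0.98, 0.90, 0.93], 0.8)}
    overall = sum(prior * path_accuracy(accs) for accs, prior in paths.values())
    print(round(overall, 4))                        # 0.8424
    ```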

  9. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area from motion-corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
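
    One named building block, gamma-variate fitting of the concentration-time curve after the bolus arrival time, can be sketched compactly. This uses generic nonlinear least squares on synthetic data (the paper itself mentions log-linear fitting), with the common parameterisation C(t) = K (t - t0)^a exp(-(t - t0)/b):

    ```python
    # Minimal gamma-variate fit on synthetic data (not the authors' code).
    import numpy as np
    from scipy.optimize import curve_fit

    def gamma_variate(t, K, t0, a, b):
        """C(t) = K * (t - t0)^a * exp(-(t - t0)/b) for t > t0, ~0 otherwise."""
        dt = np.clip(t - t0, 1e-12, None)
        return K * dt**a * np.exp(-dt / b)

    t = np.linspace(0, 60, 61)                                   # seconds
    clean = gamma_variate(t, 1.2, 8.0, 2.5, 4.0)
    noisy = clean + np.random.default_rng(1).normal(0.0, 0.02, t.size)

    params, _ = curve_fit(gamma_variate, t, noisy, p0=(1.0, 6.0, 2.0, 5.0), maxfev=10000)
    print("K, t0, alpha, beta =", np.round(params, 2))           # close to (1.2, 8.0, 2.5, 4.0)
    ```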

  10. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All

  11. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme
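
    A toy rendition of the machine-readable enzyme idea may help; the class, rule encoding, and enzyme behaviour below are hypothetical simplifications of O-linked core-1 synthesis, not the paper's actual schema:

    ```python
    # Toy sketch: an enzyme object infers single reactions from a crude
    # end-of-glycan rule, and networkx assembles the reaction network.
    import networkx as nx

    class Glycosyltransferase:
        def __init__(self, name, substrate_end, adds):
            self.name, self.substrate_end, self.adds = name, substrate_end, adds

        def react(self, glycan):
            """Return the product if the glycan's non-reducing end matches."""
            return glycan + "-" + self.adds if glycan.endswith(self.substrate_end) else None

    enzymes = [Glycosyltransferase("C1GalT1", "GalNAc", "Gal"),
               Glycosyltransferase("ST3Gal1", "Gal", "Neu5Ac")]

    network = nx.DiGraph()
    frontier = ["Ser/Thr-GalNAc"]                  # Tn antigen as the seed glycan
    while frontier:
        glycan = frontier.pop()
        for enz in enzymes:
            product = enz.react(glycan)
            if product and not network.has_edge(glycan, product):
                network.add_edge(glycan, product, enzyme=enz.name)
                frontier.append(product)

    print(list(network.edges(data=True)))
    ```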

  12. Intraoperative Cochlear Implant Device Testing Utilizing an Automated Remote System: A Prospective Pilot Study.

    Science.gov (United States)

    Lohmann, Amanda R; Carlson, Matthew L; Sladen, Douglas P

    2018-03-01

    Results (177.0 μV, SD 11.57) indicated that intraoperative cochlear implant device testing utilizing an automated remote system is feasible. This system may be useful for cochlear implant programs with limited audiology support or for programs looking to streamline intraoperative device testing protocols. Future studies with larger patient enrollment are required to validate these promising, but preliminary, findings.

  13. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    in operating situations involving a procedural type of activity was precise. The measure of operators' trust in the automation yielded no significant results in relation to the procedural type of activity scenarios. Based on the experience and outcome of this experiment, a preliminary approach for defining automation types is proposed. In addition, a set of hypotheses about how automation influences operator performance is derived from the outcome of the experiment, and a set of recommendations for future studies is offered (author) (ml)

  14. Streamlining air import operations by trade facilitation measures

    Directory of Open Access Journals (Sweden)

    Yuri da Cunha Ferreira

    2017-12-01

    Full Text Available Global operations are subject to considerable uncertainties. Given the Trade Facilitation Agreement that became effective in February 2017, the study of measures to streamline customs controls is urgent. This study aims to assess the impact of trade facilitation measures on import flows. An experimental study was performed in the largest cargo airport in South America through discrete-event simulation and design of experiments. The operational impacts of three trade facilitation measures on air import flows are assessed. We shed light on the following trade facilitation measures: the use of X-ray equipment for physical inspection; an increase in the number of companies qualified for the trade facilitation program; and performance targets for customs officials. All trade facilitation measures indicated the potential to provide more predictability, cost savings, time reduction, and increased security in the international supply chain.
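
    A bare-bones discrete-event sketch of the inspection trade-off studied here, using SimPy; the capacities, service times, and X-ray share are invented, not parameters from the airport study:

    ```python
    # Toy queueing model: faster X-ray inspection vs. slower physical inspection.
    import random
    import simpy

    CLEARED = []

    def shipment(env, inspectors, xray_share):
        with inspectors.request() as slot:
            yield slot
            if random.random() < xray_share:
                yield env.timeout(random.expovariate(1 / 2.0))   # X-ray: ~2 h mean
            else:
                yield env.timeout(random.expovariate(1 / 8.0))   # physical: ~8 h mean
        CLEARED.append(env.now)

    def mean_clearance(xray_share, n_shipments=100, seed=7):
        random.seed(seed)
        CLEARED.clear()
        env = simpy.Environment()
        inspectors = simpy.Resource(env, capacity=3)
        for _ in range(n_shipments):
            env.process(shipment(env, inspectors, xray_share))
        env.run()
        return sum(CLEARED) / len(CLEARED)

    for share in (0.0, 0.5, 0.9):
        print(f"x-ray share {share:.0%}: mean clearance {mean_clearance(share):6.1f} h")
    ```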

  15. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 326: Areas 6 and 27 Release Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    A. T. Urbon

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 326, Areas 6 and 27 Release Sites. This CAU is currently listed in the January 2001, Appendix III of the Federal Facilities Agreement and Consent Order (FFACO) (FFACO, 1996). CAU 326 is located on the Nevada Test Site (NTS) and consists of the following four Corrective Action Sites (CASs) (Figure 1): CAS 06-25-01 - A rupture in an underground pipe that carried heating oil (diesel) from the underground heating oil tank (Tank 6-CP-1) located to the west of Building CP-70 to the boiler in Building CP-1 in the Area 6 Control Point (CP) compound. CAS 06-25-02 - A heating oil spill resulting from overfilling an underground heating oil tank (Tank 6-DAF-5) located at the Area 6 Device Assembly Facility (DAF). CAS 06-25-04 - A release of waste oil that occurred while removing used oil from Tank 6-619-4. Tank 6-619-4 is located northwest of Building 6-619 at the Area 6 Gas Station. CAS 27-25-01 - An excavation that was created in an attempt to remove impacted, stained soil from the Site Maintenance Yard in Area 27. Approximately 53.5 cubic meters (m³) (70 cubic yards [yd³]) of soil impacted by total petroleum hydrocarbons (TPH) and polychlorinated biphenyls (PCBs) was excavated before the excavation activities were halted. The excavation activities were stopped because the volume of impacted soil exceeded estimated quantities and budget.

  16. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  17. Calculation of heat transfer in transversely stream-lined tube bundles with chess arrangement

    International Nuclear Information System (INIS)

    Migaj, V.K.

    1978-01-01

    A semiempirical theory of heat transfer in staggered (chess-board) tube bundles in transverse flow has been developed. The theory is based on a single-cylinder model and evaluates the external flow parameters on the basis of the solidification principle for the vortex zone. The effect of turbulence is estimated from experimental results. The method is extended to both average and local heat transfer coefficients. Comparison with experiment shows satisfactory agreement.

  18. Automated design of analog and high-frequency circuits a computational intelligence approach

    CERN Document Server

    Liu, Bo; Fernández, Francisco V

    2014-01-01

    Computational intelligence techniques are becoming more and more important for automated problem solving nowadays. Due to the growing complexity of industrial applications and the increasingly tight time-to-market requirements, the time available for thorough problem analysis and development of tailored solution methods is decreasing. There is no doubt that this trend will continue in the foreseeable future. Hence, it is not surprising that robust and general automated problem solving methods with satisfactory performance are needed.

  19. Role of automation in the ACRV operations

    Science.gov (United States)

    Sepahban, S. F.

    1992-01-01

    The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with contingency means of return to earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of the Space Shuttle flights. A wide range of vehicle configurations and system approaches are currently under study. The Program requirements focus on minimizing life cycle costs by ensuring simple operations, built-in reliability and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources and processes, while minimizing the interfaces and impacts to the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy and covering the ground, flight, mission support, and landing and recovery operations has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation, to the extent possible. Candidate functions and processes which may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and the Orbiter remote manipulator systems, for ACRV berthing; automated passive monitoring and performance trend analysis, and periodic active checkouts during dormant periods. The major ACRV operations concept issues as they relate to the use of automation are discussed.

  20. Silhouette-based approach of 3D image reconstruction for automated image acquisition using robotic arm

    Science.gov (United States)

    Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.

    2017-06-01

    This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. The silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, an analysis of the effect of the number of sequential images on the accuracy of the 3D model reconstruction was carried out with a fixed projection angle of the camera. The factors affecting the 3D reconstruction are discussed, and the overall result of the analysis is summarized for the prototype imaging platform.

  1. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  2. Ontology-Based Device Descriptions and Device Repository for Building Automation Devices

    Directory of Open Access Journals (Sweden)

    Dibowski Henrik

    2011-01-01

    Full Text Available Device descriptions play an important role in the design and commissioning of modern building automation systems and help reduce design time and costs. However, all established device descriptions are specialized for certain purposes and suffer from several weaknesses. This hinders further design automation, which is strongly needed for increasingly complex building automation systems. To overcome these problems, this paper presents novel Ontology-based Device Descriptions (ODDs) along with a layered ontology architecture, a specific ontology view approach with virtual properties, a generic access interface, a triple-store-based database backend, and a generic search mask GUI with an underlying query generation algorithm. It enables a formal, unified, and extensible specification of building automation devices, ensures their comparability, and facilitates computer-enabled retrieval, selection, and interoperability evaluation, which is essential for an automated design. The scalability of the approach to several tens of thousands of devices is demonstrated.
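
    A hypothetical miniature of the ODD retrieval idea, using rdflib; the namespace, classes, and properties are invented stand-ins, not the paper's ontology:

    ```python
    # Tiny triple store of device descriptions plus a generated SPARQL query.
    from rdflib import Graph, Literal, Namespace, RDF

    BA = Namespace("http://example.org/building-automation#")
    g = Graph()
    g.add((BA.Sensor42, RDF.type, BA.TemperatureSensor))
    g.add((BA.Sensor42, BA.protocol, Literal("BACnet")))
    g.add((BA.Sensor7,  RDF.type, BA.TemperatureSensor))
    g.add((BA.Sensor7,  BA.protocol, Literal("KNX")))

    # Device retrieval, as a SPARQL query a search-mask GUI might generate:
    results = g.query("""
        PREFIX ba: <http://example.org/building-automation#>
        SELECT ?dev WHERE {
            ?dev a ba:TemperatureSensor ;
                 ba:protocol "KNX" .
        }""")
    print([str(row.dev) for row in results])   # [...#Sensor7]
    ```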

  3. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    CERN Document Server

    Acciarri, R.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2017-01-01

    The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
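
    To make the multi-algorithm idea concrete, here is a toy Python sketch in which each pass is a small function that refines a shared event structure, and the chain of passes produces the reconstruction. The function names and the one-dimensional clustering rule are illustrative only and bear no relation to Pandora's actual API.

      # Toy multi-algorithm chain: each algorithm performs one focused task
      # and hands the refined event to the next. Not Pandora's real API.
      from dataclasses import dataclass, field

      @dataclass
      class Event:
          hits: list                              # 1-D hit positions, sorted
          clusters: list = field(default_factory=list)
          particles: list = field(default_factory=list)

      def cluster_hits(event, gap=2.0):
          """Group hits separated by less than `gap` into clusters."""
          current, event.clusters = [event.hits[0]], []
          for h in event.hits[1:]:
              if h - current[-1] <= gap:
                  current.append(h)
              else:
                  event.clusters.append(current)
                  current = [h]
          event.clusters.append(current)
          return event

      def drop_stray_clusters(event):
          """A second pass that removes single-hit clusters."""
          event.clusters = [c for c in event.clusters if len(c) > 1]
          return event

      def build_particles(event):
          """Promote the surviving clusters to reconstructed particles."""
          event.particles = [{"n_hits": len(c)} for c in event.clusters]
          return event

      chain = [cluster_hits, drop_stray_clusters, build_particles]
      event = Event(hits=[0.0, 0.5, 1.2, 5.0, 5.3, 9.8])
      for algorithm in chain:
          event = algorithm(event)
      print(event.particles)   # [{'n_hits': 3}, {'n_hits': 2}]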

  4. Impact of automation on mass spectrometry.

    Science.gov (United States)

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity, and multiplex capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor-intensive, with limited throughput. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation and proposes a step-wise, incremental approach to achieve total automation by reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Safeguards Automated Facility Evaluation (SAFE) methodology

    International Nuclear Information System (INIS)

    Chapman, L.D.; Grady, L.M.; Bennett, H.A.; Sasser, D.W.; Engi, D.

    1978-08-01

    An automated approach to facility safeguards effectiveness evaluation has been developed. This automated process, called Safeguards Automated Facility Evaluation (SAFE), consists of a collection of operational modules for facility characterization, the selection of critical paths, and the evaluation of safeguards effectiveness along these paths. The technique has been implemented on an interactive computer time-sharing system and makes use of computer graphics for the processing and presentation of information. Using this technique, a comprehensive evaluation of a safeguards system can be provided by systematically varying the parameters that characterize the physical protection components of a facility to reflect the perceived adversary attributes and strategy, environmental conditions, and site operational conditions. The SAFE procedure has broad applications in the nuclear facility safeguards field as well as in the security field in general. Any fixed facility containing valuable materials or components to be protected from theft or sabotage could be analyzed using this same automated evaluation technique.
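
    The critical-path step lends itself to a graph formulation. Under the assumption that each path segment carries an independent detection probability, the sketch below scores adversary paths with networkx by minimising the summed negative log of non-detection; the facility nodes and probabilities are made up for the demo and are not taken from SAFE.

      # Toy critical-path selection: model the facility as a graph whose
      # edges carry detection probabilities, then find the adversary path
      # with the lowest overall probability of detection.
      import math
      import networkx as nx

      G = nx.DiGraph()
      # (from, to, probability that the adversary is detected on this leg)
      segments = [
          ("offsite", "fence", 0.10),
          ("fence", "yard", 0.30),
          ("yard", "building", 0.60),
          ("fence", "gate", 0.80),
          ("gate", "building", 0.20),
          ("building", "vault", 0.90),
      ]
      for u, v, p_detect in segments:
          # Minimising the sum of -log(1 - p) maximises the chance of
          # traversing the whole path undetected.
          G.add_edge(u, v, weight=-math.log(1.0 - p_detect))

      path = nx.shortest_path(G, "offsite", "vault", weight="weight")
      cost = nx.shortest_path_length(G, "offsite", "vault", weight="weight")
      print(path, f"P(undetected) = {math.exp(-cost):.3f}")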

  6. Streamlined islands and the English Channel megaflood hypothesis

    Science.gov (United States)

    Collier, J. S.; Oggioni, F.; Gupta, S.; García-Moreno, D.; Trentesaux, A.; De Batist, M.

    2015-12-01

    Recognising ice-age catastrophic megafloods is important because they had significant impact on large-scale drainage evolution and patterns of water and sediment movement to the oceans, and likely induced very rapid, short-term effects on climate. It has been previously proposed that a drainage system on the floor of the English Channel was initiated by catastrophic flooding in the Pleistocene but this suggestion has remained controversial. Here we examine this hypothesis through an analysis of key landform features. We use a new compilation of multi- and single-beam bathymetry together with sub-bottom profiler data to establish the internal structure, planform geometry and hence origin of a set of 36 mid-channel islands. Whilst there is evidence of modern-day surficial sediment processes, the majority of the islands can be clearly demonstrated to be formed of bedrock, and are hence erosional remnants rather than depositional features. The islands display classic lemniscate or tear-drop outlines, with elongated tips pointing downstream, typical of streamlined islands formed during high-magnitude water flow. The length-to-width ratio for the entire island population is 3.4 ± 1.3 and the degree-of-elongation or k-value is 3.7 ± 1.4. These values are comparable to streamlined islands in other proven Pleistocene catastrophic flood terrains and are distinctly different to values found in modern-day rivers. The island geometries show a correlation with bedrock type: with those carved from Upper Cretaceous chalk having larger length-to-width ratios (3.2 ± 1.3) than those carved into more mixed Paleogene terrigenous sandstones, siltstones and mudstones (3.0 ± 1.5). We attribute these differences to the former rock unit having a lower skin friction which allowed longer island growth to achieve minimum drag. The Paleogene islands, although less numerous than the Chalk islands, also assume more perfect lemniscate shapes. These lithologies therefore reached island

  7. An Elliptic PDE Approach for Shape Characterization

    Science.gov (United States)

    Haidar, Haissam; Bouix, Sylvain; Levitt, James; McCarley, Robert W.; Shenton, Martha E.; Soul, Janet S.

    2009-01-01

    This paper presents a novel approach to analyze the shape of anatomical structures. Our methodology is rooted in classical physics and in particular Poisson's equation, a fundamental partial differential equation [1]. The solution to this equation and more specifically its equipotential surfaces display properties that are useful for shape analysis. We present a numerical algorithm to calculate the length of streamlines formed by the gradient field of the solution to this equation for 2D and 3D objects. The length of the streamlines along the equipotential surfaces was used to build a new function which can characterize the shape of objects. We illustrate our method on 2D synthetic and natural shapes as well as 3D medical data. PMID:17271986
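
    A minimal numerical version of this idea can be put together with NumPy alone: solve the Poisson equation on a binary mask with a Jacobi iteration, then integrate one streamline of the gradient field. The disk-shaped mask, step size, and iteration count below are arbitrary demo choices, not the paper's algorithm.

      # Sketch of the Poisson shape descriptor: solve laplacian(u) = -1
      # inside a binary 2D shape with u = 0 on the boundary, then trace a
      # streamline of grad(u) and measure its length.
      import numpy as np

      # Binary mask: a 64x64 disk standing in for an anatomical structure.
      yy, xx = np.mgrid[0:64, 0:64]
      mask = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2

      u = np.zeros_like(mask, dtype=float)
      for _ in range(2000):  # Jacobi iterations; u stays 0 outside the shape
          u_new = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                          + np.roll(u, 1, 1) + np.roll(u, -1, 1) + 1.0)
          u = np.where(mask, u_new, 0.0)

      gy, gx = np.gradient(u)

      def streamline_length(y, x, step=0.25, max_steps=10000):
          """Integrate along -grad(u) from an interior point to the boundary."""
          length = 0.0
          for _ in range(max_steps):
              iy, ix = int(round(y)), int(round(x))
              if not mask[iy, ix]:
                  break          # reached the shape boundary
              norm = np.hypot(gy[iy, ix], gx[iy, ix])
              if norm < 1e-12:
                  break          # at the potential maximum
              y -= step * gy[iy, ix] / norm
              x -= step * gx[iy, ix] / norm
              length += step
          return length

      print(streamline_length(32.0, 40.0))  # length from one seed point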

  8. Monte Carlo shielding analyses using an automated biasing procedure

    International Nuclear Information System (INIS)

    Tang, J.S.; Hoffman, T.J.

    1988-01-01

    A systematic and automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete ordinates calculation are used to generate biasing parameters for a Monte Carlo calculation. The entire procedure of adjoint calculation, biasing parameter generation, and Monte Carlo calculation has been automated. The automated biasing procedure has been applied to several realistic deep-penetration shipping cask problems. The results obtained for neutron and gamma-ray transport indicate that, with the automated biasing procedure, Monte Carlo shielding calculations of spent-fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost.
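
    A common way such adjoint information enters a Monte Carlo run is as an importance map that drives particle splitting and Russian roulette. The sketch below is a toy version of that mechanism in Python; the layer importances and the 2x/0.5x thresholds are invented for illustration and are not taken from the paper.

      # Toy adjoint-based biasing: the adjoint flux serves as an importance
      # map, and particle weights are kept near 1/importance by splitting
      # and Russian roulette.
      import random

      adjoint_flux = [1.0, 3.0, 9.0, 27.0, 81.0]   # importance per shield layer
      weight_target = [1.0 / a for a in adjoint_flux]

      def adjust_weight(layer, weight, rng=random):
          """Split or roulette a particle so its weight tracks the target."""
          target = weight_target[layer]
          if weight > 2.0 * target:                 # split into n copies
              n = int(weight / target)
              return [weight / n] * n
          if weight < 0.5 * target:                 # Russian roulette
              survival = weight / target
              return [target] if rng.random() < survival else []
          return [weight]

      # A particle of weight 1.0 crossing into a deep layer is split into
      # many low-weight copies, improving deep-penetration statistics.
      print(len(adjust_weight(3, 1.0)))   # -> 27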

  9. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  10. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    The connection between an abstraction hierarchy (AH) for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  11. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    The connection between an abstraction hierarchy (AH) for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  12. Using Publish-Subscribe Messaging for System Status and Automation

    Science.gov (United States)

    Smith, Danford S.

    2015-01-01

    The NASA Goddard Mission Services Evolution Center (GMSEC) system is a message-based, plug-and-play open system architecture used in many of NASA's mission operations centers. This presentation will focus on the use of GMSEC standard messages to report and analyze the status of a system and enable the automation of the system's components. In GMSEC systems, each component reports its status using a keep-alive message and also publishes status and activities as log messages. In addition, the components can accept functional directive messages from the GMSEC message bus. Over the past several years, development teams have found ways to utilize these messages to create innovative display pages and increasingly sophisticated approaches to automation. This presentation will show the flexibility and value of the message-based approach to system awareness and automation.
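
    The keep-alive pattern is easy to sketch with a generic in-process publish-subscribe bus. The Python below is such a generic illustration, not the GMSEC API; the subject string and message fields are hypothetical.

      # Generic pub-sub keep-alive monitor: components publish heartbeat
      # messages on a subject, and a monitor flags stale components.
      import time
      from collections import defaultdict

      class Bus:
          def __init__(self):
              self.subscribers = defaultdict(list)

          def subscribe(self, subject, callback):
              self.subscribers[subject].append(callback)

          def publish(self, subject, message):
              for callback in self.subscribers[subject]:
                  callback(message)

      bus = Bus()
      last_seen = {}

      # Record the arrival time of each component's heartbeat.
      bus.subscribe("MSG.HB",
                    lambda msg: last_seen.update({msg["component"]: time.time()}))

      # A component publishes a keep-alive message periodically.
      bus.publish("MSG.HB", {"component": "telemetry-proc"})

      # The monitor (or an automation rule) reacts to stale heartbeats.
      for component, t in last_seen.items():
          if time.time() - t > 10.0:
              print(f"ALERT: {component} heartbeat is stale")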

  13. Towards automated construction of dependable software/hardware systems

    Energy Technology Data Exchange (ETDEWEB)

    Yakhnis, A.; Yakhnis, V. [Pioneer Technologies & Rockwell Science Center, Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on the automated construction of dependable software/hardware systems. The outline of this report is: examples of software/hardware systems; dependable systems; partial delivery of dependability; proposed approach; removing obstacles; advantages of the approach; criteria for success; current progress of the approach; and references.

  14. Patient and physician attitudes regarding risk and benefit in streamlined development programmes for antibacterial drugs: a qualitative analysis.

    Science.gov (United States)

    Holland, Thomas L; Mikita, Stephen; Bloom, Diane; Roberts, Jamie; McCall, Jonathan; Collyar, Deborah; Santiago, Jonas; Tiernan, Rosemary; Toerner, Joseph

    2016-11-10

    To explore patient, caregiver and physician perceptions and attitudes regarding the balance of benefit and risk in using antibacterial drugs developed through streamlined development processes. Semistructured focus groups and in-depth interviews were conducted to elicit perceptions and attitudes about the use of antibacterial drugs to treat multidrug-resistant infections. Participants were given background information about antibiotic resistance, streamlined drug development programmes and FDA drug approval processes. Audio recordings of focus groups/interviews were reviewed and quotes excerpted and categorised to identify key themes. Two primary stakeholder groups were engaged: one comprising caregivers, healthy persons and patients who had recovered from or were at risk of resistant infection (N=67; 11 focus groups); and one comprising physicians who treat resistant infections (N=23). Responses from focus groups/interviews indicated widespread awareness among patients/caregivers and physicians of the seriousness of the problem of antibacterial resistance. Both groups were willing to accept a degree of uncertainty regarding the balance of risk and benefit in a new therapy where a serious unmet need exists, but also expressed a desire for rigorous monitoring and rapid, transparent reporting of safety/effectiveness data. Both groups wanted to ensure that >1 physician had input on whether to treat patients with antibiotics developed through a streamlined process. Some patients/caregivers unfamiliar with exigencies of critical care suggested a relatively large multidisciplinary team, while physicians believed individual expert consultations would be preferable. Both groups agreed that careful oversight and stewardship of antibacterial drugs are needed to ensure patient safety, preserve efficacy and prevent abuse. Groups comprising patients/caregivers and physicians were aware of serious issues posed by resistant infections and the lack of effective antibacterial drug

  15. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    Science.gov (United States)

    O'Neill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  16. Automated EEG sleep staging in the term-age baby using a generative modelling approach

    Science.gov (United States)

    Pillay, Kirubin; Dereymaeker, Anneleen; Jansen, Katrien; Naulaers, Gunnar; Van Huffel, Sabine; De Vos, Maarten

    2018-06-01

    Objective. We develop a method for automated four-state sleep classification of preterm and term-born babies at term-age of 38-40 weeks postmenstrual age (the age since the last menstrual cycle of the mother) using multichannel electroencephalogram (EEG) recordings. At this critical age, EEG differentiates from broader quiet sleep (QS) and active sleep (AS) stages to four, more complex states, and the quality and timing of this differentiation is indicative of the level of brain development. However, existing methods for automated sleep classification remain focussed only on QS and AS sleep classification. Approach. EEG features were calculated from 16 EEG recordings, in 30 s epochs, and personalized feature scaling used to correct for some of the inter-recording variability, by standardizing each recording’s feature data using its mean and standard deviation. Hidden Markov models (HMMs) and Gaussian mixture models (GMMs) were trained, with the HMM incorporating knowledge of the sleep state transition probabilities. Performance of the GMM and HMM (with and without scaling) were compared, and Cohen’s kappa agreement calculated between the estimates and clinicians’ visual labels. Main results. For four-state classification, the HMM proved superior to the GMM. With the inclusion of personalized feature scaling, mean kappa (±standard deviation) was 0.62 (±0.16) compared to the GMM value of 0.55 (±0.15). Without feature scaling, kappas for the HMM and GMM dropped to 0.56 (±0.18) and 0.51 (±0.15), respectively. Significance. This is the first study to present a successful method for the automated staging of four states in term-age sleep using multichannel EEG. Results suggested a benefit in incorporating transition information using an HMM, and correcting for inter-recording variability through personalized feature scaling. Determining the timing and quality of these states are indicative of developmental delays in both preterm and term-born babies that may
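
    A compressed sketch of the two ingredients highlighted above, personalized feature scaling and an HMM over the epoch sequence, can be written with hmmlearn as follows. The synthetic feature matrices, their dimensions, and the model settings are assumptions for the demo, not the study's actual features.

      # Per-recording standardisation followed by a 4-state Gaussian HMM.
      import numpy as np
      from hmmlearn import hmm

      rng = np.random.default_rng(0)
      recordings = [rng.normal(size=(400, 8)) for _ in range(16)]  # epochs x features

      # Personalised feature scaling: z-score each recording by its own stats.
      scaled = [(X - X.mean(axis=0)) / X.std(axis=0) for X in recordings]

      train = np.vstack(scaled[:-1])
      lengths = [X.shape[0] for X in scaled[:-1]]

      # Four hidden states, one per sleep state; the learned transition
      # probabilities are what distinguish the HMM from a plain GMM.
      model = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
      model.fit(train, lengths)

      states = model.predict(scaled[-1])  # per-epoch states for a held-out baby
      print(states[:20])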

  17. Development of an automated guided vehicle controller using a systems engineering approach

    Directory of Open Access Journals (Sweden)

    Ferreira, Tremaine

    2016-08-01

    Full Text Available Automated guided vehicles (AGVs) are widely used for transporting materials in industry and commerce. In this research, an intelligent AGV-based material-handling system was developed using a model-based systems engineering (MBSE) approach. The core of the AGV, the controller, was designed in a systems modelling language (SysML) environment using Visual Paradigm software, and then implemented in hardware. As a result, the AGV's complex tasks of material handling, navigation, and communication were successfully accomplished and tested in a real industrial environment. The developed AGV is capable of towing trolleys weighing up to 200 kg at walking speed. The AGV can be incorporated into an intelligent material-handling system with multiple autonomous vehicles and workstations, thus providing flexibility and reconfigurability for the whole manufacturing system. Ergonomic and safety aspects were also considered in the design of the AGV. A comprehensive safety system that is compliant with industrial standards was implemented.

  18. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for the analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, await practical methods of automation. Important as the clinical diagnostic applications are, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are being increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed.

  19. USB port compatible virtual instrument based automation for x-ray diffractometer setup

    International Nuclear Information System (INIS)

    Jayapandian, J.; Sheela, O.K.; Mallika, R.; Thiruarul, A.; Purniah, B.

    2004-01-01

    Windows-based virtual instrument (VI) programs in a graphical language simplify design automation in R and D laboratories. With minimal hardware and maximum support from software, automation becomes easier and more user-friendly. A novel design approach for the automation of a SIEMENS x-ray diffractometer setup is described in this paper. The automation is achieved with an indigenously developed virtual instrument program in LabVIEW ver. 6.0 and with a simple hardware design using an 89C2051 micro-controller compatible with the PC's USB port for the total automation of the experiment. (author)

  20. Easy XMM-Newton Data Analysis with the Streamlined ABC Guide!

    Science.gov (United States)

    Valencic, Lynne A.; Snowden, Steven L.; Pence, William D.

    2016-01-01

    The US XMM-Newton GOF has streamlined the time-honored XMM-Newton ABC Guide, making it easier to find and use what users may need to analyze their data. It takes into account what type of data a user might have, if they want to reduce the data on their own machine or over the internet with Web Hera, and if they prefer to use the command window or a GUI. The GOF has also included an introduction to analyzing EPIC and RGS spectra, and PN Timing mode data. The guide is provided for free to students, educators, and researchers for educational and research purposes. Try it out at: http://heasarc.gsfc.nasa.gov/docs/xmm/sl/intro.html

  1. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    Science.gov (United States)

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps, beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiographic and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between the cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early-life challenges such as prematurity or viral infection.
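
    The final feature-extraction step might look like the following sketch, which computes a standard heart rate variability feature (RMSSD) and a respiratory rate variability measure from interval series on an artifact-free stationary segment. The synthetic intervals and the particular feature set are illustrative assumptions, not the study's exact features.

      # Variability features from R-R and breath-to-breath interval series.
      import numpy as np

      rr = np.random.default_rng(1).normal(0.45, 0.02, 600)      # R-R intervals (s)
      breaths = np.random.default_rng(2).normal(1.1, 0.15, 300)  # breath intervals (s)

      def rmssd(intervals):
          """Root mean square of successive differences, a common HRV feature."""
          return np.sqrt(np.mean(np.diff(intervals) ** 2))

      features = {
          "mean_hr_bpm": 60.0 / rr.mean(),
          "hrv_rmssd_ms": 1000.0 * rmssd(rr),
          "resp_rate_var": breaths.std() / breaths.mean(),  # coefficient of variation
      }
      print(features)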

  2. An automated approach for segmentation of intravascular ultrasound images based on parametric active contour models

    International Nuclear Information System (INIS)

    Vard, Alireza; Jamshidi, Kamal; Movahhedinia, Naser

    2012-01-01

    This paper presents a fully automated approach to detecting the intima and media-adventitia borders in intravascular ultrasound images based on parametric active contour models. To detect the intima border, we compute a new image feature applying a combination of short-term autocorrelations calculated for the contour pixels. These feature values are employed to define an energy function of the active contour called normalized cumulative short-term autocorrelation. Exploiting this energy function, the intima border is separated accurately from the blood region contaminated by high speckle noise. To extract the media-adventitia boundary, we define a new form of energy function based on edge, texture and spring forces for the active contour. Utilizing this active contour, the media-adventitia border is identified correctly even in the presence of branch openings and calcifications. Experimental results indicate the accuracy of the proposed methods. In addition, statistical analysis demonstrates high conformity between manual tracing and the results obtained by the proposed approaches.
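
    The paper's autocorrelation- and edge/texture/spring-based energies are custom, but the underlying parametric snake machinery is classic. Below is a generic scikit-image sketch on a synthetic bright disk standing in for an IVUS frame; the alpha/beta/gamma settings are conventional demo values, not the paper's parameters.

      # Generic parametric active contour (snake) with scikit-image.
      import numpy as np
      from skimage.draw import disk
      from skimage.filters import gaussian
      from skimage.segmentation import active_contour

      # Synthetic "lumen": a bright disk on a dark background.
      image = np.zeros((200, 200))
      image[disk((100, 100), 40)] = 1.0
      image = gaussian(image, sigma=3)

      # Initialise the contour as a circle enclosing the target region.
      theta = np.linspace(0, 2 * np.pi, 200)
      init = np.column_stack([100 + 60 * np.sin(theta), 100 + 60 * np.cos(theta)])

      # alpha and beta control the contour's elasticity and rigidity; the
      # paper combines such internal forces with its own image energies.
      snake = active_contour(image, init, alpha=0.015, beta=10, gamma=0.001)
      print(snake.shape)  # (200, 2) array of contour coordinates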

  3. Virtual commissioning of automated micro-optical assembly

    Science.gov (United States)

    Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian

    2015-02-01

    In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts in effectively developing assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved by simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.

  4. VMware vSphere PowerCLI Reference: Automating vSphere Administration

    CERN Document Server

    Dekens, Luc; Sizemore, Glenn; van Lieshout, Arnim; Medd, Jonathan

    2011-01-01

    Your One-Stop Reference for VMware vSphere Automation. If you manage vSphere in a Windows environment, automating routine tasks can save you time and increase efficiency. VMware vSphere PowerCLI is a set of pre-built commands based on Windows PowerShell that is designed to help you automate vSphere processes involving virtual machines, datacenters, storage, networks, and more. This detailed guide, using a practical, task-based approach and real-world examples, shows you how to get the most out of PowerCLI's handy cmdlets. Learn how to: Automate vCenter Server and ESX/ESX(i) Server deployment and

  5. Participation through Automation: Fully Automated Critical Peak Pricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal; receipt of the external signal initiates pre-programmed demand response strategies. The authors refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology, followed by a discussion of Auto-DR strategies used in the field test buildings. A sample Auto-CPP load shape case study is presented, along with a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites had reached their maximum savings simultaneously, a total of approximately 2 MW of DR would be available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. Field demonstrations and economic evaluations are continuing in pursuit of increasing penetrations of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
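
    The fully automated level described above reduces to a small dispatch rule: a received price signal triggers pre-programmed strategies with no human in the loop. A schematic Python sketch follows, with hypothetical signal levels and setpoint offsets; it is not the project's actual control logic.

      # Schematic fully-automated DR dispatch: signal in, strategies out.
      def on_price_signal(level, bms):
          """Apply pre-programmed demand-response strategies for a signal."""
          strategies = {
              "NORMAL":   {"cooling_setpoint_offset_C": 0.0, "dim_lights": 0.0},
              "MODERATE": {"cooling_setpoint_offset_C": 1.0, "dim_lights": 0.2},
              "HIGH":     {"cooling_setpoint_offset_C": 2.0, "dim_lights": 0.4},
          }
          for action, value in strategies[level].items():
              bms.apply(action, value)

      class LoggingBMS:                   # stand-in for a building system
          def apply(self, action, value):
              print(f"apply {action} = {value}")

      on_price_signal("HIGH", LoggingBMS())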

  6. An automated approach for annual layer counting in ice cores

    DEFF Research Database (Denmark)

    Winstrup, Mai; Svensson, A. M.; Rasmussen, S. O.

    2012-01-01

    A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of Hidden Markov Models (HMMs), which originally was developed for use in machine speech-recognition. The strength of the layer...

  7. An automated approach for annual layer counting in ice cores

    DEFF Research Database (Denmark)

    Winstrup, Mai; Svensson, A. M.; Rasmussen, S. O.

    2012-01-01

    A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs), which originally was developed for use in machine speech recognition. The strength of the layer...

  8. Automating quantum experiment control

    Science.gov (United States)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
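
    Run-time ion routing can be pictured as a path search over the trap's zone graph. The sketch below does a breadth-first search that avoids occupied zones; the zone names and adjacency are a made-up layout, not a real trap geometry.

      # Toy run-time routing: BFS over a graph of trap zones, avoiding
      # zones currently occupied by other ions.
      from collections import deque

      trap = {                            # adjacency of hypothetical zones
          "load": ["junction"],
          "junction": ["load", "gate1", "gate2"],
          "gate1": ["junction", "readout"],
          "gate2": ["junction"],
          "readout": ["gate1"],
      }

      def route(start, goal, blocked=frozenset()):
          """Shortest zone-to-zone route that skips blocked zones."""
          queue, seen = deque([[start]]), {start}
          while queue:
              path = queue.popleft()
              if path[-1] == goal:
                  return path
              for nxt in trap[path[-1]]:
                  if nxt not in seen and nxt not in blocked:
                      seen.add(nxt)
                      queue.append(path + [nxt])
          return None   # no route; e.g. after ion loss, trigger recovery

      print(route("load", "readout", blocked={"gate2"}))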

  9. An integrated billing application to streamline clinician workflow.

    Science.gov (United States)

    Vawdrey, David K; Walsh, Colin; Stetson, Peter D

    2014-01-01

    Between 2008 and 2010, our academic medical center transitioned to electronic provider documentation using a commercial electronic health record system. For attending physicians, one of the most frustrating aspects of this experience was the system's failure to support their existing electronic billing workflow. Because of poor system integration, it was difficult to verify the supporting documentation for each bill and impractical to track whether billable notes had corresponding charges. We developed and deployed in 2011 an integrated billing application called "iCharge" that streamlines clinicians' documentation and billing workflow, and simultaneously populates the inpatient problem list using billing diagnosis codes. Each month, over 550 physicians use iCharge to submit approximately 23,000 professional service charges for over 4,200 patients. On average, about 2.5 new problems are added to each patient's problem list. This paper describes the challenges and benefits of workflow integration across disparate applications and presents an example of innovative software development within a commercial EHR framework.

  10. Human-centered automation of testing, surveillance and maintenance

    International Nuclear Information System (INIS)

    Bhatt, S.C.; Sun, B.K.H.

    1991-01-01

    Manual surveillance and testing of instrumentation, control and protection systems at nuclear power plants involve system and human errors which can lead to substantial plant down time. Frequent manual testing can also contribute significantly to operation and maintenance costs. Automation technology offers potential for prudent applications at the power plant to reduce testing errors and cost. To help address the testing problems and to harness the benefits of automation, input was obtained from utilities on suitable automation approaches. This paper includes lessons from successful past experience at a few plants where some islands of automation exist. The results are summarized as a set of specifications for semi-automatic testing. A human-centered automation methodology is proposed, with guidelines given for an optimal human/computer division of tasks. Implementation obstacles to significant changes in testing practices are identified, and methods acceptable to nuclear power plants for addressing these obstacles are suggested.

  11. Evaluation of Automated Flagger Assistance Devices

    Science.gov (United States)

    2018-02-01

    Automated flagger assistance devices (AFADs) are designed to improve worker safety by replacing flaggers who are typically located near traffic approaching a work zone. In this study, a new AFAD developed by the Missouri Department of Transportation ...

  12. Architecture Views Illustrating the Service Automation Aspect of SOA

    Science.gov (United States)

    Gu, Qing; Cuadrado, Félix; Lago, Patricia; Duenãs, Juan C.

    Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies that capture the industrial needs in illustrating service deployment and configuration automation. This set of views adds, to more traditional notations like UML, the visual power of attracting the attention of their users to the addressed concerns, and assists them in their work. This is especially crucial in service-oriented architecting, where service automation is in high demand.

  13. Development of a framework of human-centered automation for the nuclear industry

    International Nuclear Information System (INIS)

    Nelson, W.R.; Haney, L.N.

    1993-01-01

    Introduction of automated systems into control rooms for advanced reactor designs is often justified on the basis of increased efficiency and reliability, without a detailed assessment of how the new technologies will influence the role of the operator. Such a 'technology-centered' approach carries with it the risk that entirely new mechanisms for human error will be introduced, resulting in some unpleasant surprises when the plant goes into operation. The aviation industry has experienced some of these surprises since the introduction of automated systems into the cockpits of advanced technology aircraft. Pilot errors have actually been induced by automated systems, especially when the pilot doesn't fully understand what the automated systems are doing during all modes of operation. In order to structure the research program for investigating these problems, the National Aeronautics and Space Administration (NASA) has developed a framework for human-centered automation. This framework is described in the NASA document Human-Centered Aircraft Automation Philosophy by Charles Billings. It is the thesis of this paper that a corresponding framework of human-centered automation should be developed for the nuclear industry. Such a framework would serve to guide the design and regulation of automated systems for advanced reactor designs, and would help prevent some of the problems that have arisen in other applications that have followed a 'technology-centered' approach.

  14. CFD Prediction on the Pressure Distribution and Streamlines around an Isolated Single-Storey House Considering the Effect of Topographic Characteristics

    Science.gov (United States)

    Abdullah, J.; Zaini, S. S.; Aziz, M. S. A.; Majid, T. A.; Deraman, S. N. C.; Yahya, W. N. W.

    2018-04-01

    Single-storey houses are classified as low-rise buildings and are vulnerable to damage under windstorm events. This study was carried out with the aim of investigating the pressure distribution and streamlines around an isolated house, considering the effect of terrain characteristics. Topographic features such as flat terrain, depressions, ridges, and valleys are considered in this study. The simulations were performed with the ANSYS FLUENT 14.0 software package. The results showed that the topographic characteristics influence the pressure coefficients and streamlines, especially when the house is located on a ridge. The findings strongly suggest that wind analysis should include all topographic features in order to establish the true wind force exerted on any structure.

  15. Improving and streamlining the workflow in the graphic arts and printing industry

    Science.gov (United States)

    Tuijn, Chris

    2003-01-01

    In order to survive in the economy of today, an ever-increasing productivity is required from all the partners participating in a specific business process. This is no different for the printing industry. One of the ways to remain profitable is, on one hand, to reduce costs by automation and aiming for large-scale projects and, on the other hand, to specialize and become an expert in the area in which one is active. One of the ways to realize these goals is by streamlining the communication between the different partners and focusing on the core business. If we look at the graphic arts and printing industry, we can identify several important players that together help in the realization of printed material. For the printing company (as is the case for any other company), the most important player is the customer. This role can be adopted by many different players, including publishers, companies, non-commercial institutions, private persons etc. Sometimes the customer will be the content provider as well, but this is not always the case. Often, the content is provided by other organizations such as design and prepress agencies, advertising companies etc. In most printing organizations, the customer has one contact person, often referred to as the CSR (Customer Service Representative). Other people involved at the printing organization include the sales representatives, prepress operators, printing operators, postpress operators, planners, the logistics department, the financial department etc. In the first part of this article, we propose a solution that will improve the communication between all the different actors in the graphic arts and printing industry considerably and will optimize and streamline the overall workflow as well. This solution consists of an environment in which the customer can communicate with the CSR to ask for a quote based on a specific product intent; the CSR will then (after the approval from the customer's side) organize the work and brief

  16. Social aspects of automation: Some critical insights

    Science.gov (United States)

    Nouzil, Ibrahim; Raza, Ali; Pervaiz, Salman

    2017-09-01

    Sustainable development has been recognized globally as one of the major driving forces behind current technological innovations. To achieve sustainable development and attain its associated goals, it is very important to properly address its concerns in different aspects of technological innovation. Several industrial sectors have enjoyed productivity and economic gains due to the advent of automation technology, so it is important to characterize the sustainability of automation technology. Sustainability is a key factor that will determine the future of the generations that follow us, and it must be tightly wrapped around the double-edged sword of technology. In this study, the different impacts of automation are addressed using the 'Circles of Sustainability' approach as a framework, covering economic, political, cultural and ecological aspects and their implications. A systematic literature review of automation technology from its inception is outlined and plotted against its many outcomes, covering a broad spectrum. The study focuses on the social aspects of automation technology. It also reviews the literature to analyse employment loss at one end of the social impact spectrum; at the other end, benefits to society through technological advancements, such as the Internet of Things (IoT) coupled with automation, are presented.

  17. Library Automation in Sub Saharan Africa: Case Study of the University of Botswana

    Science.gov (United States)

    Mutula, Stephen Mudogo

    2012-01-01

    Purpose: This article aims to present experiences and the lessons learned from the University of Botswana (UB) library automation project. The implications of the project for similar libraries planning automation in sub Saharan Africa and beyond are adduced. Design/methodology/approach: The article is a case study of library automation at the…

  18. Automated approach to detecting behavioral states using EEG-DABS

    Directory of Open Access Journals (Sweden)

    Zachary B. Loris

    2017-07-01

    Full Text Available Electrocorticographic (ECoG) signals represent cortical electrical dipoles generated by synchronous local field potentials that result from the simultaneous firing of neurons at distinct frequencies (brain waves). Since different brain waves correlate with different behavioral states, ECoG signals present a novel strategy for detecting complex behaviors. We developed a program, EEG Detection Analysis for Behavioral States (EEG-DABS), that advances Fast Fourier Transforms through the ECoG signal time series, separating it into user-defined frequency bands and normalizing them to reduce variability. EEG-DABS determines events if segments of an experimental ECoG record have significantly different power bands than a selected control pattern of EEG. Events are identified at every epoch and frequency band and are then displayed as output graphs by the program. Certain patterns of events correspond to specific behaviors. Once a predetermined pattern was selected for a behavioral state, EEG-DABS correctly identified the desired behavioral event. The selection of frequency band combinations for detection of the behavior affects the accuracy of the method. All instances of certain behaviors, such as freezing, were correctly identified from the event patterns generated with EEG-DABS. Detecting behaviors is typically achieved by visually discerning unique animal phenotypes, a process that is time consuming, unreliable, and subjective. EEG-DABS removes variability by using defined parameters of EEG/ECoG for a desired behavior over chronic recordings. EEG-DABS presents a simple and automated approach to quantifying different behavioral states from ECoG signals.
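
    A condensed sketch of the pipeline stages named above (epoching, FFT-based band power, normalisation against a control pattern, and event flagging) is given below; the band edges, epoch length, and z-score threshold are assumptions for the demo, not EEG-DABS's actual parameters.

      # Band-power event detection against a control pattern.
      import numpy as np
      from scipy.signal import welch

      fs = 250                                   # sampling rate (Hz)
      bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13)}

      def band_powers(epoch):
          """FFT-based (Welch) power in each frequency band for one epoch."""
          freqs, psd = welch(epoch, fs=fs, nperseg=fs * 2)
          return np.array([psd[(freqs >= lo) & (freqs < hi)].sum()
                           for lo, hi in bands.values()])

      rng = np.random.default_rng(0)
      # Control pattern: band-power statistics over baseline epochs.
      control = np.array([band_powers(rng.normal(size=fs * 5)) for _ in range(60)])
      mu, sd = control.mean(axis=0), control.std(axis=0)

      # An "event" is an epoch whose normalised band powers deviate from
      # the control pattern beyond a threshold.
      t = np.arange(fs * 5) / fs
      test_epoch = rng.normal(size=fs * 5) + np.sin(2 * np.pi * 2 * t)  # 2 Hz burst
      z = (band_powers(test_epoch) - mu) / sd
      print({b: round(v, 2) for b, v in zip(bands, z)},
            "event:", bool(abs(z).max() > 3))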

  19. Zephyr: A secure Internet-based process to streamline engineering procurements using the World Wide Web

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, C.W.; Cavitt, R.E.; Niven, W.A.; Warren, F.E.; Taylor, S.S.; Sharick, T.M.; Vickers, D.L.; Mitschkowetz, N.; Weaver, R.L.

    1996-08-13

    Lawrence Livermore National Laboratory (LLNL) is piloting an Internet-based paperless process called 'Zephyr' to streamline engineering procurements. Major benefits have accrued by using Zephyr in reducing procurement time, speeding the engineering development cycle, facilitating industrial collaboration, and reducing overall costs. Programs at LLNL are benefiting from the efficiencies introduced since implementing Zephyr's engineering and commerce on the Internet.

  20. Robust Preconditioning Estimates for Convection-Dominated Elliptic Problems via a Streamline Poincaré-Friedrichs Inequality

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe; Karátson, J.; Kovács, B.

    2014-01-01

    Roč. 52, č. 6 (2014), s. 2957-2976 ISSN 0036-1429 R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords : streamline diffusion finite element method * solving convection-dominated elliptic problems * convergence is robust Subject RIV: BA - General Mathematics Impact factor: 1.788, year: 2014 http://epubs.siam.org/doi/abs/10.1137/130940268

  1. A Machine Learning Approach to Automated Gait Analysis for the Noldus Catwalk System.

    Science.gov (United States)

    Frohlich, Holger; Claes, Kasper; De Wolf, Catherine; Van Damme, Xavier; Michel, Anne

    2018-05-01

    Gait analysis of animal disease models can provide valuable insights into in vivo compound effects and thus help in preclinical drug development. The purpose of this paper is to establish a computational gait analysis approach for the Noldus Catwalk system, in which footprints are automatically captured and stored. We present a, to our knowledge, first machine learning based approach for the Catwalk system, which comprises step decomposition, definition and extraction of meaningful features, multivariate step sequence alignment, feature selection, and training of different classifiers (gradient boosting machine, random forest, and elastic net). Using animal-wise leave-one-out cross-validation, we demonstrate that with our method we can reliably separate movement patterns of a putative Parkinson's disease animal model and several control groups. Furthermore, we show that we can predict the time point after lesioning and the type of brain lesion, and can even forecast the brain region where the intervention was applied. We provide an in-depth analysis of the features involved in our classifiers via statistical techniques for model interpretation. A machine learning method for automated analysis of data from the Noldus Catwalk system was established. Our work shows the ability of machine learning to discriminate pharmacologically relevant animal groups based on their walking behavior in a multivariate manner. Further interesting aspects of the approach include the ability to learn from past experiments, improve as more data arrive, and make predictions for single animals in future studies.
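
    The animal-wise leave-one-out scheme maps directly onto scikit-learn's LeaveOneGroupOut splitter, as the schematic below shows. The synthetic features, labels, and group sizes are placeholders; the real pipeline would first perform step decomposition and sequence alignment.

      # Animal-wise leave-one-out cross-validation with scikit-learn.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 30))           # gait features per run
      y = rng.integers(0, 2, size=120)         # lesion vs. control label
      animals = np.repeat(np.arange(12), 10)   # 12 animals, 10 runs each

      # Grouping by animal keeps all runs of one animal in the test fold,
      # so the classifier is never tested on an animal it has seen.
      scores = cross_val_score(RandomForestClassifier(n_estimators=200),
                               X, y, cv=LeaveOneGroupOut(), groups=animals)
      print(scores.mean())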

  2. THE QUESTION OF DEVELOPMENT OF AUTOMATED SYSTEMS FOR TRAFFIC MANAGEMENT

    Directory of Open Access Journals (Sweden)

    V. Shirin

    2015-12-01

    Full Text Available The current systems and methods for automated traffic management in cities are analyzed, and the management levels are specified. The general requirements, objectives and functions of automated systems for traffic management are formulated with regard to modern transport problems, and additional management and information functions are proposed. A phased approach to the implementation of projects for creating automated traffic management systems is offered.

  3. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for automation of spectrometry (AS) has been developed that enables the prompt realization of experiment automation systems for spectrometers that use data buffering. The development employed new methods of programming and of building automation systems, together with novel network technologies. It is suggested that programs to schedule and conduct experiments be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique, and to use different hardware interfaces for introducing the spectrometric data into the data acquisition system. The article describes the possibilities provided to the user for scheduling and controlling the experiment, viewing data, and controlling the spectrometer parameters. The current spectrometer state, programs and experimental data can be presented on the Internet in the form of dynamically formed protocols and graphs, and the experiment can be controlled via the Internet. To use these Internet facilities on the client side, no applied programs are needed; it suffices to know how to use two programs to carry out experiments in the automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)

  4. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    Energy Technology Data Exchange (ETDEWEB)

    Acciarri, R.; Bagby, L.; Baller, B.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Greenlee, H.; James, C.; Jostlein, H.; Ketchum, W.; Kirby, M.; Kobilarcik, T.; Lockwitz, S.; Lundberg, B.; Marchionni, A.; Moore, C.D.; Palamara, O.; Pavlovic, Z.; Raaf, J.L.; Schukraft, A.; Snider, E.L.; Spentzouris, P.; Strauss, T.; Toups, M.; Wolbers, S.; Yang, T.; Zeller, G.P. [Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States); Adams, C. [Harvard University, Cambridge, MA (United States); Yale University, New Haven, CT (United States); An, R.; Littlejohn, B.R.; Martinez Caicedo, D.A. [Illinois Institute of Technology (IIT), Chicago, IL (United States); Anthony, J.; Escudero Sanchez, L.; De Vries, J.J.; Marshall, J.; Smith, A.; Thomson, M. [University of Cambridge, Cambridge (United Kingdom); Asaadi, J. [University of Texas, Arlington, TX (United States); Auger, M.; Ereditato, A.; Goeldi, D.; Kreslo, I.; Lorca, D.; Luethi, M.; Rudolf von Rohr, C.; Sinclair, J.; Weber, M. [Universitaet Bern, Bern (Switzerland); Balasubramanian, S.; Fleming, B.T.; Gramellini, E.; Hackenburg, A.; Luo, X.; Russell, B.; Tufanli, S. [Yale University, New Haven, CT (United States); Barnes, C.; Mousseau, J.; Spitz, J. [University of Michigan, Ann Arbor, MI (United States); Barr, G.; Bass, M.; Del Tutto, M.; Laube, A.; Soleti, S.R.; De Pontseele, W.V. [University of Oxford, Oxford (United Kingdom); Bay, F. [TUBITAK Space Technologies Research Institute, Ankara (Turkey); Bishai, M.; Chen, H.; Joshi, J.; Kirby, B.; Li, Y.; Mooney, M.; Qian, X.; Viren, B.; Zhang, C. [Brookhaven National Laboratory (BNL), Upton, NY (United States); Blake, A.; Devitt, D.; Lister, A.; Nowak, J. [Lancaster University, Lancaster (United Kingdom); Bolton, T.; Horton-Smith, G.; Meddage, V.; Rafique, A. [Kansas State University (KSU), Manhattan, KS (United States); Camilleri, L.; Caratelli, D.; Crespo-Anadon, J.I.; Fadeeva, A.A.; Genty, V.; Kaleko, D.; Seligman, W.; Shaevitz, M.H. [Columbia University, New York, NY (United States); Church, E. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Cianci, D.; Karagiorgi, G. [Columbia University, New York, NY (United States); The University of Manchester (United Kingdom); Cohen, E.; Piasetzky, E. [Tel Aviv University, Tel Aviv (Israel); Collin, G.H.; Conrad, J.M.; Hen, O.; Hourlier, A.; Moon, J.; Wongjirad, T.; Yates, L. [Massachusetts Institute of Technology (MIT), Cambridge, MA (United States); Convery, M.; Eberly, B.; Rochester, L.; Tsai, Y.T.; Usher, T. [SLAC National Accelerator Laboratory, Menlo Park, CA (United States); Dytman, S.; Graf, N.; Jiang, L.; Naples, D.; Paolone, V.; Wickremasinghe, D.A. [University of Pittsburgh, Pittsburgh, PA (United States); Esquivel, J.; Hamilton, P.; Pulliam, G.; Soderberg, M. [Syracuse University, Syracuse, NY (United States); Foreman, W.; Ho, J.; Schmitz, D.W.; Zennamo, J. [University of Chicago, IL (United States); Furmanski, A.P.; Garcia-Gamez, D.; Hewes, J.; Hill, C.; Murrells, R.; Porzio, D.; Soeldner-Rembold, S.; Szelc, A.M. [The University of Manchester (United Kingdom); Garvey, G.T.; Huang, E.C.; Louis, W.C.; Mills, G.B.; De Water, R.G.V. [Los Alamos National Laboratory (LANL), Los Alamos, NM (United States); Gollapinni, S. [Kansas State University (KSU), Manhattan, KS (United States); University of Tennessee, Knoxville, TN (United States); and others

    2018-01-15

    The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies. (orig.)

  5. The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector

    Science.gov (United States)

    Acciarri, R.; Adams, C.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.

    2018-01-01

    The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
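
    To make the multi-algorithm idea concrete, the following is a minimal sketch (in Python, not the actual Pandora SDK, whose algorithms are written in C++ and are far more sophisticated) of a chain in which each small algorithm addresses one task and passes its refined picture of the event to the next. All names and the toy clustering/labelling logic are invented for illustration.

    # A minimal sketch (not the actual Pandora SDK API) of the multi-algorithm
    # pattern the abstract describes: many small algorithms, each addressing one
    # task, run in sequence to build up the event reconstruction.
    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class Event:
        hits: List[tuple]                 # (x, y) detector hits
        clusters: List[List[tuple]] = field(default_factory=list)
        labels: dict = field(default_factory=dict)

    def cluster_by_proximity(event: Event) -> None:
        """Toy clustering: group hits whose x-coordinates are close."""
        for hit in sorted(event.hits):
            if event.clusters and abs(event.clusters[-1][-1][0] - hit[0]) < 1.0:
                event.clusters[-1].append(hit)
            else:
                event.clusters.append([hit])

    def tag_long_clusters(event: Event) -> None:
        """Toy topology pass: label long clusters as track-like (e.g. muons)."""
        for i, cluster in enumerate(event.clusters):
            event.labels[i] = "track-like" if len(cluster) >= 3 else "shower-like"

    # The "chain": each algorithm refines the picture left by its predecessors.
    algorithms: List[Callable[[Event], None]] = [cluster_by_proximity, tag_long_clusters]

    event = Event(hits=[(0.0, 1.0), (0.5, 1.1), (0.9, 1.3), (5.0, 2.0)])
    for algorithm in algorithms:
        algorithm(event)
    print(event.labels)   # {0: 'track-like', 1: 'shower-like'}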

  6. Analysis of Real Time Technical Data Obtained While Shotcreting: An Approach Towards Automation

    OpenAIRE

    Rodríguez, Ángel; Río, Olga

    2010-01-01

Automation of the shotcreting process is a key factor both in improving working conditions and increasing productivity, as well as in increasing the quality of shotcrete. Confidence in the quality of the automation process itself and of shotcrete linings can be improved by real-time monitoring of pumping as well as other shotcreting-machine-related parameters. Prediction of how the different technical parameters of application govern the whole process is a subject of increasing ...

  7. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  8. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC), models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time-efficient way of identifying endmember pixels for use in these models. The fully automated models were applied to over 100 cloud-free Landsat images, with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R², ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day⁻¹). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce the time demands of applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
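
    The following is a minimal sketch, with synthetic arrays and hypothetical thresholds, of the core idea behind automated endmember selection: a cold pixel is sought among densely vegetated (high-NDVI) pixels and a hot pixel among bare (low-NDVI) pixels. The published method additionally uses machine learning and search algorithms, none of which is reproduced here.

    # A minimal sketch, under simplifying assumptions, of automated endmember
    # (hot/cold) pixel selection for SEBAL/METRIC-style calibration. Array
    # names and percentile thresholds are hypothetical.
    import numpy as np

    def select_endmembers(lst, ndvi):
        """Return (row, col) of candidate cold and hot pixels.

        lst  : 2D array of land surface temperature (K)
        ndvi : 2D array of NDVI, same shape
        """
        # Cold candidates: densely vegetated pixels (high NDVI)
        cold_mask = ndvi >= np.nanpercentile(ndvi, 95)
        # Hot candidates: sparsely vegetated pixels (low NDVI)
        hot_mask = ndvi <= np.nanpercentile(ndvi, 5)

        # Among the candidates, pick the coolest / warmest pixel.
        cold_idx = np.where(cold_mask, lst, np.inf).argmin()
        hot_idx = np.where(hot_mask, lst, -np.inf).argmax()
        return (np.unravel_index(cold_idx, lst.shape),
                np.unravel_index(hot_idx, lst.shape))

    rng = np.random.default_rng(0)
    lst = 290 + 20 * rng.random((100, 100))    # synthetic LST scene
    ndvi = rng.random((100, 100))              # synthetic NDVI scene
    cold, hot = select_endmembers(lst, ndvi)
    print("cold pixel:", cold, "hot pixel:", hot)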

  9. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  10. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    International Nuclear Information System (INIS)

    Andersson, J.

    2011-01-01

This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach for function allocation, but in high risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  11. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

The INETEC Institute for Nuclear Technology developed a software package called EddyOne which has an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; and conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, giving a clear advantage to the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  12. Brownfields Assessing Contractor Capabilities for Streamlined Site Investigation: Additional Information Regarding All Appropriate Inquiries and Hiring an Environmental Professional

    Science.gov (United States)

This document assists Brownfields grantees and other decision makers as they assess the capabilities of contractors and consultants to determine their qualifications to provide streamlined and innovative strategies for site assessment and cleanup.

  13. DEF: an automated dead-end filling approach based on quasi-endosymbiosis.

    Science.gov (United States)

    Liu, Lili; Zhang, Zijun; Sheng, Taotao; Chen, Ming

    2017-02-01

Gap filling for the reconstruction of metabolic networks aims to restore the connectivity of metabolites by finding high-confidence reactions that may be missing in the target organism. Current methods for gap filling either rely on network topology alone or have limited capability in finding missing reactions that are indirectly related to dead-end metabolites but of biological importance to the target model. We present an automated dead-end filling (DEF) approach, inspired by endosymbiosis theory, which fills gaps by finding the most efficient dead-end utilization paths in a constructed quasi-endosymbiosis model. The recalls of reactions and dead ends of DEF reach around 73% and 86%, respectively. This method is capable of finding indirectly dead-end-related reactions of biological importance for the target organism and is applicable to any given metabolic model. In the E. coli iJR904 model, for instance, about 42% of the dead-end metabolites were fixed by our proposed method. DEF is publicly available at http://bis.zju.edu.cn/DEF/. Contact: mchen@zju.edu.cn. Supplementary data are available at Bioinformatics online.
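
    As an illustration of the kind of bookkeeping a dead-end filling method must perform first, the sketch below locates dead-end metabolites (those only produced or only consumed) in a toy stoichiometric matrix. It is not the published DEF code; the matrix and reversibility flags are invented.

    # A minimal sketch, assuming a toy stoichiometric matrix, of dead-end
    # metabolite detection, the starting point for a gap-filling method.
    import numpy as np

    # Rows = metabolites, columns = reactions; S[i, j] < 0 consumes i, > 0 produces i.
    S = np.array([
        [-1,  0,  0],   # A: never produced -> dead end
        [ 1, -1,  0],   # B: produced and consumed -> connected
        [ 0,  1,  0],   # C: never consumed -> dead end
        [ 0,  0, -1],   # D: never produced -> dead end
    ])
    reversible = np.array([False, False, False])  # per-reaction reversibility

    def dead_end_metabolites(S, reversible):
        # A reversible reaction can both produce and consume its participants.
        produced = ((S > 0) | ((S != 0) & reversible)).any(axis=1)
        consumed = ((S < 0) | ((S != 0) & reversible)).any(axis=1)
        return np.where(~(produced & consumed))[0]

    print(dead_end_metabolites(S, reversible))  # [0 2 3]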

  14. Original Approach for Automated Quantification of Antinuclear Autoantibodies by Indirect Immunofluorescence

    Directory of Open Access Journals (Sweden)

    Daniel Bertin

    2013-01-01

Introduction. Indirect immunofluorescence (IIF) is the gold standard method for the detection of antinuclear antibodies (ANA), which are essential markers for the diagnosis of systemic autoimmune rheumatic diseases. For the discrimination of positive and negative samples, we propose here an original approach named Immunofluorescence for Computed Antinuclear antibody Rational Evaluation (ICARE), based on the calculation of a fluorescence index (FI). Methods. We made comparisons between FI and visual evaluations on 237 consecutive samples and on a cohort of 25 patients with SLE. Results. We obtained very good technical performance of FI (95% sensitivity, 98% specificity, and a kappa of 0.92), even in a subgroup of weakly positive samples. A significant correlation between quantification of FI and IIF ANA titers was found (Spearman's ρ = 0.80, P < 0.0001). Clinical performance of ICARE was validated on a cohort of patients with SLE, corroborating the fact that FI could represent an attractive alternative for the evaluation of antibody titer. Conclusion. Our results represent a major step toward automated quantification of IIF ANA, opening attractive perspectives such as rapid sample screening and laboratory standardization.
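
    The paper's exact FI definition is not reproduced in the abstract; the sketch below assumes a simple ratio of median fluorescence inside cell regions to background, which conveys the spirit of a computed index separating positive from negative samples. All data are synthetic.

    # A minimal sketch of a fluorescence-index style score for IIF images.
    # The assumed definition (signal over background) is illustrative only.
    import numpy as np

    def fluorescence_index(image, cell_mask):
        """image: 2D array of pixel intensities; cell_mask: boolean, True on cells."""
        signal = np.median(image[cell_mask])
        background = np.median(image[~cell_mask]) + 1e-9  # avoid division by zero
        return signal / background

    rng = np.random.default_rng(1)
    img = rng.normal(10, 2, (64, 64))
    mask = np.zeros((64, 64), dtype=bool)
    mask[16:48, 16:48] = True
    img[mask] += 25            # simulate ANA-positive nuclear staining
    fi = fluorescence_index(img, mask)
    print(f"FI = {fi:.1f}  ->", "positive" if fi > 2.0 else "negative")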

  15. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty of dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty of following the automatic sequences and losing track of procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step-mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators found the presentation of the conditions that govern the automatic sequences especially difficult to perceive. (author)

  16. Levels of automation and user control - evaluation of a turbine automation interface

    International Nuclear Information System (INIS)

    Andersson, Jonas

    2008-10-01

The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by the use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty of dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operations lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to issues concerning difficulty of following the automatic sequences and losing track of procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step-mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators found the presentation of the conditions that govern the automatic sequences especially difficult to perceive. (au)

  17. OCT-based profiler for automating ocular surface prosthetic fitting (Conference Presentation)

    Science.gov (United States)

    Mujat, Mircea; Patel, Ankit H.; Maguluri, Gopi N.; Iftimia, Nicusor V.; Patel, Chirag; Agranat, Josh; Tomashevskaya, Olga; Bonte, Eugene; Ferguson, R. Daniel

    2016-03-01

The use of a Prosthetic Replacement of the Ocular Surface Environment (PROSE) device is a revolutionary treatment for military patients that have lost their eyelids due to 3rd degree facial burns and for civilians who suffer from a host of corneal diseases. However, custom manual fitting is often a protracted, painful, and inexact process that requires multiple fitting sessions, and training for new practitioners is a long process. Automated methods to measure the complete corneal and scleral topology would provide a valuable tool for both clinicians and PROSE device manufacturers and would help streamline the fitting process. PSI has developed an ocular anterior-segment profiler based on Optical Coherence Tomography (OCT), which provides a 3D measure of the surface of the sclera and cornea. This device will provide topography data that will be used to expedite and improve the fabrication process for PROSE devices. OCT has been used to image portions of the cornea and sclera and to measure surface topology for smaller contact lenses [1-3]. However, current state-of-the-art anterior-eye OCT systems can only scan about 16 mm of the eye's anterior surface, which is not sufficient for covering the sclera around the cornea. In addition, there is no systematic method for scanning and aligning/stitching the full scleral/corneal surface, and commercial segmentation software is not optimized for the PROSE application. Although preliminary, our results demonstrate the capability of PSI's approach to generate accurate surface plots over relatively large areas of the eye, which is not currently possible with any other existing platform. Testing the technology on human volunteers is currently underway at the Boston Foundation for Sight.

  18. Psychological distress and streamlined BreastScreen follow-up assessment versus standard assessment.

    Science.gov (United States)

    Sherman, Kerry A; Winch, Caleb J; Borecky, Natacha; Boyages, John

    2013-11-04

To establish whether altered protocol characteristics of streamlined StepDown breast assessment clinics heightened or reduced the psychological distress of women in attendance compared with standard assessment. Willingness to attend future screening was also compared between the assessment groups. Observational, prospective study of women attending either a mammogram-only StepDown or a standard breast assessment clinic. Women completed questionnaires on the day of assessment and 1 month later. Women attending StepDown (136 women) or standard assessment clinics (148 women) at a BreastScreen centre between 10 November 2009 and 7 August 2010. Breast cancer worries; positive and negative psychological consequences of assessment (Psychological Consequences Questionnaire); breast cancer-related intrusion and avoidance (Impact of Event Scale); and willingness to attend, and uneasiness about, future screening. At 1-month follow-up, no group differences were evident between those attending standard and StepDown clinics on breast cancer worries (P = 0.44), positive (P = 0.88) and negative (P = 0.65) consequences, intrusion (P = 0.64), and avoidance (P = 0.87). Willingness to return for future mammograms was high, and did not differ between groups (P = 0.16), although higher levels of unease were associated with lessened willingness to rescreen (P = 0.04). There was no evidence that attending streamlined StepDown assessments had different outcomes in terms of distress than attending standard assessment clinics for women with a BreastScreen-detected abnormality. However, unease about attending future screening was generally associated with less willingness to do so in both groups; thus, there is a role for psycho-educational intervention to address these concerns.

  19. Part-task training in the context of automation: current and future directions.

    Science.gov (United States)

    Gutzwiller, Robert S; Clegg, Benjamin A; Blitch, John G

    2013-01-01

    Automation often elicits a divide-and-conquer outlook. By definition, automation has been suggested to assume control over a part or whole task that was previously performed by a human (Parasuraman & Riley, 1997). When such notions of automation are taken as grounds for training, they readily invoke a part-task training (PTT) approach. This article outlines broad functions of automation as a source of PTT and reviews the PTT literature, focusing on the potential benefits and costs related to using automation as a mechanism for PTT. The article reviews some past work in this area and suggests a path to move beyond the type of work captured by the "automation as PTT" framework. An illustrative experiment shows how automation in training and PTT are actually separable issues. PTT with automation has some utility but ultimately remains an unsatisfactory framework for the future broad potential of automation during training, and we suggest that a new conceptualization is needed.

  20. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  1. Streamline processing of discrete nuclear spectra by means of authoregularized iteration process (the KOLOBOK code)

    International Nuclear Information System (INIS)

    Gadzhokov, V.; Penev, I.; Aleksandrov, L.

    1979-01-01

A brief description is given of the KOLOBOK computer code, designed for streamline processing of discrete nuclear spectra with a symmetric Gaussian shape of the single line on computers of the ES series, models 1020 and above. The program solves the stream of nonlinear problems generated by discrete spectrometry by means of an authoregularized iteration process. The Fortran-4 text of the code is given in an Appendix.

  2. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implicating the need for designers to ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate their control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  3. An automated approach for annual layer counting in ice cores

    Directory of Open Access Journals (Sweden)

    M. Winstrup

    2012-11-01

A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs), which were originally developed for use in machine speech recognition. The strength of the layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on statistical criteria for annual layer identification. The most likely positions of multiple layer boundaries in a section of ice core data are determined simultaneously, and a probabilistic uncertainty estimate of the resulting layer count is provided, ensuring an objective treatment of ambiguous layers in the data. Furthermore, multiple data series can be incorporated and used simultaneously. In this study, the automated layer counting algorithm has been applied to two ice core records from Greenland: one displaying a distinct annual signal and one which is more challenging. The algorithm shows high skill in reproducing the results from manual layer counts, and the resulting timescale compares well to absolute-dated volcanic marker horizons where these exist.
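
    The HMM core of such a layer-counting scheme can be illustrated with a two-state toy model: seasonal states are decoded with the Viterbi algorithm, and each late-to-early-season transition is counted as an annual boundary. The published algorithm is far richer (multiple data series, probabilistic layer counts); this sketch, with invented probabilities and observations, shows only the decoding step.

    # A minimal sketch, on synthetic data, of HMM-based annual layer detection.
    import numpy as np

    def viterbi(obs, log_A, log_B, log_pi):
        """Most likely hidden state path for an observation sequence."""
        n_states, T = log_B.shape[0], len(obs)
        dp = np.zeros((T, n_states))
        ptr = np.zeros((T, n_states), dtype=int)
        dp[0] = log_pi + log_B[:, obs[0]]
        for t in range(1, T):
            scores = dp[t - 1][:, None] + log_A        # all possible transitions
            ptr[t] = scores.argmax(axis=0)
            dp[t] = scores.max(axis=0) + log_B[:, obs[t]]
        path = [int(dp[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(ptr[t, path[-1]]))
        return path[::-1]

    # States: 0 = high-signal season, 1 = low-signal season.
    # Observations: 0 = high proxy value, 1 = low proxy value.
    log_A = np.log(np.array([[0.8, 0.2], [0.2, 0.8]]))
    log_B = np.log(np.array([[0.9, 0.1], [0.1, 0.9]]))
    log_pi = np.log(np.array([0.5, 0.5]))

    obs = [0, 0, 1, 1, 0, 0, 1, 1, 0]              # two full seasonal cycles
    states = viterbi(obs, log_A, log_B, log_pi)
    layers = sum(1 for a, b in zip(states, states[1:]) if (a, b) == (1, 0))
    print("decoded states:", states, "annual boundaries:", layers)  # 2 boundaries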

  4. Automated Guided Vehicle For Physically Handicapped People - A Cost Effective Approach

    Science.gov (United States)

    Kumar, G. Arun, Dr.; Sivasubramaniam, Mr. A.

    2017-12-01

An Automated Guided Vehicle (AGV) is like a robot that can deliver materials from the supply area to the technician automatically. This is faster and more efficient. The robot can be accessed wirelessly: a technician can directly control the robot to deliver the components rather than control it via a human operator (over phone, computer, etc.) who has to program the robot or ask a delivery person to make the delivery. The vehicle is automatically guided along its route. To avoid collisions, a proximity sensor is attached to the system. The sensor senses the signals of obstacles and can stop the vehicle in their presence. Thus the vehicle can avoid accidents, which is very useful to the present industrial trend, in which material and equipment handling are automated as an easy, time-saving methodology.

  5. An Automated Approach to Reasoning Under Multiple Perspectives

    Science.gov (United States)

    deBessonet, Cary

    2004-01-01

This is the final report, with emphasis on research during the last term. The context for the research has been the development of an automated reasoning technology for use in SMS (Symbolic Manipulation System), a system used to build and query knowledge bases (KBs) using a special knowledge representation language SL (Symbolic Language). SMS interprets assertive SL input and enters the results as components of its universe. The system operates in two basic modes: 1) constructive mode (for building KBs); and 2) query/search mode (for querying KBs). Query satisfaction consists of matching query components with KB components. The system allows "penumbral matches," that is, matches that do not exactly meet the specifications of the query, but which are deemed relevant for the conversational context. If the user wants to know whether SMS has information that holds, say, for "any chow," the scope of relevancy might be set so that the system would respond based on a finding that it has information that holds for "most dogs," although this is not exactly what was called for by the query. The response would be qualified accordingly, as would normally be the case in ordinary human conversation. The general goal of the research was to develop an approach by which assertive content could be interpreted from multiple perspectives so that reasoning operations could be successfully conducted over the results. The interpretation of an SL statement such as "{person believes [captain (asserted (perhaps)) (astronaut saw (comet (bright)))]}", which in English would amount to asserting something to the effect that "Some person believes that a captain perhaps asserted that an astronaut saw a bright comet," would require the recognition of multiple perspectives, including some that are: a) epistemically-based (focusing on "believes"); b) assertion-based (focusing on "asserted"); c) perception-based (focusing on "saw"); d) adjectivally-based (focusing on "bright"); and e) modally

  6. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

This book describes methods of building an automation plan and the design of automation facilities. It covers automation of the CHIP process, including the basics of cutting, NC processing machines and CHIP handling; automation units such as drilling, tapping, boring, milling and slide units; the application of oil pressure (hydraulics), with characteristics and basic oil-pressure circuits; the application of pneumatics; and kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  7. State Models to Incentivize and Streamline Small Hydropower Development

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Taylor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Levine, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Johnson, Kurt [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-10-31

In 2016, the hydropower fleet in the United States produced more than 6 percent (approximately 265,829 gigawatt-hours [GWh]) of the total net electricity generation. The median-size hydroelectric facility in the United States is 1.6 MW, and 75 percent of all facilities have a nameplate capacity of 10 MW or less. Moreover, the U.S. Department of Energy's Hydropower Vision study identified approximately 79 GW of hydroelectric potential beyond what is already developed. Much of the potential identified is at low-impact new stream-reaches, existing conduits, and non-powered dams, with a median project size of 10 MW or less. To optimize the potential and value of small hydropower development, state governments are crafting policies that provide financial assistance and expedite state and federal review processes for small hydroelectric projects. This report analyzes state-led initiatives and programs that incentivize and streamline small hydroelectric development.

  8. NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.

    Science.gov (United States)

    Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J

    2018-03-14

Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate structure calculation and the analysis of dynamics and interactions of macromolecules. Recent advancement in handling big data, together with an outburst of machine learning techniques, offers an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually-annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of the extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis), available at https://dumpling.bio/. Contact: michaljerzywalczak@gmail.com, piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.
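
    As a rough illustration of the visual-analysis idea (the published NMRNet architecture is not reproduced here), the sketch below defines a small convolutional network that classifies fixed-size patches of a 2D spectrum as peak versus non-peak. It assumes PyTorch is available; layer sizes and the patch dimension are invented.

    # A minimal sketch (assumed architecture, not the published NMRNet) of a
    # CNN that classifies 32x32 spectrum patches as "peak" vs "no peak".
    import torch
    import torch.nn as nn

    class PeakPatchClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.head = nn.Linear(16 * 8 * 8, 2)   # 2 classes: peak / not peak

        def forward(self, x):              # x: (batch, 1, 32, 32) spectrum patches
            h = self.features(x)
            return self.head(h.flatten(1))

    model = PeakPatchClassifier()
    patches = torch.randn(4, 1, 32, 32)    # four random stand-in patches
    logits = model(patches)
    print(logits.shape)                    # torch.Size([4, 2])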

  9. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades, further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared-memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
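
    The throttling idea can be sketched in a few lines: rather than submitting a whole processing stream at once, cap the number of concurrently running jobs at whatever the I/O subsystem sustains. The job body and concurrency limit below are hypothetical placeholders, not CATALYST code (which the abstract says is implemented in Java and Perl).

    # A minimal sketch of throttled job submission for an I/O-bound workload.
    from concurrent.futures import ThreadPoolExecutor, as_completed
    import time

    MAX_CONCURRENT_JOBS = 4            # the "throttle": tuned to the I/O capacity

    def run_job(orbit_id: int) -> str:
        time.sleep(0.1)                # stand-in for staging data and processing
        return f"orbit {orbit_id} processed"

    jobs = range(20)                   # a planning cycle's worth of orbits
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_JOBS) as pool:
        futures = [pool.submit(run_job, j) for j in jobs]
        for fut in as_completed(futures):
            print(fut.result())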

  10. Automated Endmember Selection for Nonlinear Unmixing of Lunar Spectra

    Science.gov (United States)

    Felder, M. P.; Grumpe, A.; Wöhler, C.; Mall, U.

    2013-09-01

    An important aspect of the analysis of remotely sensed lunar reflectance spectra is their decomposition into intimately mixed constituents. While some methods rely on unmixing of the observed reflectance spectra [1] or on the identification of minerals by extracting the depths and positions of mineral-specific absorption troughs [2, 3], these approaches do not allow for an automated selection of the (a priori unknown) endmembers from a large set of possible constituents. In this study, a non-linear spectral unmixing approach combined with an automated endmember selection scheme is proposed. This method is applied to reflectance spectra of the SIR-2 point spectrometer [4] carried by the Chandrayaan-1 spacecraft.

  11. Plant automation-application to SBWR project

    International Nuclear Information System (INIS)

    Rodriguez Rodriguez, C.

    1995-01-01

In accordance with the requirements set out in the URD (Utility Requirements Document) issued by the EPRI (Electric Power Research Institute), the design of new reactors, whether evolutionary or passive, shall take into account the systematic automation of functions relating to normal plant operation. The objectives established are to: simplify operator-performed tasks; reduce the risk of operator error by considering human factors in the allocation of tasks; improve man-machine reliability; and increase the availability of the plant. In previous designs, automation has only been considered from the point of view of compliance with regulatory requirements for safety-related systems or, in isolated cases, as a method of protecting the investment where there is a risk of damage to main equipment. The use of digital technology has kept the systematic pursuit of such objectives in the design of automated systems for processes associated with normal plant operation (startup, load follow, normal shutdown, etc.) from being excessively complex and therefore costly to undertake. This paper describes how the automation of the aforementioned normal plant operation activities has been approached in General Electric's SBWR (Simplified Boiling Water Reactor) design. (Author)

  12. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation

  13. Automated NMR fragment based screening identified a novel interface blocker to the LARG/RhoA complex.

    Directory of Open Access Journals (Sweden)

    Jia Gao

The small GTPase cycles between the inactive GDP form and the activated GTP form, catalyzed by upstream guanine exchange factors. The modulation of this process by small molecules has proven to be a fruitful route for therapeutic intervention to prevent the over-activation of the small GTPase. The fragment-based approach that has emerged in the past decade has demonstrated its paramount potential in the discovery of inhibitors targeting such novel and challenging protein-protein interactions. The details of the procedure of NMR fragment screening from scratch have rarely been disclosed comprehensively, which restricts its wider application. To achieve a consistent screening applicable to a number of targets, we developed a highly automated protocol to cover every aspect of NMR fragment screening as far as possible, including the construction of a small but diverse library, determination of aqueous solubility by NMR, grouping compounds with mutual dispersity into cocktails, and the automated processing and visualization of the ligand-based screening spectra. We exemplified our streamlined screening on RhoA alone and on the complex of the small GTPase RhoA and its upstream guanine exchange factor LARG. Two hits were confirmed from the primary screening in cocktails and secondary screening over individual hits for the LARG/RhoA complex, while one of them was also identified from the screening for RhoA alone. HSQC titration of the two hits over RhoA and LARG alone, respectively, identified one compound binding to RhoA.GDP at 0.11 mM affinity, which perturbed residues at the switch II region of RhoA. This hit blocked the formation of the LARG/RhoA complex, as validated by native gel electrophoresis and by titration of RhoA to ¹⁵N-labeled LARG in the absence and presence of the compound, respectively. It therefore provides a starting point toward a more potent inhibitor of RhoA activation catalyzed by LARG.
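
    One step of such a protocol, grouping compounds with mutual dispersity into cocktails, can be sketched as a greedy first-fit assignment in which no two compounds sharing a cocktail may have overlapping ¹H resonances. The shift data and tolerance below are invented, and the published protocol is certainly more sophisticated.

    # A minimal sketch, with invented chemical-shift data, of grouping fragment
    # compounds into NMR screening cocktails with non-overlapping 1H resonances.
    def make_cocktails(compounds, min_separation=0.05, max_size=5):
        """compounds: dict name -> list of 1H shifts (ppm)."""
        cocktails = []
        for name, shifts in compounds.items():
            placed = False
            for cocktail in cocktails:
                clash = any(abs(s - t) < min_separation
                            for member in cocktail
                            for s in shifts
                            for t in compounds[member])
                if not clash and len(cocktail) < max_size:
                    cocktail.append(name)
                    placed = True
                    break
            if not placed:
                cocktails.append([name])
        return cocktails

    library = {
        "frag1": [1.20, 3.45, 7.10],
        "frag2": [1.22, 6.80],        # clashes with frag1 at ~1.2 ppm
        "frag3": [2.50, 8.05],
    }
    print(make_cocktails(library))     # [['frag1', 'frag3'], ['frag2']]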

  14. VISMASHUP: streamlining the creation of custom visualization applications

    Energy Technology Data Exchange (ETDEWEB)

Ahrens, James P [Los Alamos National Laboratory]; Santos, Emanuele [Univ. of Utah]; Lins, Lauro [Univ. of Utah]; Freire, Juliana [Univ. of Utah]; Silva, Cláudio T [Univ. of Utah]

    2010-01-01

    Visualization is essential for understanding the increasing volumes of digital data. However, the process required to create insightful visualizations is involved and time consuming. Although several visualization tools are available, including tools with sophisticated visual interfaces, they are out of reach for users who have little or no knowledge of visualization techniques and/or who do not have programming expertise. In this paper, we propose VISMASHUP, a new framework for streamlining the creation of customized visualization applications. Because these applications can be customized for very specific tasks, they can hide much of the complexity in a visualization specification and make it easier for users to explore visualizations by manipulating a small set of parameters. We describe the framework and how it supports the various tasks a designer needs to carry out to develop an application, from mining and exploring a set of visualization specifications (pipelines), to the creation of simplified views of the pipelines, and the automatic generation of the application and its interface. We also describe the implementation of the system and demonstrate its use in two real application scenarios.

  15. An Accelerated Testing Approach for Automated Vehicles with Background Traffic Described by Joint Distributions

    OpenAIRE

    Huang, Zhiyuan; Lam, Henry; Zhao, Ding

    2017-01-01

    This paper proposes a new framework based on joint statistical models for evaluating risks of automated vehicles in a naturalistic driving environment. The previous studies on the Accelerated Evaluation for automated vehicles are extended from multi-independent-variate models to joint statistics. The proposed toolkit includes exploration of the rare event (e.g. crash) sets and construction of accelerated distributions for Gaussian Mixture models using Importance Sampling techniques. Furthermo...

  16. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development

  17. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  18. Status of automated nuclear scanning systems

    International Nuclear Information System (INIS)

    Gold, R.; Roberts, J.H.; Preston, C.C.; McNeece, J.P.; Ruddy, F.H.

    1983-07-01

Present day minicomputers and microprocessors enable a range of automation, from partial to total, of tasks once thought beyond approach. The status of three computer-controlled systems for quantitative track measurements is reviewed. Two systems, the Hanford optical track scanner (HOTS) and an automated scanning electron microscope (ASEM), are used for scanning solid state track recorders (SSTR). The third system, the emulsion scanning processor (ESP), is an interactive system used to measure the length of proton tracks in nuclear research emulsions (NRE). Current limitations of these systems for quantitative track scanning are presented. Experimental uncertainties attained with these computer-controlled systems are described using results obtained from reactor neutron dosimetry.

  19. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach for function allocation, but in high risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  20. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amounts of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages that automated data analysis has the potential to provide include the ability to analyze data more quickly, consistently and accurately than can be achieved manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental levels. A summary of the various commercial and developmental data analysis systems is provided, which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case-based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  1. A Systems Approach to Information Technology (IT) Infrastructure Design for Utility Management Automation Systems

    OpenAIRE

    A. Fereidunian; H. Lesani; C. Lucas; M. Lehtonen; M. M. Nordman

    2006-01-01

Almost all electric utility companies are planning to improve their management automation systems, in order to meet the changing requirements of the new liberalized energy market and to benefit from innovations in information and communication technology (ICT or IT). Architectural design of utility management automation (UMA) systems for IT-enabling requires proper selection of IT choices for the UMA system, which leads to multi-criteria decision-making (MCDM). In resp...

  2. Results of a multivariate approach to automated oestrus and mastitis detection

    NARCIS (Netherlands)

    Mol, de R.M.; Kroeze, G.H.; Achten, J.M.F.H.; Maatje, K.; Rossing, W.

    1997-01-01

    In modern dairy farming sensors can be used to measure on-line milk yield, milk temperature, electrical conductivity of quarter milk, concentrate intake and the cow's activity. Together with information from the management information system (MIS), the sensor data can be used for the automated

  3. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Directory of Open Access Journals (Sweden)

    Mutlu Ozdogan

In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
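
    Step (ii) of the procedure can be sketched as follows on synthetic arrays, simplified to a single global mean and standard deviation rather than the paper's local-window histograms: pixels whose SWIR-band difference lies beyond k standard deviations of the mean become disturbance training pixels, while pixels near the mean become stable-forest training pixels. The threshold values and data are invented.

    # A minimal sketch, using synthetic arrays, of threshold-based training
    # pixel extraction from a SWIR difference image (global statistics here,
    # unlike the paper's local windows).
    import numpy as np

    def training_pixels(swir_t1, swir_t2, k=2.0):
        diff = swir_t2.astype(float) - swir_t1.astype(float)
        mu, sigma = diff.mean(), diff.std()
        disturbed = diff > mu + k * sigma      # brightening after canopy loss
        stable = np.abs(diff - mu) < 0.5 * sigma
        return disturbed, stable

    rng = np.random.default_rng(42)
    t1 = rng.normal(0.2, 0.02, (200, 200))     # pre-disturbance SWIR reflectance
    t2 = t1 + rng.normal(0, 0.005, t1.shape)
    t2[50:60, 50:60] += 0.15                   # simulated harvest patch
    disturbed, stable = training_pixels(t1, t2)
    print("disturbed training pixels:", int(disturbed.sum()))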

  4. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Jay W. Grate; Timothy A. DeVol

    2006-01-01

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors

  5. CCD characterization and measurements automation

    International Nuclear Information System (INIS)

    Kotov, I.V.; Frank, J.; Kotov, A.I.; Kubanek, P.; O'Connor, P.; Prouza, M.; Radeka, V.; Takacs, P.

    2012-01-01

Modern mosaic cameras have grown both in size and in number of sensors. The required volume of sensor testing and characterization has grown accordingly. For camera projects as large as the LSST, test automation becomes a necessity. A CCD testing and characterization laboratory was built and is in operation for the LSST project. Characterization of LSST study contract sensors has been performed. The characterization process and its automation are discussed, and results are presented. Our system automatically acquires images, populates a database with metadata information, and runs express analysis. This approach is illustrated on ⁵⁵Fe data analysis. ⁵⁵Fe data are used to measure gain, charge transfer efficiency and charge diffusion. Examples of express analysis results are presented and discussed.
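
    The reason ⁵⁵Fe data yield a gain measurement is that each 5.9 keV Mn K-alpha X-ray liberates a nearly fixed number of electrons in silicon (about 1620 e⁻), so the ADU position of the K-alpha peak fixes the conversion gain. The sketch below, on invented single-pixel event amplitudes, reduces peak-finding to a histogram maximum; it is not the authors' express-analysis code.

    # A minimal sketch of CCD gain estimation from 55Fe single-pixel events.
    import numpy as np

    ELECTRONS_PER_KALPHA = 1620.0        # e- produced by a 5.9 keV X-ray in Si

    def gain_from_fe55(event_amplitudes_adu):
        counts, edges = np.histogram(event_amplitudes_adu, bins=200)
        peak_adu = 0.5 * (edges[:-1] + edges[1:])[counts.argmax()]
        return ELECTRONS_PER_KALPHA / peak_adu   # gain in e-/ADU

    rng = np.random.default_rng(7)
    events = rng.normal(405.0, 8.0, 5000)        # synthetic K-alpha events (ADU)
    print(f"gain = {gain_from_fe55(events):.2f} e-/ADU")   # ~4.0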

  6. Proposed Model for a Streamlined, Cohesive, and Optimized K-12 STEM Curriculum with a Focus on Engineering

    Science.gov (United States)

    Locke, Edward

    2009-01-01

    This article presents a proposed model for a clear description of K-12 age-possible engineering knowledge content, in terms of the selection of analytic principles and predictive skills for various grades, based on the mastery of mathematics and science pre-requisites, as mandated by national or state performance standards; and a streamlined,…

  7. Microdiversification in genome-streamlined ubiquitous freshwater Actinobacteria.

    Science.gov (United States)

    Neuenschwander, Stefan M; Ghai, Rohit; Pernthaler, Jakob; Salcher, Michaela M

    2018-01-01

    Actinobacteria of the acI lineage are the most abundant microbes in freshwater systems, but there are so far no pure living cultures of these organisms, possibly because of metabolic dependencies on other microbes. This, in turn, has hampered an in-depth assessment of the genomic basis for their success in the environment. Here we present genomes from 16 axenic cultures of acI Actinobacteria. The isolates were not only of minute cell size, but also among the most streamlined free-living microbes, with extremely small genome sizes (1.2-1.4 Mbp) and low genomic GC content. Genome reduction in these bacteria might have led to auxotrophy for various vitamins, amino acids and reduced sulphur sources, thus creating dependencies to co-occurring organisms (the 'Black Queen' hypothesis). Genome analyses, moreover, revealed a surprising degree of inter- and intraspecific diversity in metabolic pathways, especially of carbohydrate transport and metabolism, and mainly encoded in genomic islands. The striking genotype microdiversification of acI Actinobacteria might explain their global success in highly dynamic freshwater environments with complex seasonal patterns of allochthonous and autochthonous carbon sources. We propose a new order within Actinobacteria ('Candidatus Nanopelagicales') with two new genera ('Candidatus Nanopelagicus' and 'Candidatus Planktophila') and nine new species.

  8. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-01-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  9. System analysis of automated speed enforcement implementation.

    Science.gov (United States)

    2016-04-01

    Speeding is a major factor in a large proportion of traffic crashes, injuries, and fatalities in the United States. Automated Speed Enforcement (ASE) is one of many approaches shown to be effective in reducing speeding violations and crashes. However...

  10. On the engineering design for systematic integration of agent-orientation in industrial automation.

    Science.gov (United States)

    Yu, Liyong; Schüller, Andreas; Epple, Ulrich

    2014-09-01

    In today's automation industry, agent-oriented development of system functionalities appears to have a great potential for increasing autonomy and flexibility of complex operations, while lowering the workload of users. In this paper, we present a reference model for the harmonious and systematic integration of agent-orientation in industrial automation. Considering compatibility with existing automation systems and best practice, this model combines advantages of function block technology, service orientation and native description methods from the automation standard IEC 61131-3. This approach can be applied as a guideline for the engineering design of future agent-oriented automation systems. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Streamlined Total Synthesis of Trioxacarcins and Its Application to the Design, Synthesis, and Biological Evaluation of Analogues Thereof. Discovery of Simpler Designed and Potent Trioxacarcin Analogues.

    Science.gov (United States)

    Nicolaou, K C; Chen, Pengxi; Zhu, Shugao; Cai, Quan; Erande, Rohan D; Li, Ruofan; Sun, Hongbao; Pulukuri, Kiran Kumar; Rigol, Stephan; Aujay, Monette; Sandoval, Joseph; Gavrilyuk, Julia

    2017-11-01

    A streamlined total synthesis of the naturally occurring antitumor agents trioxacarcins is described, along with its application to the construction of a series of designed analogues of these complex natural products. Biological evaluation of the synthesized compounds revealed a number of highly potent, and yet structurally simpler, compounds that are effective against certain cancer cell lines, including a drug-resistant line. A novel one-step synthesis of anthraquinones and chloro anthraquinones from simple ketone precursors and phenylselenyl chloride is also described. The reported work, featuring novel chemistry and cascade reactions, has potential applications in cancer therapy, including targeted approaches as in antibody-drug conjugates.

  12. Rapid prototyping of an automated video surveillance system: a hardware-software co-design approach

    Science.gov (United States)

    Ngo, Hau T.; Rakvic, Ryan N.; Broussard, Randy P.; Ives, Robert W.

    2011-06-01

    FPGA devices with embedded DSP and memory blocks, and high-speed interfaces are ideal for real-time video processing applications. In this work, a hardware-software co-design approach is proposed to effectively utilize FPGA features for a prototype of an automated video surveillance system. Time-critical steps of the video surveillance algorithm are designed and implemented in the FPGA's logic elements to maximize parallel processing. Other non-time-critical tasks are achieved by executing a high-level language program on an embedded Nios-II processor. Pre-tested and verified video and interface functions from a standard video framework are utilized to significantly reduce development and verification time. Custom and parallel processing modules are integrated into the video processing chain by Altera's Avalon Streaming video protocol. Other data control interfaces are achieved by connecting hardware controllers to a Nios-II processor using Altera's Avalon Memory Mapped protocol.

  13. RBioplot: an easy-to-use R pipeline for automated statistical analysis and data visualization in molecular biology and biochemistry

    Directory of Open Access Journals (Sweden)

    Jing Zhang

    2016-09-01

    Background: Statistical analysis and data visualization are two crucial aspects of molecular biology and biochemistry. For analyses that compare one dependent variable between a standard (e.g., control) and one or multiple independent variables, a comprehensive yet highly streamlined solution is valuable. The computer programming language R is a popular platform for researchers to develop tools that are tailored specifically to their research needs. Here we present an R package, RBioplot, that takes raw input data for automated statistical analysis and plotting, highly compatible with various molecular biology and biochemistry lab techniques, such as, but not limited to, western blotting, PCR, and enzyme activity assays. Method: The package is built on workflows operating on a simple raw data layout, with minimal user input or data manipulation required. The package is distributed through GitHub and can be installed with a single-line R command. A detailed installation guide is available at http://kenstoreylab.com/?page_id=2448. Users can also download demo datasets from the same website. Results and Discussion: By integrating selected functions from existing statistical and data visualization packages with extensive customization, RBioplot features both statistical analysis and data visualization functionalities. Key properties of RBioplot include: (1) fully automated and comprehensive statistical analysis, including normality test, equal variance test, Student’s t-test and ANOVA (with post-hoc tests); (2) fully automated histogram, heatmap and joint-point curve plotting modules; (3) detailed output files for statistical analysis, data manipulation and high-quality graphs; (4) axis range finding and user-customizable tick settings; (5) high user-customizability.
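
    RBioplot itself is an R package; purely as an illustration, the same automated decision flow (normality test, equal-variance test, then the appropriate comparison test) might look like this in Python with SciPy. The data and significance threshold here are invented.

        import numpy as np
        from scipy import stats

        def auto_compare(groups, alpha=0.05):
            # Shapiro-Wilk normality check on every group.
            normal = all(stats.shapiro(g).pvalue > alpha for g in groups)
            # Levene's test for equal variances.
            equal_var = stats.levene(*groups).pvalue > alpha
            if len(groups) == 2:
                if normal:
                    return stats.ttest_ind(*groups, equal_var=equal_var)
                return stats.mannwhitneyu(*groups)
            return stats.f_oneway(*groups) if normal else stats.kruskal(*groups)

        rng = np.random.default_rng(0)
        control = rng.normal(1.0, 0.1, 8)   # e.g. control-band densitometry
        treated = rng.normal(1.4, 0.1, 8)   # e.g. treated-band densitometry
        print(auto_compare([control, treated]))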

  14. Are the new automated methods for bone age estimation advantageous over the manual approaches?

    Science.gov (United States)

    De Sanctis, Vincenzo; Soliman, Ashraf T; Di Maio, Salvatore; Bedair, Said

    2014-12-01

    Bone Age Assessment (BAA) is performed worldwide for the evaluation of endocrine, genetic and chronic diseases, to monitor response to medical therapy and to determine the growth potential of children and adolescents. It is also used for consultation in planning orthopedic procedures, for determination of chronological age for adopted children, youth sports participation and in forensic settings. The main clinical methods for skeletal bone age estimation are the Greulich and Pyle (GP) and the Tanner and Whitehouse (TW) methods. Seventy-six percent (76%) of radiologists or pediatricians usually use the method of GP, 20% that of TW and 4% other methods. The advantages of using the TW method, as opposed to the GP method, are that it overcomes the subjectivity problem and results are more reproducible. However, it is complex and time consuming; for this reason its usage is just about 20% on a world-wide scale. Moreover, there is some evidence that bone age assignments by different physicians can differ significantly. Computerized and Quantitative Ultrasound Technologies (QUS) for assessing skeletal maturity have been developed with the aim of reducing many of the inconsistencies associated with radiographic investigations. In spite of the fact that the volume of automated methods for BAA has increased, the majority of them are still in an early phase of development. QUS is comparable to the GP-based method, but there is not enough established data yet for the healthy population. The authors wish to draw attention to the accuracy, reliability and consistency of BAA and to initiate a debate on manual versus automated approaches, to enhance our assessment of skeletal maturation in children and adolescents.

  15. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project that was designed and implemented to automate a house's electricity and to provide a security system that detects the presence of unexpected behavior.

  16. Automated vocabulary discovery for geo-parsing online epidemic intelligence.

    Science.gov (United States)

    Keller, Mikaela; Freifeld, Clark C; Brownstein, John S

    2009-11-24

    Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need for human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.
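
    A toy sketch of the seed-gazetteer-plus-context idea follows; the seed list, window size and score threshold are arbitrary illustration choices, not HealthMap's.

        from collections import Counter

        SEED_GAZETTEER = {"nairobi", "jakarta", "lima"}   # small expert-built seed

        def context_profile(tokens, window=1):
            # Count words that appear next to known locations.
            profile = Counter()
            for i, tok in enumerate(tokens):
                if tok.lower() in SEED_GAZETTEER:
                    lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                    profile.update(t.lower() for t in tokens[lo:hi]
                                   if t.lower() not in SEED_GAZETTEER)
            return profile

        def candidate_locations(tokens, profile, window=1, min_score=2):
            # Flag capitalized tokens whose neighbours match learned contexts.
            hits = []
            for i, tok in enumerate(tokens):
                if tok[0].isupper() and tok.lower() not in SEED_GAZETTEER:
                    lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                    if sum(profile[t.lower()] for t in tokens[lo:hi]) >= min_score:
                        hits.append(tok)
            return hits

        text = ("An outbreak was reported in Nairobi on Monday . "
                "Cases reported in Mombasa on Tuesday .").split()
        print(candidate_locations(text, context_profile(text)))   # ['Mombasa']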

  17. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 330: Areas 6, 22, and 23 Tanks and Spill Sites, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    T. M. Fitzmaurice

    2001-08-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the action necessary for the closure of Corrective Action Unit (CAU) 330, Areas 6, 22, and 23 Tanks and Spill Sites. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) and is located at the Nevada Test Site (NTS) (Figure 1). CAU 330 consists of the following Corrective Action Sites (CASs): (1) CAS 06-02-04 - Consists of an underground tank and piping. This CAS is close to an area that was part of the Animal Investigation Program (AIP), conducted under the U.S. Public Health Service. Its purpose was to study and perform tests on the cattle and wild animals in and around the NTS that were exposed to radionuclides. It is unknown if this tank was part of these operations. (2) CAS 22-99-06 - Is a fuel spill that is believed to be a waste oil release which occurred when Camp Desert Rock was an active facility. This CAS was originally identified as being a small depression where liquids were poured onto the ground, located on the west side of Building T-1001. This building has been identified as housing a fire station, radio station, and radio net remote and telephone switchboard. (3) CAS 23-01-02 - Is a large aboveground storage tank (AST) farm that was constructed to provide gasoline and diesel storage in Area 23. The site consists of two ASTs, a concrete foundation, a surrounding earthen berm, associated piping, and unloading stations. (4) CAS 23-25-05 - Consists of an asphalt oil spill/tar release that contains a wash covered with asphalt oil/tar material, a half buried 208-liter (L) (55-gallon [gal]) drum, rebar, and concrete located in the vicinity.

  18. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Background: Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results: We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93 gene clone set. Conclusion: ACE was designed to facilitate high throughput clone sequence
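
    The discrepancy-object strategy can be pictured as follows; the field names and acceptance thresholds are invented for illustration and are not ACE's actual schema.

        from collections import Counter
        from dataclasses import dataclass

        @dataclass
        class Discrepancy:
            kind: str       # e.g. "silent", "missense", "frameshift"
            position: int   # position in the clone sequence

        # Acceptance criteria: maximum allowed count per discrepancy type.
        CRITERIA = {"silent": 5, "missense": 1, "frameshift": 0}

        def acceptable(discrepancies, criteria=CRITERIA):
            # A clone passes if every discrepancy type is known and no type
            # exceeds its permitted count; re-running with different criteria
            # re-evaluates the same clones for another experiment.
            counts = Counter(d.kind for d in discrepancies)
            return all(kind in criteria and n <= criteria[kind]
                       for kind, n in counts.items())

        clone = [Discrepancy("silent", 102), Discrepancy("missense", 310)]
        print(acceptable(clone))   # True under these criteria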

  19. Brownfields Assessing Contractor Capabilities for Streamlined Site Investigations -- Additional Information Regarding All Appropriate Inquiries and Hiring an Environmental Professional (November 2006)

    Science.gov (United States)

    Guidance for Brownfields grantees and other decision makers to assess the capabilities of contractors and consultants to determine their qualifications to provide streamlined and innovative strategies for the assessment and cleanup of brownfields.

  20. Theoretical Calculations on Sediment Transport on Titan, and the Possible Production of Streamlined Forms

    Science.gov (United States)

    Burr, D. M.; Emery, J. P.; Lorenz, R. D.

    2005-01-01

    The Cassini Imaging Science System (ISS) has been returning images of Titan, along with other Saturnian satellites. Images taken through the 938 nm methane window see down to Titan's surface. One of the purposes of the Cassini mission is to investigate possible fluid cycling on Titan. Lemniscate features shown recently and radar evidence of surface flow prompted us to consider theoretically the creation by methane fluid flow of streamlined forms on Titan. This follows work by other groups in theoretical consideration of fluid motion on Titan's surface.

  1. Automated Functional Testing based on the Navigation of Web Applications

    Directory of Open Access Journals (Sweden)

    Boni García

    2011-08-01

    Web applications are becoming more and more complex. Testing such applications is an intricate, hard, and time-consuming activity. Therefore, testing is often poorly performed or skipped by practitioners. Test automation can help to avoid this situation. Hence, this paper presents a novel approach to performing automated software testing for web applications based on their navigation. On the one hand, web navigation is the process of traversing a web application using a browser. On the other hand, functional requirements are actions that an application must perform. Therefore, the evaluation of the correct navigation of web applications results in the assessment of the specified functional requirements. The proposed method performs the automation at four levels: test case generation, test data derivation, test case execution, and test case reporting. This method is driven by three kinds of inputs: (i) UML models; (ii) Selenium scripts; (iii) XML files. We have implemented our approach in an open-source testing framework named Automatic Testing Platform. The validation of this work has been carried out by means of a case study, in which the target is a real invoice management system developed using a model-driven approach.
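
    As a minimal illustration of a navigation-driven functional test, the Selenium Python bindings can walk a navigation path and assert the expected outcome at each step. The URL and element locators below are hypothetical, and this sketch is not the Automatic Testing Platform itself.

        from selenium import webdriver
        from selenium.webdriver.common.by import By

        driver = webdriver.Chrome()   # assumes a local chromedriver
        try:
            # Step 1 of the navigation path: open the (hypothetical) list page.
            driver.get("http://localhost:8080/invoices")
            assert "Invoices" in driver.title

            # Step 2: follow the link the use case specifies.
            driver.find_element(By.LINK_TEXT, "New invoice").click()

            # The functional requirement holds if the form page is reached.
            assert driver.find_element(By.ID, "invoice-form").is_displayed()
            print("navigation path PASSED")
        finally:
            driver.quit()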

  2. Automated quantification of aligned collagen for human breast carcinoma prognosis

    Directory of Open Access Journals (Sweden)

    Jeremy S Bredfeldt

    2014-01-01

    Background: Mortality in cancer patients is directly attributable to the ability of cancer cells to metastasize to distant sites from the primary tumor. This migration of tumor cells begins with a remodeling of the local tumor microenvironment, including changes to the extracellular matrix and the recruitment of stromal cells, both of which facilitate invasion of tumor cells into the bloodstream. In breast cancer, it has been proposed that the alignment of collagen fibers surrounding tumor epithelial cells can serve as a quantitative image-based biomarker for survival of invasive ductal carcinoma patients. Specific types of collagen alignment have been identified for their prognostic value and now these tumor associated collagen signatures (TACS) are central to several clinical specimen imaging trials. Here, we implement the semi-automated acquisition and analysis of this TACS candidate biomarker and demonstrate a protocol that will allow consistent scoring to be performed throughout large patient cohorts. Methods: Using large field-of-view, high-resolution microscopy techniques, image processing and supervised learning methods, we are able to quantify and score features of collagen fiber alignment with respect to adjacent tumor-stromal boundaries. Results: Our semi-automated technique produced scores that have statistically significant correlation with scores generated by a panel of three human observers. In addition, our system generated classification scores that accurately predicted survival in a cohort of 196 breast cancer patients. Feature rank analysis reveals that TACS-positive fibers are better aligned with each other, are of generally lower density, and terminate within or near groups of epithelial cells at larger angles of interaction. Conclusion: These results demonstrate the utility of a supervised learning protocol for streamlining the analysis of collagen alignment with respect to tumor stromal boundaries.

  3. ASUPT Automated Objective Performance Measurement System.

    Science.gov (United States)

    Waag, Wayne L.; And Others

    To realize its full research potential, a need exists for the development of an automated objective pilot performance evaluation system for use in the Advanced Simulation in Undergraduate Pilot Training (ASUPT) facility. The present report documents the approach taken for the development of performance measures and also presents data collected…

  4. Formal Test Automation: A Simple Experiment

    NARCIS (Netherlands)

    Belinfante, Axel; Feenstra, J.; de Vries, R.G.; Tretmans, G.J.; Goga, N.; Feijs, Loe; Mauw, Sjouke; Heerink, A.W.; Csopaki, Gyula; Dibuz, Sarolta; Tarnay, Katalin

    1999-01-01

    In this paper we study the automation of test derivation and execution in the area of conformance testing. The test scenarios are derived from multiple specification languages: LOTOS, Promela and SDL. A central theme of this study is the usability of batch-oriented and on-the-fly testing approaches.

  5. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained…

  6. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
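
    As a rough numerical illustration of the two rates (the task counts and load values are invented, and the paper's actual cognitive measure rests on Conant's information-theoretic model, which is not reproduced here):

        # System automation rate: share of work processes taken over.
        tasks_total = 40
        tasks_automated = 25
        system_rate = tasks_automated / tasks_total            # 0.625

        # Cognitive automation rate: share of the operator's cognitive
        # task load removed by automation (load units are arbitrary).
        load_manual = 120.0        # estimated load, fully manual operation
        load_automated = 45.0      # estimated load with automation support
        cognitive_rate = 1.0 - load_automated / load_manual    # 0.625

        print(f"system automation rate:    {system_rate:.3f}")
        print(f"cognitive automation rate: {cognitive_rate:.3f}")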

  7. Towards an Automated Acoustic Detection System for Free Ranging Elephants.

    Science.gov (United States)

    Zeppelzauer, Matthias; Hensman, Sean; Stoeger, Angela S

    The human-elephant conflict is one of the most serious conservation problems in Asia and Africa today. The involuntary confrontation of humans and elephants claims the lives of many animals and humans every year. A promising approach to alleviate this conflict is the development of an acoustic early warning system. Such a system requires the robust automated detection of elephant vocalizations under unconstrained field conditions. Today, no system exists that fulfills these requirements. In this paper, we present a method for the automated detection of elephant vocalizations that is robust to the diverse noise sources present in the field. We evaluate the method on a dataset recorded under natural field conditions to simulate a real-world scenario. The proposed method outperformed existing approaches and robustly and accurately detected elephants. It thus can form the basis for a future automated early warning system for elephants. Furthermore, the method may be a useful tool for scientists in bioacoustics for the study of wildlife recordings.

  8. Innovation of the Process of Inventorying of the Selected Transport Units: Case Study in the Automotive Industry

    Directory of Open Access Journals (Sweden)

    Chocholáč Jan

    2017-05-01

    In today's highly competitive environment, when the emphasis is on shortening delivery schedules, streamlining the production cycle and, ultimately, reducing total costs, businesses are forced into optimization and innovation. The article deals with the inventorying of selected transport units in warehouses. The inventory is currently carried out through the manual labor of employees. This paper proposes a possible implementation of a new and innovative approach to inventory control, with the help of an automated inventory realized by drones.

  9. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  10. Containerless automated processing of intermetallic compounds and composites

    Science.gov (United States)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes aluminides containing Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  11. An Automated Approach to Syntax-based Analysis of Classical Latin

    Directory of Open Access Journals (Sweden)

    Anjalie Field

    2016-12-01

    The goal of this study is to present an automated method for analyzing the style of Latin authors. Many of the common automated methods in stylistic analysis are based on lexical measures, which do not work well with Latin because of the language’s high degree of inflection and free word order. In contrast, this study focuses on analysis at a syntax level by examining two constructions, the ablative absolute and the cum clause. These constructions are often interchangeable, which suggests an author’s choice of construction is typically more stylistic than functional. We first identified these constructions in hand-annotated texts. Next we developed a method for identifying the constructions in unannotated texts, using probabilistic morphological tagging. Our methods identified constructions with enough accuracy to distinguish among different genres and different authors. In particular, we were able to determine which book of Caesar’s Commentarii de Bello Gallico was not written by Caesar. Furthermore, the usage of ablative absolutes and cum clauses observed in this study is consistent with the usage scholars have observed when analyzing these texts by hand. The proposed methods for an automatic syntax-based analysis are shown to be valuable for the study of classical literature.
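
    A toy sketch of spotting one of these constructions from morphological tags follows; the tag scheme and adjacency rule are deliberate simplifications of what probabilistic tagging provides.

        # Each token is (word, tags); "abl" marks ablative case,
        # "ptc" a participle, in this toy tag scheme.
        sentence = [
            ("urbe",     {"noun", "abl"}),
            ("capta",    {"ptc", "abl"}),
            ("hostes",   {"noun", "nom"}),
            ("fugerunt", {"verb"}),
        ]

        def find_ablative_absolutes(tokens):
            # Flag adjacent ablative-noun + ablative-participle pairs.
            hits = []
            for (w1, t1), (w2, t2) in zip(tokens, tokens[1:]):
                if {"noun", "abl"} <= t1 and {"ptc", "abl"} <= t2:
                    hits.append((w1, w2))
            return hits

        print(find_ablative_absolutes(sentence))   # [('urbe', 'capta')]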

  12. Summary of astronaut inputs on automation and robotics for Space Station Freedom

    Science.gov (United States)

    Weeks, David J.

    1990-01-01

    Astronauts and payload specialists present specific recommendations in the form of an overview that relate to the use of automation and robotics on the Space Station Freedom. The inputs are based on on-orbit operations experience, time requirements for crews, and similar crew-specific knowledge that address the impacts of automation and robotics on productivity. Interview techniques and specific questionnaire results are listed, and the majority of the responses indicate that incorporating automation and robotics to some extent and with human backup can improve productivity. Specific support is found for the use of advanced automation and EVA robotics on the Space Station Freedom and for the use of advanced automation on ground-based stations. Ground-based control of in-flight robotics is required, and Space Station activities and crew tasks should be analyzed to assess the systems engineering approach for incorporating automation and robotics.

  13. An Extended Case Study Methodology for Investigating Influence of Cultural, Organizational, and Automation Factors on Human-Automation Trust

    Science.gov (United States)

    Koltai, Kolina Sun; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Johnson, Walter; Cacanindin, Artemio

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Contributing factors include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  14. An approach for automated analysis of particle holograms

    Science.gov (United States)

    Stanton, A. C.; Caulfield, H. J.; Stewart, G. W.

    1984-01-01

    A simple method for analyzing droplet holograms is proposed that is readily adaptable to automation using modern image digitizers and analyzers for determination of the number, location, and size distributions of spherical or nearly spherical droplets. The method determines these parameters by finding the spatial location of best focus of the droplet images. With this location known, the particle size may be determined by direct measurement of image area in the focal plane. Particle velocity and trajectory may be determined by comparison of image locations at different instants in time. The method is tested by analyzing digitized images from a reconstructed in-line hologram, and the results show that the method is more accurate than a time-consuming plane-by-plane search for sharpest focus.
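
    One common way to implement the best-focus search is to maximize a sharpness metric across reconstruction planes; below is a sketch using the variance of the Laplacian, with the optical reconstruction step stubbed out by synthetic blur.

        import numpy as np
        from scipy import ndimage

        def sharpness(img):
            # Variance of the Laplacian: largest at the in-focus plane.
            return ndimage.laplace(img.astype(float)).var()

        def best_focus_plane(planes):
            scores = [sharpness(p) for p in planes]
            return int(np.argmax(scores)), scores

        # Stand-in for reconstructed planes: a disc blurred by varying amounts.
        disc = np.zeros((64, 64))
        disc[28:36, 28:36] = 1.0
        planes = [ndimage.gaussian_filter(disc, s) for s in (4.0, 2.0, 0.5)]

        idx, _ = best_focus_plane(planes)
        print("best focus at plane", idx)   # plane 2, the least blurred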

  15. Development and application of the automated Monte Carlo biasing procedure in SAS4

    International Nuclear Information System (INIS)

    Tang, J.S.; Broadhead, B.L.

    1993-01-01

    An automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete-ordinates calculation are used to generate biasing parameters for a three-dimensional Monte Carlo calculation. The automated procedure consisting of cross-section processing, adjoint flux determination, biasing parameter generation, and the initiation of a MORSE-SGC/S Monte Carlo calculation has been implemented in the SAS4 module of the SCALE computer code system. The automated procedure has been used extensively in the investigation of both computational and experimental benchmarks for the NEACRP working group on shielding assessment of transportation packages. The results of these studies indicate that with the automated biasing procedure, Monte Carlo shielding calculations of spent fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost. The systematic biasing approach described in this paper can also be applied to other similar shielding problems

  16. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    failures and anomalies of avionic systems are also incorporated. The resultant model helps simulate the emergence of automation-related issues in today's modern airliners from a top-down, generalized approach, which serves as a platform to evaluate NASA developed technologies

  17. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  18. Human-centred automation programme: review of experiment related studies

    International Nuclear Information System (INIS)

    Grimstad, Tone; Andresen, Gisle; Skjerve, Ann Britt Miberg

    2000-04-01

    Twenty-three empirical studies concerning automation and performance have been reviewed. The purposes of the review are to support experimental studies in the Human-Centred Automation (HCA) programme and to develop a general theory on HCA. Each study was reviewed with regard to twelve study characteristics: domain, type of study, purpose, definition of automation, variables, theoretical basis, models of operator performance, methods applied, experimental design, outcome, stated scope of results, strengths and limitations. Seven of the studies involved domain experts, the rest used students as participants. The majority of the articles originated from the aviation domain: only the study conducted in HAMMLAB considered process control in power plants. In the experimental studies, the independent variable was level of automation, or reliability of automation, while the most common dependent variables were workload, situation awareness, complacency, trust, and criteria of performance, e.g., number of correct responses or response time. Although the studies highlight important aspects of human-automation interaction, it is still unclear how system performance is affected. Nevertheless, the fact that many factors seem to be involved is taken as support for the system-oriented approach of the HCA programme. In conclusion, the review provides valuable input both to the design of experiments and to the development of a general theory. (Author). refs

  19. Project chariot remediation - the use of DOE's observational approach for environmental restoration with elements of the new DOE safer approach

    International Nuclear Information System (INIS)

    Hopkins, A.; Stewart, C.; Cabble, K.

    1994-01-01

    The primary purpose of Project Chariot was to investigate the technical problems and assess the effect of the proposed harbor excavation using nuclear explosives in Alaska. However, no nuclear devices were brought to the Project Chariot site. Between 1959 and 1961 various environmental tests were conducted. During the course of these environmental studies, the U.S. Geological Survey (USGS) granted the use of up to 5 curies of radioactive material at the Chariot site in Cape Thompson, Alaska; however only 26 millicuries were ever actually used. The tests were conducted in 12 test plots which were later gathered together and mixed with in situ soils, generating approximately 1,600 cubic feet of soil. This area was then covered with four feet of clean soil, creating a mound. In 1962, the site was abandoned. A researcher at the University of Alaska at Fairbanks obtained information regarding the tests conducted and the materials left at the Project Chariot site. In response to concerns raised through the publication of this information, the Department of Energy (DOE) decided that total remediation of the mound should be completed within the year. During the summer of 1993, IT Corporation carried out the assessment and remediation of the Project Chariot site using a streamlined approach to waste site decision making called the Observational Approach (OA), with added elements of the new DOE Streamlined Approach for Environmental Restoration (SAFER). This remediation and the remediation approach applied are described

  20. Automated single-trial assessment of laser-evoked potentials as an objective functional diagnostic tool for the nociceptive system.

    Science.gov (United States)

    Hatem, S M; Hu, L; Ragé, M; Gierasimowicz, A; Plaghki, L; Bouhassira, D; Attal, N; Iannetti, G D; Mouraux, A

    2012-12-01

    To assess the clinical usefulness of an automated analysis of event-related potentials (ERPs). Nociceptive laser-evoked potentials (LEPs) and non-nociceptive somatosensory electrically-evoked potentials (SEPs) were recorded in 37 patients with syringomyelia and 21 controls. LEP and SEP peak amplitudes and latencies were estimated using a single-trial automated approach based on time-frequency wavelet filtering and multiple linear regression, as well as a conventional approach based on visual inspection. The amplitudes and latencies of normal and abnormal LEP and SEP peaks were identified reliably using both approaches, with similar sensitivity and specificity. Because the automated approach provided an unbiased solution to account for average waveforms where no ERP could be identified visually, it revealed significant differences between patients and controls that were not revealed using the visual approach. The automated analysis of ERPs reliably and objectively characterized LEP and SEP waveforms in patients. The automated single-trial analysis can be used to characterize normal and abnormal ERPs with a similar sensitivity and specificity as visual inspection. While this does not justify its use in a routine clinical setting, the technique could be useful to avoid observer-dependent biases in clinical research. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
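
    In outline, the single-trial idea — denoise, then regress each trial on a peak template to read off amplitude and latency — might look like the following. This is a simplified stand-in (Gaussian template, first-order latency term), not the authors' wavelet implementation.

        import numpy as np

        def fit_single_trials(trials, template):
            # Regress each trial on the template and its derivative: the
            # template coefficient estimates amplitude; to first order,
            # -(derivative coef)/(amplitude) estimates the latency shift
            # in samples, since f(t - d) ~ f(t) - d*f'(t).
            X = np.column_stack([template, np.gradient(template),
                                 np.ones_like(template)])
            coefs, *_ = np.linalg.lstsq(X, trials.T, rcond=None)
            amp = coefs[0]
            shift = -coefs[1] / np.maximum(np.abs(amp), 1e-9)
            return amp, shift

        t = np.linspace(0.0, 1.0, 200)
        template = np.exp(-((t - 0.35) ** 2) / 0.002)   # idealized peak
        rng = np.random.default_rng(0)
        trials = np.array([2.0 * np.exp(-((t - 0.36) ** 2) / 0.002)
                           + 0.1 * rng.standard_normal(t.size)
                           for _ in range(5)])
        amp, shift = fit_single_trials(trials, template)
        print(amp.round(2), shift.round(1))   # amplitudes ~2, shifts ~2 samples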

  1. Interprofessional education and social interaction: The use of automated external defibrillators in team-based basic life support.

    Science.gov (United States)

    Onan, Arif; Simsek, Nurettin

    2017-04-01

    Automated external defibrillators are pervasive computing devices designed for the treatment and management of acute sudden cardiac arrest. This study aims to explain users' actual use behavior in teams formed by different professions after a short time span of interaction with an automated external defibrillator. Before the intervention, all the participants were certified with the American Heart Association Basic Life Support for healthcare providers. A statistically significant difference was revealed in mean individual automated external defibrillator technical skills between uniprofessional and interprofessional groups. The technical automated external defibrillator team scores were greater for groups with interprofessional than for those with uniprofessional education. The nontechnical automated external defibrillator skills of interprofessional and uniprofessional teams revealed differences to the advantage of interprofessional teams. Students positively accept automated external defibrillators if well-defined and validated training opportunities to use them expertly are available. Uniprofessional teams were successfully supported by their members and thereby used the automated external defibrillator effectively. Furthermore, the interprofessional approach resulted in teamwork as effective as that of the uniprofessional approach.

  2. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 398: Area 25 Spill Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    K. B. Campbell

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the activities necessary to close Corrective Action Unit (CAU) 398: Area 25 Spill Sites. CAU 398, located in Area 25 of the Nevada Test Site, is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) (FFACO, 1996), and consists of the following 13 Corrective Action Sites (CASs) (Figure 1): (1) CAS 25-44-01, a fuel spill on soil that covers a concrete pad. The origins and use of the spill material are unknown, but the spill is suspected to be railroad bedding material. (2) CAS 25-44-02, a spill of liquid to the soil from leaking drums. (3) CAS 25-44-03, a spill of oil from two leaking drums onto a concrete pad and surrounding soil. (4) CAS 25-44-04, a spill from two tanks containing sulfuric acid and sodium hydroxide used for a water demineralization process. (5) CAS 25-25-02, a fuel or oil spill from leaking drums that were removed in 1992. (6) CAS 25-25-03, an oil spill adjacent to a tipped-over drum. The source of the drum is not listed, although it is noted that the drum was removed in 1991. (7) CAS 25-25-04, an area on the north side of the Engine-Maintenance, Assembly, and Disassembly (E-MAD) facility, where oils and cooling fluids from metal machining operations were poured directly onto the ground. (8) CAS 25-25-05, an area of oil and/or hydraulic fluid spills beneath the heavy equipment once stored there. (9) CAS 25-25-06, an area of diesel fuel staining beneath two generators that have since been removed. (10) CAS 25-25-07, an area of hydraulic oil spills associated with a tunnel-boring machine abandoned inside X-Tunnel. (11) CAS 25-25-08, an area of hydraulic fluid spills associated with a tunnel-boring machine abandoned inside Y-Tunnel. (12) CAS 25-25-16, a diesel fuel spill from an above-ground storage tank located near Building 3320 at Engine Test Stand-1 (ETS-1) that was removed in 1998. (13) CAS 25-25-17, a hydraulic oil spill

  3. What's a Manager to Do about Office Automation?

    Science.gov (United States)

    Sherron, Gene

    1984-01-01

    Some observations about office technology in higher education are presented. University of Maryland plans concerning its approach to office automation are discussed. Seventeen features considered "mandatories" for any system that might be acquired are identified. (Author/MLW)

  4. Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…

  5. Automated-biasing approach to Monte Carlo shipping-cask calculations

    International Nuclear Information System (INIS)

    Hoffman, T.J.; Tang, J.S.; Parks, C.V.; Childs, R.L.

    1982-01-01

    Computer Sciences at Oak Ridge National Laboratory, under a contract with the Nuclear Regulatory Commission, has developed the SCALE system for performing standardized criticality, shielding, and heat transfer analyses of nuclear systems. During the early phase of shielding development in SCALE, it was established that Monte Carlo calculations of radiation levels exterior to a spent fuel shipping cask would be extremely expensive. This cost can be substantially reduced by proper biasing of the Monte Carlo histories. The purpose of this study is to develop and test an automated biasing procedure for the MORSE-SGC/S module of the SCALE system

  6. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study; that is, it demonstrates the feasibility of applying formal analysis through an actual application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972)

  7. Automated spectral and timing analysis of AGNs

    Science.gov (United States)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user to automate the XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light curves, and their power spectra, and it proves useful to assess the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal the emission features in the 2-8 keV range.
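
    An energy-resolved power-spectrum density can be built from an event list by binning events into one light curve per energy band and taking the periodogram of each. A minimal numpy sketch with simulated events follows; the event-list layout here is an assumption, not the script's actual format.

        import numpy as np

        def energy_resolved_psd(times, energies, bands, dt=10.0):
            # One (frequencies, powers) pair per energy band.
            edges = np.arange(times.min(), times.max() + dt, dt)
            out = {}
            for lo, hi in bands:
                sel = (energies >= lo) & (energies < hi)
                lc, _ = np.histogram(times[sel], bins=edges)  # counts per bin
                lc = lc - lc.mean()                           # remove DC level
                out[(lo, hi)] = (np.fft.rfftfreq(lc.size, d=dt),
                                 np.abs(np.fft.rfft(lc)) ** 2)
            return out

        rng = np.random.default_rng(2)
        times = np.sort(rng.uniform(0.0, 1000.0, 5000))   # seconds
        energies = rng.uniform(2.0, 8.0, 5000)            # keV
        for band, (f, p) in energy_resolved_psd(
                times, energies, bands=[(2, 4), (4, 8)]).items():
            print(band, "max power:", p[1:].max().round(1))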

  8. Automated Procurement System (APS): Project management plan (DS-03), version 1.2

    Science.gov (United States)

    Murphy, Diane R.

    1994-01-01

    The National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) is implementing an Automated Procurement System (APS) to streamline its business activities that are used to procure goods and services. This Project Management Plan (PMP) is the governing document throughout the implementation process and is identified as the APS Project Management Plan (DS-03). At this point in time, the project plan includes the schedules and tasks necessary to proceed through implementation. Since the basis of APS is an existing COTS system, the implementation process is revised from the standard SDLC. The purpose of the PMP is to provide the framework for the implementation process. It discusses the roles and responsibilities of the NASA project staff, the functions to be performed by the APS Development Contractor (PAI), and the support required of the NASA computer support contractor (CSC). To be successful, these three organizations must work together as a team, working towards the goals established in this Project Plan. The Project Plan includes a description of the proposed system, describes the work to be done, establishes a schedule of deliverables, and discusses the major standards and procedures to be followed.

  9. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
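
    The essence of such a derivative-generating compiler — propagating derivatives through arithmetic alongside values — can be illustrated with forward-mode automatic differentiation over dual numbers. This is a generic illustration of the idea, not GRESS itself.

        import math

        class Dual:
            """Carries a value and its derivative through arithmetic."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val,
                            self.der * o.val + self.val * o.der)  # product rule
            __rmul__ = __mul__

        def dual_exp(x):
            return Dual(math.exp(x.val), math.exp(x.val) * x.der)  # chain rule

        # d/dx of 3*x*exp(x) + x at x = 1 is 6*e + 1, about 17.31.
        x = Dual(1.0, 1.0)              # seed: dx/dx = 1
        y = 3 * x * dual_exp(x) + x
        print(y.val, y.der)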

  10. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  11. Automated Test Case Generation

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only a few of them consider dependencies from outside the Code Under Test’s scope such...

  12. Streamlining of the RELAP5-3D Code

    International Nuclear Information System (INIS)

    Mesina, George L; Hykes, Joshua; Guillen, Donna Post

    2007-01-01

    RELAP5-3D is widely used by the nuclear community to simulate general thermal hydraulic systems and has proven to be so versatile that the spectrum of transient two-phase problems that can be analyzed has increased substantially over time. To accommodate the many new types of problems that are analyzed by RELAP5-3D, both the physics and numerical methods of the code have been continuously improved. In the area of computational methods and mathematical techniques, many upgrades and improvements have been made to decrease code run time and increase solution accuracy. These include vectorization, parallelization, use of improved equation solvers for thermal hydraulics and neutron kinetics, and incorporation of improved library utilities. In the area of applied nuclear engineering, expanded capabilities include boron and level tracking models, radiation/conduction enclosure model, feedwater heater and compressor components, fluids and corresponding correlations for modeling Generation IV reactor designs, and coupling to computational fluid dynamics solvers. Ongoing and proposed future developments include improvements to the two-phase pump model, conversion to FORTRAN 90, and coupling to more computer programs. This paper summarizes the general improvements made to RELAP5-3D, with an emphasis on streamlining the code infrastructure for improved maintenance and development. With all these past, present and planned developments, it is necessary to modify the code infrastructure to incorporate modifications in a consistent and maintainable manner. Modifying a complex code such as RELAP5-3D to incorporate new models, upgrade numerics, and optimize existing code becomes more difficult as the code grows larger. The difficulty of this, as well as the chance of introducing errors, is significantly reduced when the code is structured. To streamline the code into a structured program, a commercial restructuring tool, FOR_STRUCT, was applied to the RELAP5-3D source files. The

  13. A linear programming approach for placement of applicants to academic programs

    OpenAIRE

    Kassa, Biniyam Asmare

    2013-01-01

    This paper reports a linear programming approach for the placement of applicants to study programs, developed and implemented at the College of Business & Economics, Bahir Dar University, Bahir Dar, Ethiopia. The approach is estimated to significantly streamline the placement decision process at the college by reducing the required man-hours as well as the time it takes to announce placement decisions. Compared to the previous manual system, where only one or two placement criteria were considered, the ...
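
    Such a placement problem can be posed as an assignment LP; a small sketch with scipy.optimize.linprog follows, using invented preference costs and seat capacities rather than the college's actual criteria.

        import numpy as np
        from scipy.optimize import linprog

        # cost[i, j]: rank of program j on applicant i's preference list.
        cost = np.array([[1, 2, 3],
                         [2, 1, 3],
                         [1, 3, 2],
                         [3, 1, 2]], dtype=float)
        n_app, n_prog = cost.shape
        capacity = [2, 1, 1]                       # seats per program

        # Variables x[i, j] flattened row-major; each applicant placed once.
        A_eq = np.zeros((n_app, n_app * n_prog))
        for i in range(n_app):
            A_eq[i, i * n_prog:(i + 1) * n_prog] = 1
        # Each program's capacity is respected.
        A_ub = np.zeros((n_prog, n_app * n_prog))
        for j in range(n_prog):
            A_ub[j, j::n_prog] = 1

        res = linprog(cost.ravel(), A_ub=A_ub, b_ub=capacity,
                      A_eq=A_eq, b_eq=np.ones(n_app), bounds=(0, 1))
        print(res.x.reshape(n_app, n_prog).round(2))   # 0/1 placement matrix

    Because this constraint matrix is totally unimodular, the LP relaxation already yields an integral (0/1) placement, so no integer programming solver is needed for this small sketch.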

  14. Automated mitosis detection using texture, SIFT features and HMAX biologically inspired approach.

    Science.gov (United States)

    Irshad, Humayun; Jalali, Sepehr; Roux, Ludovic; Racoceanu, Daniel; Hwee, Lim Joo; Naour, Gilles Le; Capron, Frédérique

    2013-01-01

    According to the Nottingham grading system, mitosis count in breast cancer histopathology is one of three components required for cancer grading and prognosis. Manual counting of mitosis is tedious and subject to considerable inter- and intra-reader variations. The aim is to investigate the various texture features and the Hierarchical Model and X (HMAX) biologically inspired approach for mitosis detection using machine-learning techniques. We propose an approach that assists pathologists in automated mitosis detection and counting. The proposed method, which is based on the most favorable combination of texture features, examines the separability between different channels of color space. The blue-ratio channel provides more discriminative information for mitosis detection in histopathological images. Co-occurrence features, run-length features, and Scale-Invariant Feature Transform (SIFT) features were extracted and used in the classification of mitosis. Finally, a classification is performed to put the candidate patch either in the mitosis class or in the non-mitosis class. Three different classifiers have been evaluated: decision tree, linear-kernel Support Vector Machine (SVM), and non-linear-kernel SVM. We also evaluate the performance of the proposed framework using the modified biologically inspired model of HMAX and compare the results with other feature extraction methods such as dense SIFT. The proposed method has been tested on the Mitosis detection in breast cancer histological images (MITOS) dataset provided for the International Conference on Pattern Recognition (ICPR) 2012 contest. The proposed framework achieved 76% recall, 75% precision and 76% F-measure. Different frameworks for classification have been evaluated for mitosis detection. In future work, instead of regions, we intend to compute features on the results of mitosis contour segmentation and use them to improve detection and classification rate.
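
    One formulation of the blue-ratio transform reported in the mitosis-detection literature is sketched below; the exact constants may differ from those used in this particular paper.

        import numpy as np

        def blue_ratio(rgb):
            # Emphasizes pixels where blue dominates and overall
            # intensity is low, as in hematoxylin-stained nuclei.
            r = rgb[..., 0].astype(float)
            g = rgb[..., 1].astype(float)
            b = rgb[..., 2].astype(float)
            return (100.0 * b / (1.0 + r + g)) * (256.0 / (1.0 + r + g + b))

        # Toy check: a blue "nucleus" pixel vs. a pink "stroma" pixel.
        patch = np.array([[[60, 50, 180], [220, 160, 200]]], dtype=np.uint8)
        print(blue_ratio(patch).round(1))   # nucleus scores far higher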

  15. A systematic engineering tool chain approach for self-organizing building automation systems

    NARCIS (Netherlands)

    Mc Gibney, A.; Rea, S.; Lehmann, M.; Thior, S.; Lesecq, S.; Hendriks, M.; Guyon-Gardeux, C.; Mai, Linh Tuan; Pacull, F.; Ploennigs, J.; Basten, T.; Pesch, D.

    2013-01-01

    There is a strong push towards smart buildings that aim to achieve comfort, safety and energy efficiency, through building automation systems (BAS) that incorporate multiple subsystems such as heating and air-conditioning, lighting, access control etc. The design, commissioning and operation of BAS

  16. Automated audiometry using apple iOS-based application technology.

    Science.gov (United States)

    Foulad, Allen; Bui, Peggy; Djalilian, Hamid

    2013-11-01

    The aim of this study is to determine the feasibility of an Apple iOS-based automated hearing testing application and to compare its accuracy with conventional audiometry. This was a prospective diagnostic study conducted at an academic medical center. An iOS-based software application was developed to perform automated pure-tone hearing testing on the iPhone, iPod touch, and iPad. To assess for device variations and compatibility, preliminary work was performed to compare the standardized sound output (dB) of various Apple device and headset combinations. Forty-two subjects underwent automated iOS-based hearing testing in a sound booth, automated iOS-based hearing testing in a quiet room, and conventional manual audiometry. The maximum difference in sound intensity between various Apple device and headset combinations was 4 dB. On average, 96% (95% confidence interval [CI], 91%-100%) of the threshold values obtained using the automated test in a sound booth were within 10 dB of the corresponding threshold values obtained using conventional audiometry. When the automated test was performed in a quiet room, 94% (95% CI, 87%-100%) of the threshold values were within 10 dB of the threshold values obtained using conventional audiometry. Under standardized testing conditions, 90% of the subjects preferred iOS-based audiometry over conventional audiometry. Apple iOS-based devices provide a platform for automated air conduction audiometry without requiring extra equipment and yield hearing test results that approach those of conventional audiometry.
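
    The record does not state which threshold-seeking algorithm the app uses; the sketch below shows the modified Hughson-Westlake staircase ("down 10 dB after a response, up 5 dB after no response") that automated pure-tone audiometry is conventionally built on, with a simulated listener standing in for a subject.

        # Simplified Hughson-Westlake staircase; the app's actual algorithm
        # is an assumption, and the 2-of-3 ascending rule is approximated by
        # counting repeated responses at the same level.
        def heard(level_db, true_threshold_db):
            return level_db >= true_threshold_db     # idealized listener

        def find_threshold(true_threshold_db, start_db=40):
            level, responses = start_db, []
            for _ in range(50):                      # safety cap on trials
                if heard(level, true_threshold_db):
                    responses.append(level)
                    if responses.count(level) >= 2:  # threshold criterion
                        return level
                    level -= 10                      # heard: drop 10 dB
                else:
                    level += 5                       # not heard: raise 5 dB
            return None

        print(find_threshold(true_threshold_db=23))  # -> 25 (nearest 5 dB step)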

  17. A streamlined artificial variable free version of simplex method.

    Directory of Open Access Journals (Sweden)

    Syed Inayatullah

    This paper proposes a streamlined form of the simplex method which provides several benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints, and it can start with any feasible or infeasible basis of an LP. The method follows the same pivoting sequence as simplex phase 1 without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented, which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to start with either the primal or the dual artificial-free version without reformulating the LP structure. Last but not least, the method provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.

  18. A streamlined artificial variable free version of simplex method.

    Science.gov (United States)

    Inayatullah, Syed; Touheed, Nasir; Imtiaz, Muhammad

    2015-01-01

    This paper proposes a streamlined form of the simplex method which provides several benefits over the traditional simplex method. For instance, it does not need any kind of artificial variables or artificial constraints, and it can start with any feasible or infeasible basis of an LP. The method follows the same pivoting sequence as simplex phase 1 without any explicit description of artificial variables, which also makes it space efficient. Later in this paper, a dual version of the new method is also presented, which provides a way to easily implement phase 1 of the traditional dual simplex method. For a problem whose initial basis is both primal and dual infeasible, our methods give the user full freedom to start with either the primal or the dual artificial-free version without reformulating the LP structure. Last but not least, the method provides a teaching aid for teachers who want to teach feasibility achievement as a separate topic before teaching optimality achievement.
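
    For orientation, a minimal dense-tableau primal simplex is sketched below. Note that it assumes a feasible slack basis (b >= 0), which is exactly the requirement the paper's artificial-variable-free method removes; only the pivoting machinery shared by both is shown, not the paper's own rules.

        # Minimal primal simplex for max c'x s.t. Ax <= b, x >= 0 with b >= 0.
        import numpy as np

        def simplex(c, A, b):
            m, n = A.shape
            T = np.zeros((m + 1, n + m + 1))         # tableau [A | I | b; -c | 0 | 0]
            T[:m, :n], T[:m, n:n + m], T[:m, -1] = A, np.eye(m), b
            T[-1, :n] = -c
            basis = list(range(n, n + m))            # start from the slack basis
            while True:
                j = int(np.argmin(T[-1, :-1]))       # entering column
                if T[-1, j] >= -1e-9:                # optimal
                    x = np.zeros(n + m)
                    x[basis] = T[:m, -1]
                    return x[:n], T[-1, -1]
                ratios = [T[i, -1] / T[i, j] if T[i, j] > 1e-9 else np.inf
                          for i in range(m)]
                i = int(np.argmin(ratios))           # leaving row (min-ratio test)
                if ratios[i] == np.inf:
                    raise ValueError("unbounded LP")
                T[i] /= T[i, j]                      # pivot
                for k in range(m + 1):
                    if k != i:
                        T[k] -= T[k, j] * T[i]
                basis[i] = j

        x, z = simplex(np.array([3., 2.]),
                       np.array([[1., 1.], [1., 0.]]),
                       np.array([4., 2.]))
        print(x, z)                                  # [2. 2.] 10.0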

  19. Automated delineation of stroke lesions using brain CT images

    Directory of Open Access Journals (Sweden)

    Céline R. Gillebert

    2014-01-01

    Computed tomographic (CT) images are widely used for the identification of abnormal brain tissue following infarct and hemorrhage in stroke. Manual lesion delineation is currently the standard approach, but is both time-consuming and operator-dependent. To address these issues, we present a method that can automatically delineate infarct and hemorrhage in stroke CT images. The key elements of this method are the accurate normalization of CT images from stroke patients into template space and the subsequent voxelwise comparison with a group of control CT images for defining areas with hypo- or hyper-intense signals. Our validation, using simulated and actual lesions, shows that our approach is effective in reconstructing lesions resulting from both infarct and hemorrhage and yields lesion maps spatially consistent with those produced manually by expert operators. A limitation is that, relative to manual delineation, there is reduced sensitivity of the automated method in regions close to the ventricles and the brain contours. However, the automated method presents a number of benefits in terms of offering significant time savings and the elimination of the inter-operator differences inherent to manual tracing approaches. These factors are relevant for the creation of large-scale lesion databases for neuropsychological research. The automated delineation of stroke lesions from CT scans may also enable longitudinal studies to quantify changes in damaged tissue in an objective and reproducible manner.
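
    A minimal sketch of the voxelwise comparison step, assuming normalization to template space has already been performed with standard tools; the z-score threshold and the synthetic data are illustrative, not the paper's parameters.

        # Voxelwise comparison of a normalized patient CT against controls.
        import numpy as np

        def delineate(patient, controls, z_thresh=3.0):
            """patient: (x, y, z) array; controls: (n, x, y, z) array."""
            mu, sd = controls.mean(axis=0), controls.std(axis=0) + 1e-6
            z = (patient - mu) / sd
            return z < -z_thresh, z > z_thresh       # hypo (infarct), hyper (bleed)

        rng = np.random.default_rng(0)
        controls = rng.normal(40, 5, size=(20, 32, 32, 16))  # fake HU values
        patient = controls.mean(axis=0).copy()
        patient[10:15, 10:15, 5:8] -= 30                     # synthetic infarct
        hypo, hyper = delineate(patient, controls)
        print(int(hypo.sum()), int(hyper.sum()))             # flagged voxels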

  20. Automated vocabulary discovery for geo-parsing online epidemic intelligence

    Directory of Open Access Journals (Sweden)

    Freifeld Clark C

    2009-11-01

    Background: Automated surveillance of the Internet provides a timely and sensitive method for alerting on global emerging infectious disease threats. HealthMap is part of a new generation of online systems designed to monitor and visualize, on a real-time basis, disease outbreak alerts as reported by online news media and public health sources. HealthMap is of specific interest for national and international public health organizations and international travelers. A particular task that makes such surveillance useful is the automated discovery of the geographic references contained in the retrieved outbreak alerts. This task is sometimes referred to as "geo-parsing". A typical approach to geo-parsing would demand an expensive training corpus of alerts manually tagged by a human. Results: Given that human readers perform this kind of task by using both their lexical and contextual knowledge, we developed an approach which relies on a relatively small expert-built gazetteer, thus limiting the need for human input, but focuses on learning the context in which geographic references appear. We show in a set of experiments that this approach exhibits a substantial capacity to discover geographic locations outside of its initial lexicon. Conclusion: The results of this analysis provide a framework for future automated global surveillance efforts that reduce manual input and improve timeliness of reporting.
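
    A toy sketch of the core idea, a seed gazetteer plus learned contexts; the one-word-context heuristic and the alert texts are crude stand-ins for HealthMap's richer model.

        # Gazetteer-seeded context learning for geo-parsing (toy version):
        # learn the words that precede known locations, then propose unseen
        # capitalized tokens appearing in the same contexts.
        import re
        from collections import Counter

        gazetteer = {"Kenya", "Jakarta", "Peru"}
        alerts = [
            "An outbreak was reported in Kenya on Monday.",
            "Cases confirmed in Jakarta after heavy flooding.",
            "Officials in Peru announced new measures.",
            "Three infections detected in Antsirabe this week.",
        ]

        contexts = Counter()
        for text in alerts:
            tokens = re.findall(r"[A-Za-z]+", text)
            for i, tok in enumerate(tokens[1:], start=1):
                if tok in gazetteer:
                    contexts[tokens[i - 1].lower()] += 1   # e.g. "in"

        candidates = set()
        for text in alerts:
            tokens = re.findall(r"[A-Za-z]+", text)
            for i, tok in enumerate(tokens[1:], start=1):
                if (tok[0].isupper() and tok not in gazetteer
                        and contexts[tokens[i - 1].lower()] > 0):
                    candidates.add(tok)

        print(candidates)   # {'Antsirabe'}: discovered outside the lexicon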

  1. Automated MR morphometry to predict Alzheimer's disease in mild cognitive impairment

    International Nuclear Information System (INIS)

    Fritzsche, Klaus H.; Schlindwein, Sarah; Bruggen, Thomas van; Meinzer, Hans-Peter; Stieltjes, Bram; Essig, Marco

    2010-01-01

    Prediction of progression from mild cognitive impairment (MCI) to Alzheimer's disease (AD) is challenging but essential for early treatment. This study aims to investigate the use of hippocampal atrophy markers for the automatic detection of MCI converters and to compare the predictive value to manually obtained hippocampal volume and temporal horn width. A study was performed with 15 patients with Alzheimer's disease and 18 patients with MCI (ten converted, eight remained stable in a 3-year follow-up) as well as 15 healthy subjects. MRI scans were obtained at baseline and evaluated with an automated system for scoring of hippocampal atrophy. The predictive value of the automated system was compared with manual measurements of hippocampal volume and temporal horn width in the same subjects. The conversion to AD was correctly predicted in 77.8% of the cases (sensitivity 70%, specificity 87.5%) in the MCI group using automated morphometry and a plain linear classifier that was trained on the AD and healthy groups. Classification was improved by limiting analysis to the left cerebral hemisphere (accuracy 83.3%, sensitivity 70%, specificity 100%). The manual linear and volumetric approaches reached accuracies of 66.7% (sensitivity 40%, specificity 100%) and 72.2% (sensitivity 60%, specificity 87.5%), respectively. The automatic approach fulfills many important preconditions for clinical application. Contrary to the manual approaches, it is not observer-dependent and reduces human resource requirements. Automated assessment may be useful for individual patient assessment and for predicting progression to dementia. (orig.)
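
    A sketch of the stated scheme, a plain linear classifier trained on the AD and healthy groups and then applied to the MCI cohort; the two-feature layout (one atrophy score per hemisphere) and all values below are synthetic assumptions, not the study's data.

        # Linear classifier trained on AD vs. healthy, applied to MCI cases.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        ad      = rng.normal([2.5, 2.7], 0.5, size=(15, 2))   # AD atrophy scores
        healthy = rng.normal([1.0, 1.0], 0.5, size=(15, 2))   # control scores

        X_train = np.vstack([ad, healthy])
        y_train = np.array([1] * 15 + [0] * 15)               # 1 = AD-like

        clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

        mci = rng.normal([1.8, 1.9], 0.7, size=(18, 2))       # MCI cohort
        print(clf.predict(mci))                               # 1 = predicted converter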

  2. Streamlined Approach for Environmental Restoration Plan for Corrective Action Unit 330: Areas 6, 22, and 23 Tanks and Spill Sites, Nevada Test Site, Nevada; TOPICAL

    International Nuclear Information System (INIS)

    T. M. Fitzmaurice

    2001-01-01

    This Streamlined Approach for Environmental Restoration (SAFER) plan addresses the actions necessary for the closure of Corrective Action Unit (CAU) 330, Areas 6, 22, and 23 Tanks and Spill Sites. This CAU is currently listed in Appendix III of the Federal Facility Agreement and Consent Order (FFACO) and is located at the Nevada Test Site (NTS) (Figure 1). CAU 330 consists of the following Corrective Action Sites (CASs): (1) CAS 06-02-04 - Consists of an underground tank and piping. This CAS is close to an area that was part of the Animal Investigation Program (AIP), conducted under the U.S. Public Health Service, whose purpose was to study and perform tests on the cattle and wild animals in and around the NTS that were exposed to radionuclides. It is unknown if this tank was part of these operations. (2) CAS 22-99-06 - Is a fuel spill, believed to be a waste oil release, which occurred when Camp Desert Rock was an active facility. This CAS was originally identified as a small depression where liquids were poured onto the ground, located on the west side of Building T-1001. This building has been identified as housing a fire station, radio station, and radio net remote and telephone switchboard. (3) CAS 23-01-02 - Is a large aboveground storage tank (AST) farm that was constructed to provide gasoline and diesel storage in Area 23. The site consists of two ASTs, a concrete foundation, a surrounding earthen berm, associated piping, and unloading stations. (4) CAS 23-25-05 - Consists of an asphalt oil spill/tar release that contains a wash covered with asphalt oil/tar material, a half-buried 208-liter (L) (55-gallon [gal]) drum, rebar, and concrete located in the vicinity

  3. Automated Classification of Asteroids into Families at Work

    Science.gov (United States)

    Knežević, Zoran; Milani, Andrea; Cellino, Alberto; Novaković, Bojan; Spoto, Federica; Paolicchi, Paolo

    2014-07-01

    We have recently proposed a new approach to asteroid family classification, combining the classical HCM method with an automated procedure for adding newly discovered members to existing families. This approach is specifically intended to cope with ever-increasing asteroid data sets, and consists of several steps that segment the problem and handle the very large amount of data in an efficient and accurate manner. We briefly present all these steps and show the results from three subsequent updates making use of only the automated step of attributing the newly numbered asteroids to the known families. We describe the changes in the membership of individual families, as well as the evolution of the classification due to newly added intersections between families, resolved candidate family mergers, and the emergence of new candidate mergers. We thus demonstrate how, with the new approach, the asteroid family classification becomes stable in general terms (converging towards a permanent list of confirmed families) while at the same time evolving in its details (to account for newly discovered asteroids) at each update.
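
    A sketch of the attribution step, using the standard HCM distance metric in proper-element space; the cutoff and element values are illustrative, not taken from the paper.

        # Attribute a newly numbered asteroid to a family if its distance in
        # proper-element space to some member falls below the family cutoff.
        # Standard HCM metric: d = n*a*sqrt(k1*(da/a)^2 + k2*de^2 + k3*dsinI^2)
        # with the usual coefficients k1 = 5/4, k2 = k3 = 2.
        import numpy as np

        def hcm_distance(p, q, n_times_a=20000.0):
            """p, q: proper elements (a [AU], e, sin I); n*a in m/s (approx.)."""
            da_over_a = (p[0] - q[0]) / p[0]
            return n_times_a * np.sqrt(1.25 * da_over_a**2
                                       + 2.0 * (p[1] - q[1])**2
                                       + 2.0 * (p[2] - q[2])**2)

        family = np.array([[2.576, 0.150, 0.122],      # member proper elements
                           [2.581, 0.148, 0.120]])
        newcomer = np.array([2.579, 0.151, 0.121])

        cutoff = 60.0                                   # m/s, family-dependent
        dists = [hcm_distance(newcomer, m) for m in family]
        print(min(dists), min(dists) < cutoff)          # attribute if below cutoff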

  4. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    AlliedSignal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of ''pixels'' which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented
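
    A sketch of the underlying image-analysis idea, thresholding followed by connected-component labeling to count and size voids; the threshold and the image are synthetic, not the system's actual parameters.

        # Count and size dark voids in a grayscale core image.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        img = rng.normal(180, 10, size=(200, 200))      # bright concrete matrix
        img[50:60, 80:95] = 40                          # synthetic air void
        img[120:128, 30:36] = 35                        # another void

        voids = img < 100                               # dark pixels = candidates
        labels, n_voids = ndimage.label(voids)          # connected components
        sizes = ndimage.sum(voids, labels, range(1, n_voids + 1))
        print(n_voids, sizes)                           # 2 voids, areas in pixels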

  5. Application Filters for TCP/IP Industrial Automation Protocols

    Science.gov (United States)

    Batista, Aguinaldo B.; Kobayashi, Tiago H.; Medeiros, João Paulo S.; Brito, Agostinho M.; Motta Pires, Paulo S.

    The use of firewalls is a common approach usually meant to secure Automation Technology (AT) from Information Technology (IT) networks. This work proposes a filtering system for TCP/IP-based automation networks in which only certain kinds of industrial traffic are permitted. All network traffic which does not conform to a proper industrial protocol pattern or to specific rules for its actions is presumed to be abnormal and must be blocked. As a case study, we developed a seventh-layer firewall application with the ability to block spurious traffic, using an IP packet queueing engine and a regular expression library.
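
    A sketch of one such application-layer rule, assuming Modbus/TCP as the industrial protocol (the paper does not name one): a payload passes only if it parses as a well-formed request with a whitelisted function code. The real firewall hooks a packet queueing engine; only the payload check is shown here.

        # Validate a TCP payload as a well-formed Modbus/TCP request whose
        # function code is on a whitelist (read-only operations here).
        import struct

        ALLOWED_FUNCTIONS = {3, 4}        # read holding / input registers

        def modbus_request_ok(payload: bytes) -> bool:
            if len(payload) < 8:          # MBAP header (7 B) + function code
                return False
            trans_id, proto_id, length, unit_id = struct.unpack(">HHHB",
                                                                payload[:7])
            if proto_id != 0:             # Modbus/TCP protocol id must be 0
                return False
            if length != len(payload) - 6:  # length covers unit id + PDU
                return False
            return payload[7] in ALLOWED_FUNCTIONS

        # Read-holding-registers request (function 3): passes.
        good = struct.pack(">HHHBBHH", 1, 0, 6, 17, 3, 0, 2)
        # Write-single-register request (function 6): blocked.
        bad = struct.pack(">HHHBBHH", 2, 0, 6, 17, 6, 0, 1)
        print(modbus_request_ok(good), modbus_request_ok(bad))   # True False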

  6. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning access network infrastructure, employing GIS data and a set of algorithms to make the planning process more automatic. The method explains ... method. The method, however, does not fully automate the planning but makes the planning process significantly faster. The results and discussion are presented and a conclusion is given at the end....

  7. Holistic approach for automated background EEG assessment in asphyxiated full-term infants

    Science.gov (United States)

    Matic, Vladimir; Cherian, Perumpillichira J.; Koolen, Ninah; Naulaers, Gunnar; Swarte, Renate M.; Govaert, Paul; Van Huffel, Sabine; De Vos, Maarten

    2014-12-01

    Objective. To develop an automated algorithm to quantify background EEG abnormalities in full-term neonates with hypoxic ischemic encephalopathy. Approach. The algorithm classifies 1 h of continuous neonatal EEG (cEEG) into a mild, moderate or severe background abnormality grade. These classes are well established in the literature, and a clinical neurophysiologist labeled 272 1 h cEEG epochs selected from 34 neonates. The algorithm is based on adaptive EEG segmentation and mapping of the segments into the so-called segments' feature space. Three features are suggested, and further processing operates on a discretized three-dimensional distribution of the segments' features represented as a 3-way data tensor. Classification is then achieved using recently developed tensor decomposition/classification methods that reduce the size of the model and extract a significant and discriminative set of features. Main results. Effective parameterization of cEEG data has been achieved, resulting in high classification accuracy (89%) in grading background EEG abnormalities. Significance. For the first time, an algorithm for background EEG assessment has been validated on an extensive dataset which contained major artifacts and epileptic seizures. The demonstrated robustness in processing real-case EEGs suggests that the algorithm can be used as an assistive tool to monitor the severity of hypoxic insults in newborns.
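
    A sketch of the tensor-building step: each segment maps to three features, and an epoch is summarized as a discretized 3-D feature distribution. The particular features below are placeholders, and the tensor decomposition and classifier stages are not shown.

        # Summarize one epoch of adaptive EEG segments as a 3-way tensor.
        import numpy as np

        def epoch_tensor(segments, bins=8):
            """segments: (n_segments, n_samples) array."""
            feats = np.column_stack([
                segments.std(axis=1),                       # amplitude-like
                np.abs(np.diff(segments, axis=1)).mean(1),  # roughness proxy
                segments.mean(axis=1),                      # baseline shift
            ])
            hist, _ = np.histogramdd(feats, bins=bins)
            return hist / hist.sum()                        # (bins,)*3 tensor

        segs = np.random.default_rng(3).normal(size=(120, 256))
        T = epoch_tensor(segs)
        print(T.shape, T.sum())                             # (8, 8, 8) 1.0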

  8. Challenges and Obstacles of e-Government Streamlining: A Case Study

    Directory of Open Access Journals (Sweden)

    Anupam K. Nath

    2014-05-01

    e-Government streamlining has been a challenge since its inception in the domain of e-business. Business organizations face challenges while trying to collaborate with partners through the use of information technology in order to ensure efficient delivery of services. One of the major reasons for these inefficient services has been political bureaucracies among government organizations. To meet this challenge, a transparent and networked environment is required where government organizations can effectively partner with other relevant organizations. Using a case study analysis, we intend to identify not just the challenges in government organizations while providing services which require collaborative effort, but also the obstacles in adopting new technology for collaboration. We believe that the outcome of our research could provide a generalized guideline for government agencies where there is need for digital collaboration. Our findings will thus help government organizations to address the challenges in digital collaboration, and also help them implement new technology successfully to ensure efficient delivery of services.

  9. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation.
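
    A toy sketch of the idea: candidate control logic is exercised against a coarse simulation built from plant-design assumptions, with the process requirements checked as assertions. The tank model and its limits are invented for illustration.

        # Early validation by simulation: run control logic against a coarse
        # physics model and assert the process requirements hold.
        def controller(level):
            """Candidate control logic under test: open inlet below setpoint."""
            return level < 5.0                       # True = inlet valve open

        def simulate(steps=200, dt=0.1):
            level = 0.0
            for _ in range(steps):
                inflow = 2.0 if controller(level) else 0.0
                outflow = 0.5 * level                # assumed drain behavior
                level += (inflow - outflow) * dt
                # Process requirement from the plant design: never overfill.
                assert level <= 6.0, f"overfill at level={level:.2f}"
            return level

        print(f"final level: {simulate():.2f}")      # requirement held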

  10. An Automated Design Approach for High-Lift Systems incorporating Eccentric Beam Actuators

    NARCIS (Netherlands)

    Steenhuizen, D.; Van Tooren, M.J.L.

    2010-01-01

    In order to assess the merit of novel high-lift structural concepts for the design of contemporary and future transport aircraft, a highly automated design routine is elaborated. The structure, purpose and evolution of this design routine is set out with the use of Knowledge-Based Engineering

  11. Automated Scheduling Via Artificial Intelligence

    Science.gov (United States)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

  12. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    By a broad empirical study it is indicated that automation development show potential of improvement. In the paper, 13 lean product development principles are contrasted to the automation development process and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises a description of what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  13. Crew aiding and automation: A system concept for terminal area operations, and guidelines for automation design

    Science.gov (United States)

    Dwyer, John P.

    1994-01-01

    This research and development program comprised two efforts: the development of guidelines for the design of automated systems, with particular emphasis on automation design that takes advantage of contextual information, and the concept-level design of a crew aiding system, the Terminal Area Navigation Decision Aiding Mediator (TANDAM). This concept outlines a system capable of organizing navigation and communication information and assisting the crew in executing the operations required in descent and approach. In service of this endeavor, problem definition activities and operational familiarization exercises addressing the terminal area navigation problem were conducted. Both airborne and ground-based (ATC) elements of aircraft control were extensively researched. The TANDAM system concept was then specified, and the crew interface and associated systems described. Additionally, three descent and approach scenarios were devised in order to illustrate the principal functions of the TANDAM system concept in relation to the crew, the aircraft, and ATC. A plan for the evaluation of the TANDAM system was established. The guidelines were developed based on reviews of relevant literature, and on experience gained in the design effort.

  14. Automated intelligent emergency assessment of GTA pipeline events

    Energy Technology Data Exchange (ETDEWEB)

    Asgary, Ali; Ghaffari, Alireza; Kong, Albert [University of York, Toronto, (Canada)

    2010-07-01

    The traditional approach used for risk assessment in pipeline operations is stochastic, using probabilities of events. This paper reports on an investigation into the deployment of an automated intelligent reasoning system used in decision support for risk assessments related to oil and gas emergencies in the Greater Toronto Area (GTA). The study evaluated the use of fuzzy inference rules encoded using JESS and FuzzyJ to develop a risk assessment system. Real-time data from web services such as weather, Geographic Information Systems (GIS) and Supervisory Control and Data Acquisition (SCADA) systems were used. This study took into consideration the most recent communications infrastructure and technologies, involving the most advanced human machine interface (HMI) access via hypertext transfer protocol (HTTP). This new approach will support decision making in emergency response scenarios. The study showed that the convergence of several technologies may change the automated intelligence system design paradigm.

  15. Damage Detection with Streamlined Structural Health Monitoring Data

    Directory of Open Access Journals (Sweden)

    Jian Li

    2015-04-01

    The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelms the systems’ capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.

  16. Damage detection with streamlined structural health monitoring data.

    Science.gov (United States)

    Li, Jian; Deng, Jun; Xie, Weizhi

    2015-04-15

    The huge amounts of sensor data generated by large scale sensor networks in on-line structural health monitoring (SHM) systems often overwhelms the systems' capacity for data transmission and analysis. This paper presents a new concept for an integrated SHM system in which a streamlined data flow is used as a unifying thread to integrate the individual components of on-line SHM systems. Such an integrated SHM system has a few desirable functionalities including embedded sensor data compression, interactive sensor data retrieval, and structural knowledge discovery, which aim to enhance the reliability, efficiency, and robustness of on-line SHM systems. Adoption of this new concept will enable the design of an on-line SHM system with more uniform data generation and data handling capacity for its subsystems. To examine this concept in the context of vibration-based SHM systems, real sensor data from an on-line SHM system comprising a scaled steel bridge structure and an on-line data acquisition system with remote data access was used in this study. Vibration test results clearly demonstrated the prominent performance characteristics of the proposed integrated SHM system including rapid data access, interactive data retrieval and knowledge discovery of structural conditions on a global level.
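
    A sketch of the embedded-compression idea for vibration data: quantize, delta-encode, then entropy-code before transmission. The sampling rate, quantization step, and signal are illustrative, not the system's parameters.

        # Delta-encode quantized accelerometer samples, then deflate them.
        import numpy as np
        import zlib

        rng = np.random.default_rng(4)
        t = np.arange(0, 10, 1 / 500)                     # 500 Hz accelerometer
        signal = np.sin(2 * np.pi * 3.2 * t) + 0.05 * rng.normal(size=t.size)

        q = np.round(signal / 0.001).astype(np.int32)     # quantization step
        deltas = np.diff(q, prepend=q[0]).astype(np.int16)  # small, peaky values
        compressed = zlib.compress(deltas.tobytes(), 9)

        raw_bytes = q.astype(np.int16).nbytes
        print(f"{raw_bytes} B raw -> {len(compressed)} B "
              f"({raw_bytes / len(compressed):.1f}x)")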

  17. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  18. Computer-Automated Approach for Scoring Short Essays in an Introductory Statistics Course

    Science.gov (United States)

    Zimmerman, Whitney Alicia; Kang, Hyun Bin; Kim, Kyung; Gao, Mengzhao; Johnson, Glenn; Clariana, Roy; Zhang, Fan

    2018-01-01

    Over two semesters short essay prompts were developed for use with the Graphical Interface for Knowledge Structure (GIKS), an automated essay scoring system. Participants were students in an undergraduate-level online introductory statistics course. The GIKS compares students' writing samples with an expert's to produce keyword occurrence and…

  19. Automated scheduling and planning from theory to practice

    CERN Document Server

    Ozcan, Ender; Urquhart, Neil

    2013-01-01

    Solving scheduling problems has long presented a challenge for computer scientists and operations researchers. The field continues to expand as researchers and practitioners examine ever more challenging problems and develop automated methods capable of solving them. This book provides 11 case studies in automated scheduling, submitted by leading researchers from across the world. Each case study examines a challenging real-world problem by analysing the problem in detail before investigating how the problem may be solved using state of the art techniques. The areas covered include aircraft scheduling, microprocessor instruction scheduling, sports fixture scheduling, exam scheduling, personnel scheduling and production scheduling. Problem solving methodologies covered include exact as well as (meta)heuristic approaches, such as local search techniques, linear programming, genetic algorithms and ant colony optimisation. The field of automated scheduling has the potential to impact many aspects of our lives...

  20. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Groundwater Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing a rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  1. Defining the drivers for accepting decision making automation in air traffic management.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C; Williamson, Ann

    2011-04-01

    Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operation to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs) willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed traditional predictors of automation acceptance such as age, trust and job satisfaction explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose in which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.

  2. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  3. Influence of Cultural, Organizational, and Automation Capability on Human Automation Trust: A Case Study of Auto-GCAS Experimental Test Pilots

    Science.gov (United States)

    Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  4. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    International Nuclear Information System (INIS)

    Chess, Jordan J.; Montoya, Sergio A.; Harvey, Tyler R.; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E.; McMorran, Benjamin J.

    2017-01-01

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
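
    For orientation, the relation being simplified can be written as follows; this is the standard TIE from the phase-retrieval literature (notation assumed here, not quoted from the paper), with the uniform-intensity reduction that enables single-image reconstruction:

        % Standard TIE: I is intensity, \varphi the phase, k = 2\pi/\lambda.
        \nabla_\perp \cdot \left( I \, \nabla_\perp \varphi \right)
            = -k \, \frac{\partial I}{\partial z}
        % For a uniform in-focus intensity I_0 (thin, uniform film), this
        % reduces to a Poisson equation with a direct inversion:
        \nabla_\perp^2 \varphi = -\frac{k}{I_0} \frac{\partial I}{\partial z},
        \qquad
        \varphi = -\frac{k}{I_0} \, \nabla_\perp^{-2}
                  \left( \frac{\partial I}{\partial z} \right)
        % with \partial I / \partial z \approx (I_{\Delta z} - I_0) / \Delta z
        % estimated from a single defocused image.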

  5. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Energy Technology Data Exchange (ETDEWEB)

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.

  6. Prior Familiarization With Takeover Requests Affects Drivers' Takeover Performance and Automation Trust.

    Science.gov (United States)

    Hergeth, Sebastian; Lorenz, Lutz; Krems, Josef F

    2017-05-01

    The objective for this study was to investigate the effects of prior familiarization with takeover requests (TORs) during conditional automated driving on drivers' initial takeover performance and automation trust. System-initiated TORs are one of the biggest concerns for conditional automated driving and have been studied extensively in the past. Most, but not all, of these studies have included training sessions to familiarize participants with TORs. This makes them hard to compare and might obscure first-failure-like effects on takeover performance and automation trust formation. A driving simulator study compared drivers' takeover performance in two takeover situations across four prior familiarization groups (no familiarization, description, experience, description and experience) and automation trust before and after experiencing the system. As hypothesized, prior familiarization with TORs had a more positive effect on takeover performance in the first than in a subsequent takeover situation. In all groups, automation trust increased after participants experienced the system. Participants who were given no prior familiarization with TORs reported highest automation trust both before and after experiencing the system. The current results extend earlier findings suggesting that prior familiarization with TORs during conditional automated driving will be most relevant for takeover performance in the first takeover situation and that it lowers drivers' automation trust. Potential applications of this research include different approaches to familiarize users with automated driving systems, better integration of earlier findings, and sophistication of experimental designs.

  7. Automated mitosis detection using texture, SIFT features and HMAX biologically inspired approach

    Directory of Open Access Journals (Sweden)

    Humayun Irshad

    2013-01-01

    Context: According to the Nottingham grading system, mitosis count in breast cancer histopathology is one of three components required for cancer grading and prognosis. Manual counting of mitosis is tedious and subject to considerable inter- and intra-reader variations. Aims: The aim is to investigate the various texture features and the Hierarchical Model and X (HMAX) biologically inspired approach for mitosis detection using machine-learning techniques. Materials and Methods: We propose an approach that assists pathologists in automated mitosis detection and counting. The proposed method, which is based on the most favorable texture features combination, examines the separability between different channels of color space. The blue-ratio channel provides more discriminative information for mitosis detection in histopathological images. Co-occurrence features, run-length features, and Scale-invariant feature transform (SIFT) features were extracted and used in the classification of mitosis. Finally, a classification is performed to put the candidate patch either in the mitosis class or in the non-mitosis class. Three different classifiers have been evaluated: Decision tree, linear kernel Support Vector Machine (SVM), and non-linear kernel SVM. We also evaluate the performance of the proposed framework using the modified biologically inspired model of HMAX and compare the results with other feature extraction methods such as dense SIFT. Results: The proposed method has been tested on the Mitosis detection in breast cancer histological images (MITOS) dataset provided for an International Conference on Pattern Recognition (ICPR) 2012 contest. The proposed framework achieved 76% recall, 75% precision and 76% F-measure. Conclusions: Different frameworks for classification have been evaluated for mitosis detection. In future work, instead of regions, we intend to compute features on the results of mitosis contour segmentation and use them to improve detection and

  8. Automated negotiation in environmental resource management: Review and assessment.

    Science.gov (United States)

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management, while noting the need for further studies to consolidate the potential of this modeling approach.
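
    One standard building block from this literature (generic, not specific to this review) is the time-dependent concession tactic for an alternating-offers agent; a minimal sketch, with utilities and deadline invented:

        # Time-dependent concession (Faratin-style): target utility starts at
        # 1.0 and decays to the reservation value at the deadline; the agent
        # accepts any offer at least as good as its own next planned offer.
        def target_utility(t, deadline, e=0.5):
            """e < 1 concedes slowly (boulware), e > 1 quickly (conceder)."""
            return 1.0 - (t / deadline) ** (1.0 / e)

        def respond(opponent_offer_utility, t, deadline):
            my_next = target_utility(t, deadline)
            if opponent_offer_utility >= my_next:
                return "accept"
            return f"counter-offer at utility {my_next:.2f}"

        for t in range(0, 10, 3):
            print(t, respond(opponent_offer_utility=0.55, t=t, deadline=10))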

  9. Automated solid-phase peptide synthesis to obtain therapeutic peptides

    Directory of Open Access Journals (Sweden)

    Veronika Mäde

    2014-05-01

    The great versatility and the inherent high affinities of peptides for their respective targets have led to tremendous progress for therapeutic applications in recent years. In order to increase the druggability of these frequently unstable and rapidly cleared molecules, chemical modifications are of great interest. Automated solid-phase peptide synthesis (SPPS) offers a suitable technology to produce chemically engineered peptides. This review concentrates on the application of SPPS by the Fmoc/t-Bu protecting-group strategy, which is most commonly used. Critical issues and suggestions for the synthesis are covered. The development of automated methods from conventional to essentially improved microwave-assisted instruments is discussed. In order to improve the pharmacokinetic properties of peptides, lipidation and PEGylation are described as covalent conjugation methods, which can be applied by a combination of automated and manual synthesis approaches. The synthesis and application of SPPS is described for neuropeptide Y receptor analogs as an example of bioactive hormones. The applied strategies represent innovative and potent methods for the development of novel peptide drug candidates that can be manufactured with optimized automated synthesis technologies.

  10. Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications

    Science.gov (United States)

    Chaki, Sagar; Gurfinkel, Arie

    2010-01-01

    We develop a learning-based automated Assume-Guarantee (AG) reasoning framework for verifying omega-regular properties of concurrent systems. We study the applicability of non-circular (AG-NC) and circular (AG-C) AG proof rules in the context of systems with infinite behaviors. In particular, we show that AG-NC is incomplete when assumptions are restricted to strictly infinite behaviors, while AG-C remains complete. We present a general formalization, called LAG, of the learning-based automated AG paradigm and show how existing approaches for automated AG reasoning are special instances of LAG. We develop two learning algorithms for a class of systems, called infinite regular systems, that combine finite and infinite behaviors. We show that for infinite regular systems, both AG-NC and AG-C are sound and complete. Finally, we show how to instantiate LAG to do automated AG reasoning for infinite regular, and omega-regular, systems using both AG-NC and AG-C as proof rules

  11. Automated management of radioactive sources in Saudi Arabia

    International Nuclear Information System (INIS)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-01-01

    For usage of radioactive substances, any facility has to register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the NCRP. To provide 24×7 access to all NCRP customers, the system was developed as a web-based application that allows users to register online, request and renew licenses, check request status, and view historical data and reports; these features are provided as Electronic Services accessible via the internet. The system was also designed to streamline and optimize the internal operations of NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to a manual payment option, the system is integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with manual payment. Using the SADAD payment option, license fees can be paid through the internet, an ATM machine, or a branch of any designated bank; payments are instantly notified to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  12. Automated management of radioactive sources in Saudi Arabia

    Science.gov (United States)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-09-01

    For usage of radioactive substances, any facility has to register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the NCRP. To provide 24×7 access to all NCRP customers, the system was developed as a web-based application that allows users to register online, request and renew licenses, check request status, and view historical data and reports; these features are provided as Electronic Services accessible via the internet. The system was also designed to streamline and optimize the internal operations of NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to a manual payment option, the system is integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with manual payment. Using the SADAD payment option, license fees can be paid through the internet, an ATM machine, or a branch of any designated bank; payments are instantly notified to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  13. Automated management of radioactive sources in Saudi Arabia

    Energy Technology Data Exchange (ETDEWEB)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R. [National Center for Radiation Protection, King Abdulaziz City for Science and Technology, 11442 Riyadh (Saudi Arabia)

    2014-09-30

    For usage of radioactive substances, any facility has to register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the NCRP. To provide 24×7 access to all NCRP customers, the system was developed as a web-based application that allows users to register online, request and renew licenses, check request status, and view historical data and reports; these features are provided as Electronic Services accessible via the internet. The system was also designed to streamline and optimize the internal operations of NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to a manual payment option, the system is integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with manual payment. Using the SADAD payment option, license fees can be paid through the internet, an ATM machine, or a branch of any designated bank; payments are instantly notified to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  14. Estimating Regional Mass Balance of Himalayan Glaciers Using Hexagon Imagery: An Automated Approach

    Science.gov (United States)

    Maurer, J. M.; Rupper, S.

    2013-12-01

    Currently there is much uncertainty regarding the present and future state of Himalayan glaciers, which supply meltwater for river systems vital to more than 1.4 billion people living throughout Asia. Previous assessments of regional glacier mass balance in the Himalayas using various remote sensing and field-based methods give inconsistent results, and most assessments are over relatively short (e.g., single decade) timescales. This study aims to quantify multi-decadal changes in volume and extent of Himalayan glaciers through efficient use of the large database of declassified 1970-80s era Hexagon stereo imagery. Automation of the DEM extraction process provides an effective workflow for many images to be processed and glacier elevation changes quantified with minimal user input. The tedious procedure of manual ground control point selection necessary for block-bundle adjustment (as ephemeris data are not available for the declassified images) is automated using the Maximally Stable Extremal Regions algorithm, which matches image elements between raw Hexagon images and georeferenced Landsat 15-m panchromatic images. Additional automated Hexagon DEM processing, co-registration, and bias correction allow for direct comparison with modern ASTER and SRTM elevation data, thus quantifying glacier elevation and area changes over several decades across largely inaccessible mountainous regions. As consistent methodology is used for all glaciers, results will likely reveal significant spatial and temporal patterns in regional ice mass balance. Ultimately, these findings could have important implications for future water resource management in light of environmental change.
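
    A sketch of the automated ground-control-point idea using OpenCV; the file names are placeholders, and the one-keypoint-per-MSER-region simplification is ours, not necessarily the authors' implementation.

        # Match MSER regions between a raw Hexagon frame and a georeferenced
        # Landsat panchromatic image to propose ground-control-point pairs.
        import cv2

        hexagon = cv2.imread("hexagon_frame.tif", cv2.IMREAD_GRAYSCALE)
        landsat = cv2.imread("landsat_pan.tif", cv2.IMREAD_GRAYSCALE)

        mser = cv2.MSER_create()
        def mser_keypoints(img):
            regions, _ = mser.detectRegions(img)
            return [cv2.KeyPoint(float(x), float(y), 8.0)  # centroid per region
                    for r in regions for x, y in [r.mean(axis=0)]]

        sift = cv2.SIFT_create()
        kp_hex, des_hex = sift.compute(hexagon, mser_keypoints(hexagon))
        kp_lan, des_lan = sift.compute(landsat, mser_keypoints(landsat))

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(des_hex, des_lan, k=2)
        gcps = [(kp_hex[m.queryIdx].pt, kp_lan[m.trainIdx].pt)
                for m, n in matches if m.distance < 0.7 * n.distance]  # ratio test
        print(f"{len(gcps)} candidate ground control points")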

  15. Validation of an automated surveillance approach for drain-related meningitis : A multicenter study

    NARCIS (Netherlands)

    Van Mourik, Maaike S M; Troelstra, Annet; Van Der Sprenkel, Jan Willem Berkelbach; Van Der Jagt-Zwetsloot, Marischka C E; Nelson, Jolande H.; Vos, Piet; Arts, Mark P.; Dennesen, Paul J W; Moons, K. (Carl) G.M.; Bonten, Marc J M

    2015-01-01

    Objective. Manual surveillance of healthcare-associated infections is cumbersome and vulnerable to subjective interpretation. Automated systems are under development to improve efficiency and reliability of surveillance, for example by selecting high-risk patients requiring manual chart review. In

  16. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice.

    Directory of Open Access Journals (Sweden)

    Sheldon L Reeves

    2016-09-01

    Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode are sensitive to changes in stress levels, social interactions and pharmacological manipulations, and are therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-Track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies.
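
    Given per-frame forepaw coordinates of the kind M-Track produces, the speed and bilateral-coordination measures mentioned above reduce to simple trajectory arithmetic. A minimal numpy sketch, assuming (x, y) pixel coordinates per frame and an arbitrary frame rate (the exact M-Track output format is not assumed here):

```python
import numpy as np

def grooming_metrics(left, right, fps=30.0):
    """left, right: (n_frames, 2) arrays of x, y forepaw positions (pixels).

    Returns mean speed of each paw, total trajectory length, and a
    bilateral coordination index (Pearson r of frame-by-frame speeds)."""
    def speeds(traj):
        return np.linalg.norm(np.diff(traj, axis=0), axis=1) * fps
    v_l, v_r = speeds(left), speeds(right)
    return {"mean_speed_left": v_l.mean(),
            "mean_speed_right": v_r.mean(),
            "path_length_left": v_l.sum() / fps,
            "path_length_right": v_r.sum() / fps,
            "bilateral_coordination": np.corrcoef(v_l, v_r)[0, 1]}

# Example with synthetic trajectories
rng = np.random.default_rng(0)
base = np.cumsum(rng.normal(size=(300, 2)), axis=0)
print(grooming_metrics(base, base + rng.normal(scale=0.1, size=(300, 2))))
```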

  17. M-Track: A New Software for Automated Detection of Grooming Trajectories in Mice.

    Science.gov (United States)

    Reeves, Sheldon L; Fleming, Kelsey E; Zhang, Lin; Scimemi, Annalisa

    2016-09-01

    Grooming is a complex and robust innate behavior, commonly performed by most vertebrate species. In mice, grooming consists of a series of stereotyped patterned strokes, performed along the rostro-caudal axis of the body. The frequency and duration of each grooming episode are sensitive to changes in stress levels, social interactions and pharmacological manipulations, and are therefore used in behavioral studies to gain insights into the function of brain regions that control movement execution and anxiety. Traditional approaches to analyze grooming rely on manually scoring the time of onset and duration of each grooming episode, and are often performed on grooming episodes triggered by stress exposure, which may not be entirely representative of spontaneous grooming in freely-behaving mice. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important to understand movement stereotypy and bilateral coordination in mice. Currently available commercial and freeware video-tracking software allow automated tracking of the whole body of a mouse or of its head and tail, not of individual forepaws. Here we describe a simple experimental set-up and a novel open-source code, named M-Track, for simultaneously tracking the movement of individual forepaws during spontaneous grooming in multiple freely-behaving mice. This toolbox provides a simple platform to perform trajectory analysis of forepaw movement during distinct grooming episodes. By using M-Track we show that, in C57BL/6 wild type mice, the speed and bilateral coordination of the left and right forepaws remain unaltered during the execution of distinct grooming episodes. Stress exposure induces a profound increase in the length of the forepaw grooming trajectories. M-Track provides a valuable and user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies.

  18. Modelling and experimental study for automated congestion driving

    NARCIS (Netherlands)

    Urhahne, Joseph; Piastowski, P.; van der Voort, Mascha C.; Bebis, G; Boyle, R.; Parvin, B.; Koracin, D.; Pavlidis, I.; Feris, R.; McGraw, T.; Elendt, M.; Kopper, R.; Ragan, E.; Ye, Z.; Weber, G.

    2015-01-01

    Taking a collaborative approach in automated congestion driving with a Traffic Jam Assist system requires the driver to take over control in certain traffic situations. In order to warn the driver appropriately, warnings are issued (“pay attention” vs. “take action”) due to a control transition

  19. Choosing the Right Integrator for Your Building Automation Project.

    Science.gov (United States)

    Podgorski, Will

    2002-01-01

    Examines the prevailing definitions and responsibilities of product, network, and system integrators for building automation systems; offers a novel approach to system integration; and sets realistic expectations for the owner in terms of benefits, outcomes, and overall values. (EV)

  20. Roof Box Shape Streamline Adaptation and the Impact towards Fuel Consumption

    Directory of Open Access Journals (Sweden)

    Abdul Latif M.F.

    2017-01-01

    The fuel price hike is currently a sensational national issue in Malaysia. Since the rationalization of fuel subsidies, many have been affected, especially middle-income families. Vehicle aerodynamics is directly related to fuel consumption: extra frontal area results in a higher drag force and hence higher fuel consumption. Roof boxes are among the largest contributors to this extra drag, so rationalizing the roof box shape is a prominent way to reduce it. The idea of adopting a water-drop shape for the roof box design shows a prominent result. The roof box was simulated with MIRA virtual wind tunnel modelling via a commercial computational fluid dynamics (CFD) package. The streamlined shape drastically reduces the drag force by 34%, resulting in a 1.7% fuel saving compared to the conventional boxy roof box. This is an effort to reduce the carbon footprint for a sustainable, green world.
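
    The reported figures can be sanity-checked with the standard drag equation Fd = 0.5 * rho * v^2 * Cd * A. In the sketch below, the 34% drag reduction is taken from the abstract, while the roof box's drag area and the fuel shares are invented round numbers chosen only to show how a 34% cut on one component can translate into a saving of roughly 1.7% overall:

```python
# Back-of-the-envelope drag check; all numeric values are assumptions,
# not figures from the paper (aside from the 34% drag reduction).
rho = 1.2          # air density, kg/m^3
v = 110 / 3.6      # highway speed, m/s
cd_a_box = 0.20    # assumed drag area (Cd*A) of a boxy roof box, m^2

f_box = 0.5 * rho * v**2 * cd_a_box       # drag force of the box alone, N
f_streamlined = f_box * (1 - 0.34)        # 34% reduction from the study
print(f"roof-box drag: {f_box:.1f} N -> {f_streamlined:.1f} N")

# If aerodynamic drag accounts for ~50% of fuel use at this speed and the
# box itself is ~10% of total drag, the saving on total fuel is roughly:
drag_share_of_fuel = 0.5
box_share_of_drag = 0.10
fuel_saving = 0.34 * box_share_of_drag * drag_share_of_fuel
print(f"approximate fuel saving: {fuel_saving:.1%}")  # ~1.7%
```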

  1. On the automated assessment of nuclear reactor systems code accuracy

    International Nuclear Information System (INIS)

    Kunz, Robert F.; Kasmala, Gerald F.; Mahaffy, John H.; Murray, Christopher J.

    2002-01-01

    An automated code assessment program (ACAP) has been developed to provide quantitative comparisons between nuclear reactor systems (NRS) code results and experimental measurements. The tool provides a suite of metrics for quality of fit to specific data sets, and the means to produce one or more figures of merit (FOM) for a code, based on weighted averages of results from the batch execution of a large number of code-experiment and code-code data comparisons. Accordingly, this tool has the potential to significantly streamline the verification and validation (V and V) processes in NRS code development environments which are characterized by rapidly evolving software, many contributing developers and a large and growing body of validation data. In this paper, a survey of data conditioning and analysis techniques is summarized which focuses on their relevance to NRS code accuracy assessment. A number of methods are considered for their applicability to the automated assessment of the accuracy of NRS code simulations. A variety of data types and computational modeling methods are considered from a spectrum of mathematical and engineering disciplines. The goal of the survey was to identify needs, issues and techniques to be considered in the development of an automated code assessment procedure, to be used in United States Nuclear Regulatory Commission (NRC) advanced thermal-hydraulic (T/H) code consolidation efforts. The ACAP software was designed based in large measure on the findings of this survey. An overview of this tool is summarized and several NRS data applications are provided. The paper is organized as follows: The motivation for this work is first provided by background discussion that summarizes the relevance of this subject matter to the nuclear reactor industry. Next, the spectrum of NRS data types are classified into categories, in order to provide a basis for assessing individual comparison methods. Then, a summary of the survey is provided, where each
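
    The batch FOM idea (many code-experiment comparisons, each scored by several fit metrics and rolled up by weighted averaging) can be stated compactly. The following is a schematic sketch of such a roll-up, not the actual ACAP metric suite:

```python
import numpy as np

def figure_of_merit(comparisons, metric_weights, case_weights=None):
    """comparisons: (n_cases, n_metrics) array of per-comparison scores
    (e.g., normalized RMS error, cross-correlation) already scaled to [0, 1].

    Each row is one code-experiment comparison; the FOM is a weighted
    average over metrics, then over cases."""
    comparisons = np.asarray(comparisons, dtype=float)
    w_m = np.asarray(metric_weights, dtype=float)
    per_case = comparisons @ (w_m / w_m.sum())
    return np.average(per_case, weights=case_weights)

# Three comparisons scored by two metrics; the second metric is
# trusted twice as much as the first
scores = [[0.90, 0.80], [0.70, 0.75], [0.95, 0.60]]
print(figure_of_merit(scores, metric_weights=[1, 2]))
```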

  2. Safety assessment of automated vehicle functions by simulation-based fault injection

    OpenAIRE

    Juez, Garazi; Amparan, Estibaliz; Lattarulo, Ray; Rastelli, Joshue Perez; Ruiz, Alejandra; Espinoza, Huascar

    2017-01-01

    As automated driving vehicles become more sophisticated and pervasive, it is increasingly important to assure their safety even in the presence of faults. This paper presents a simulation-based fault injection approach (Sabotage) aimed at assessing the safety of automated vehicle functions. In particular, we focus on a case study to forecast fault effects during the model-based design of a lateral control function. The goal is to determine the acceptable fault detection interval for pe...
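
    In spirit, simulation-based fault injection wraps a nominal control loop, corrupts a signal over a chosen interval, and records when the effect becomes safety-relevant. The toy lateral-control loop below is purely illustrative (invented gains, fault model, and safety envelope) and is not the Sabotage framework:

```python
def simulate(fault_start=None, fault_duration=0.0, dt=0.01, t_end=5.0):
    """Toy lateral controller: proportional-derivative steering toward
    lane center. A stuck-at-zero fault on the steering command is
    injected for fault_duration seconds; returns peak lateral offset (m)."""
    y, v_y, k = 0.5, 0.0, 4.0           # offset, lateral rate, gain
    peak, t = abs(y), 0.0
    while t < t_end:
        u = -k * y - 2.0 * v_y          # nominal steering command
        if fault_start is not None and fault_start <= t < fault_start + fault_duration:
            u = 0.0                     # injected fault: actuator stuck at zero
        v_y += u * dt
        y += v_y * dt
        peak = max(peak, abs(y))
        t += dt
    return peak

# Sweep the fault duration to find the longest interval that keeps the
# deviation inside an assumed 0.9 m safety envelope
for d in (0.1, 0.5, 1.0, 2.0):
    print(d, round(simulate(fault_start=1.0, fault_duration=d), 3))
```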

  3. Advances in automated valuation modeling AVM after the non-agency mortgage crisis

    CERN Document Server

    Kauko, Tom

    2017-01-01

    This book addresses several problems related to automated valuation methodologies (AVM). Following the non-agency mortgage crisis, it offers a variety of approaches to improve the efficiency and quality of an automated valuation methodology (AVM) dealing with emerging problems and different contexts. Spatial issue, evolution of AVM standards, multilevel models, fuzzy and rough set applications and quantitative methods to define comparables are just some of the topics discussed.

  4. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  5. Service-oriented architectural framework for support and automation of collaboration tasks

    Directory of Open Access Journals (Sweden)

    Ana Sasa

    2011-06-01

    Due to more and more demanding requirements for business flexibility and agility, automation of end-to-end industrial processes has become an important topic. Systems supporting business process execution need to enable automated task execution as well as integrate human-performed tasks (human tasks) into a business process. In this paper, we focus on collaboration tasks, which are an important type of composite human tasks. We propose a service-oriented architectural framework describing a service responsible for human task execution (Human task service), which not only implements collaboration tasks but also improves their execution by automated and semi-automated decision making and collaboration based on ontologies and agent technology. The approach is very generic and can be used for any type of business process. A case study was performed for a human-task-intensive business process from the electric power transmission domain.

  6. Mastering Grunt

    CERN Document Server

    Li, Daniel

    2014-01-01

    This easy-to-understand tutorial provides you with several engaging projects that show you how to utilize Grunt with various web technologies, teaching you how to master build automation and testing with Grunt in your applications. If you are a JavaScript developer who is looking to streamline your workflow with build automation, then this book will give you a kick start in fully understanding the importance of the described web technologies and how to automate their processes using Grunt.

  7. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed that automation promises greater efficiency, lower workloads, and fewer operator errors through enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to its side effects, referred to as the Out-of-the-Loop (OOTL) problem, and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation introduction that assures the best human operator performance, a quantitative method of optimizing the automation is proposed in this paper. In order to propose the optimization method for determining appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, which are estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration was conducted in order to derive the shortest working time through considering the concept of situation awareness recovery (SAR), which states that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed through redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the
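
    The optimization can be read as minimizing expected working time over the automation rate, where more automation shortens nominal task time but adds a recovery penalty when the operator must re-enter the loop. The cost model below is an invented stand-in for the paper's automation and ostracism rate estimation methods, shown only to make the shape of the trade-off concrete:

```python
import numpy as np

def working_time(a, t_manual=100.0, t_auto=40.0, sar_penalty=90.0):
    """a: automation rate in [0, 1]. Invented cost model: task time falls
    linearly with automation, while situation awareness recovery (SAR)
    time grows superlinearly as the operator drops out of the loop."""
    nominal = (1 - a) * t_manual + a * t_auto
    recovery = sar_penalty * a**3   # OOTL/ostracism effect, assumed cubic
    return nominal + recovery

rates = np.linspace(0, 1, 101)
times = [working_time(a) for a in rates]
best = rates[int(np.argmin(times))]
print(f"optimized automation rate ~ {best:.2f}")  # interior optimum, ~0.47
```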

  8. A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM

    Energy Technology Data Exchange (ETDEWEB)

    Le Blanc, Katya; Spielman, Zach; Hill, Rachael

    2017-06-01

    Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds a fundamental tradeoff between the system performance gains of increased automation and reduced human performance. There is a need to address the question of how to achieve the high performance and efficiency of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper describes this concept in detail and presents an experimental test of it. The benefits and challenges of the approach are also discussed.

  9. Framework to Implement Collaborative Robots in Manual Assembly: A Lean Automation Approach

    DEFF Research Database (Denmark)

    Malik, Ali Ahmad; Bilberg, Arne

    The recent proliferation of smart manufacturing technologies has emerged the concept of hybrid automation for assembly systems utilizing the best of humans and robots in a combination. Based on the ability to work alongside human-workers the next generation of industrial robots (or robotics 2...... of virtual simulations is discussed for validation and optimization of human-robot work environment....

  10. Automating Formative and Summative Feedback for Individualised Assignments

    Science.gov (United States)

    Hamilton, Ian Robert

    2009-01-01

    Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…

  11. Automation of column-based radiochemical separations. A comparison of fluidic, robotic, and hybrid architectures

    Energy Technology Data Exchange (ETDEWEB)

    Grate, J.W.; O' Hara, M.J.; Farawila, A.F.; Ozanich, R.M.; Owsley, S.L. [Pacific Northwest National Laboratory, Richland, WA (United States)

    2011-07-01

    Two automated systems have been developed to perform column-based radiochemical separation procedures. These new systems are compared with past fluidic column separation architectures, with emphasis on using disposable components so that no sample contacts any surface that any other sample has contacted, and setting up samples and columns in parallel for subsequent automated processing. In the first new approach, a general purpose liquid handling robot has been modified and programmed to perform anion exchange separations using 2 mL bed columns in 6 mL plastic disposable column bodies. In the second new approach, a fluidic system has been developed to deliver clean reagents through disposable manual valves to six disposable columns, with a mechanized fraction collector that positions one of four rows of six vials below the columns. The samples are delivered to each column via a manual 3-port disposable valve from disposable syringes. This second approach, a hybrid of fluidic and mechanized components, is a simpler more efficient approach for performing anion exchange procedures for the recovery and purification of plutonium from samples. The automation architectures described can also be adapted to column-based extraction chromatography separations. (orig.)

  12. Accelerating the discovery of materials for clean energy in the era of smart automation

    Science.gov (United States)

    Tabor, Daniel P.; Roch, Loïc M.; Saikin, Semion K.; Kreisbeck, Christoph; Sheberla, Dennis; Montoya, Joseph H.; Dwaraknath, Shyam; Aykol, Muratahan; Ortiz, Carlos; Tribukait, Hermann; Amador-Bedolla, Carlos; Brabec, Christoph J.; Maruyama, Benji; Persson, Kristin A.; Aspuru-Guzik, Alán

    2018-05-01

    The discovery and development of novel materials in the field of energy are essential to accelerate the transition to a low-carbon economy. Bringing recent technological innovations in automation, robotics and computer science together with current approaches in chemistry, materials synthesis and characterization will act as a catalyst for revolutionizing traditional research and development in both industry and academia. This Perspective provides a vision for an integrated artificial intelligence approach towards autonomous materials discovery, which, in our opinion, will emerge within the next 5 to 10 years. The approach we discuss requires the integration of the following tools, which have already seen substantial development to date: high-throughput virtual screening, automated synthesis planning, automated laboratories and machine learning algorithms. In addition to reducing the time to deployment of new materials by an order of magnitude, this integrated approach is expected to lower the cost associated with the initial discovery. Thus, the price of the final products (for example, solar panels, batteries and electric vehicles) will also decrease. This in turn will enable industries and governments to meet more ambitious targets in terms of reducing greenhouse gas emissions at a faster pace.

  13. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system, substation feeder, and customer functions, potential benefits, automation costs, planning and engineering considerations, automation trends, databases, system operation, computer modeling of system, and distribution management systems

  14. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  15. Surrogate-Assisted Genetic Programming With Simplified Models for Automated Design of Dispatching Rules.

    Science.gov (United States)

    Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen

    2017-09-01

    Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements and limited accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.

  16. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, the demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  17. A new approach to the inverse problem for current mapping in thin-film superconductors

    Science.gov (United States)

    Zuber, J. W.; Wells, F. S.; Fedoseev, S. A.; Johansen, T. H.; Rosenfeld, A. B.; Pan, A. V.

    2018-03-01

    A novel mathematical approach has been developed to complete the inversion of the Biot-Savart law in one- and two-dimensional cases from measurements of the perpendicular component of the magnetic field using the well-developed Magneto-Optical Imaging technique. Our approach, especially in the 2D case, is provided in great detail to allow a straightforward implementation, as opposed to those found in the literature. Our new approach also refines our previous results for the 1D case [Johansen et al., Phys. Rev. B 54, 16264 (1996)], and streamlines the method developed by Jooss et al. [Physica C 299, 215 (1998)], deemed the most accurate when compared to that of Roth et al. [J. Appl. Phys. 65, 361 (1989)]. We also verify and streamline the iterative technique developed following Laviano et al. [Supercond. Sci. Technol. 16, 71 (2002)] to account for in-plane magnetic fields caused by the bending of the applied magnetic field due to the demagnetising effect. After testing on magneto-optical images of a high-quality YBa2Cu3O7 superconducting thin film, we show that the procedure employed is effective.
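
    For thin-film geometries, the forward problem has a well-known closed form in Fourier space that makes the inversion explicit. Writing the sheet current as the curl of a scalar stream function g (so current conservation holds automatically), a commonly used kernel is shown below as background; sign and normalization conventions vary between papers, and this is not necessarily the exact formulation of the work above.

```latex
% Sheet current from a stream function g(x,y):
%   J_x = \partial g/\partial y, \qquad J_y = -\partial g/\partial x
\begin{align}
  \hat{B}_z(\mathbf{k}, z) &= \tfrac{1}{2}\,\mu_0\, k\, e^{-k z}\, \hat{g}(\mathbf{k}),
  \qquad k = |\mathbf{k}|, \\
  \hat{g}(\mathbf{k}) &= \frac{2}{\mu_0}\,\frac{e^{k z}}{k}\,\hat{B}_z(\mathbf{k}, z),
\end{align}
% g (and hence J) then follows from an inverse FFT of the measured B_z
% map; the growing factor e^{kz} amplifies high-k noise, so the
% inversion must be low-pass filtered or otherwise regularized.
```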

  18. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  19. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain), for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  20. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase in the software development life cycle. In this project an existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  1. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    Science.gov (United States)

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  2. Modularity and Architecture of PLC-based Software for Automated Production Systems: An analysis in industrial companies

    OpenAIRE

    Vogel-Heuser, B.; Fischer, J.; Feldmann, S.; Ulewicz, S.; Rösch, S.

    2018-01-01

    Adaptive and flexible production systems require modular and reusable software, especially considering their long-term life cycle of up to 50 years. SWMAT4aPS, an approach to measure Software Maturity for automated Production Systems (aPS), is introduced. The approach identifies weaknesses and strengths of various companies' solutions for modularity of software in the design of aPS. First, a self-assessed questionnaire is used to evaluate a large number of companies ...

  3. Effects of the Meetings-Flow Approach on Quality Teamwork in the Training of Software Capstone Projects

    Science.gov (United States)

    Chen, Chung-Yang; Hong, Ya-Chun; Chen, Pei-Chi

    2014-01-01

    Software development relies heavily on teamwork; determining how to streamline this collaborative development is an essential training subject in computer and software engineering education. A team process known as the meetings-flow (MF) approach has recently been introduced in software capstone projects in engineering programs at various…

  4. An automated Pearson's correlation change classification (APC3) approach for GC/MS metabonomic data using total ion chromatograms (TICs).

    Science.gov (United States)

    Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei

    2013-05-21

    A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance, with an average accuracy and an average AUC of 0.89 ± 0.08, while being 3.9 to 7 times faster, easier to use, and less susceptible to outliers than other dimensionality reduction and classification combinations using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits the possible application of APC3 to other metabonomic data such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
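
    The core idea of correlation-based classification of TICs is easy to make concrete: correlate a sample TIC against class-representative TICs and assign the best-matching class. The sketch below illustrates only that underlying idea on synthetic data; it is not the APC3 implementation:

```python
import numpy as np

def correlate(a, b):
    """Pearson's correlation coefficient between two TICs."""
    return np.corrcoef(a, b)[0, 1]

def classify_tic(tic, reference_tics):
    """Assign a sample TIC to the class whose representative TIC it
    correlates with most strongly. reference_tics: dict label -> array."""
    return max(reference_tics, key=lambda lbl: correlate(tic, reference_tics[lbl]))

rng = np.random.default_rng(1)
control = np.abs(np.sin(np.linspace(0, 6, 500)))          # synthetic TIC
case = control + 0.4 * np.exp(-((np.arange(500) - 250) / 20.0) ** 2)
refs = {"control": control, "case": case}
sample = case + rng.normal(scale=0.05, size=500)          # noisy case TIC
print(classify_tic(sample, refs))                         # -> "case"
```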

  5. Evaluating an Automated Approach for Monitoring Forest Disturbances in the Pacific Northwest from Logging, Fire and Insect Outbreaks with Landsat Time Series Data

    Science.gov (United States)

    Neigh, Christopher S. R.; Bolton, Douglas K.; Williams, Jennifer J.; Diabate, Mouhamad

    2014-01-01

    Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reduce our C-cycle uncertainties. We investigated a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 +/- 0.06 to 0.38 +/- 0.05 for insect disturbance, 0.23 +/- 0.07 to 1 +/- 0 for burned area and 0.74 +/- 0.03 to 0.76 +/- 0.03 for logging. We believe that accuracy was low for insect disturbance because air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest with a similar response of spectral indices between the disturbance index and normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems and found that this automated method does not perform with the same accuracy in Pacific Northwest forests.

  6. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogenous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  7. Automated dating of the world’s language families based on lexical similarity

    OpenAIRE

    Holman, E.; Brown, C.; Wichmann, S.; Müller, A.; Velupillai, V.; Hammarström, H.; Sauppe, S.; Jung, H.; Bakker, D.; Brown, P.; Belyaev, O.; Urban, M.; Mailhammer, R.; List, J.; Egorov, D.

    2011-01-01

    This paper describes a computerized alternative to glottochronology for estimating elapsed time since parent languages diverged into daughter languages. The method, developed by the Automated Similarity Judgment Program (ASJP) consortium, is different from glottochronology in four major respects: (1) it is automated and thus is more objective, (2) it applies a uniform analytical approach to a single database of worldwide languages, (3) it is based on lexical similarity as determined from Leve...
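
    ASJP's lexical similarity is computed from length-normalized Levenshtein distance between word forms. A minimal implementation of that core measure is sketched below; the full ASJP method adds a sound-class encoding of the word forms and a correction for chance similarity, which are not shown here:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def ldn(a, b):
    """Length-normalized Levenshtein distance in [0, 1]; lower = more similar."""
    return levenshtein(a, b) / max(len(a), len(b))

# Similar word forms in two related languages score low
print(ldn("hand", "hant"))   # 0.25
print(ldn("hand", "mano"))   # 0.5
```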

  8. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  9. The systems approach for applying artificial intelligence to space station automation (Invited Paper)

    Science.gov (United States)

    Grose, Vernon L.

    1985-12-01

    The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.

  10. Lean and Information Technology Toolkit

    Science.gov (United States)

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  11. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with shorter turnaround times for their ever-increasing workload. This article discusses the various issues involved in the process.

  12. Automated MR morphometry to predict Alzheimer's disease in mild cognitive impairment

    Energy Technology Data Exchange (ETDEWEB)

    Fritzsche, Klaus H.; Schlindwein, Sarah; Bruggen, Thomas van; Meinzer, Hans-Peter [German Cancer Research Center, Division of Medical and Biological Informatics, Heidelberg (Germany); Stieltjes, Bram; Essig, Marco [German Cancer Research Center, Division of Radiology, Heidelberg (Germany)

    2010-12-15

    Prediction of progression from mild cognitive impairment (MCI) to Alzheimer's disease (AD) is challenging but essential for early treatment. This study aims to investigate the use of hippocampal atrophy markers for the automatic detection of MCI converters and to compare the predictive value to manually obtained hippocampal volume and temporal horn width. A study was performed with 15 patients with Alzheimer's disease and 18 patients with MCI (ten converted, eight remained stable in a 3-year follow-up) as well as 15 healthy subjects. MRI scans were obtained at baseline and evaluated with an automated system for scoring of hippocampal atrophy. The predictive value of the automated system was compared with manual measurements of hippocampal volume and temporal horn width in the same subjects. The conversion to AD was correctly predicted in 77.8% of the cases (sensitivity 70%, specificity 87.5%) in the MCI group using automated morphometry and a plain linear classifier that was trained on the AD and healthy groups. Classification was improved by limiting analysis to the left cerebral hemisphere (accuracy 83.3%, sensitivity 70%, specificity 100%). The manual linear and volumetric approaches reached rates of 66.7% (sensitivity 40%, specificity 100%) and 72.2% (sensitivity 60%, specificity 87.5%), respectively. The automatic approach fulfills many important preconditions for clinical application. Contrary to the manual approaches, it is not observer-dependent and reduces human resource requirements. Automated assessment may be useful for individual patient assessment and for predicting progression to dementia. (orig.)
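
    The prediction step (train a plain linear classifier on the AD and healthy groups, then apply it to the MCI group) is simple to state concretely. Below is a sketch using Fisher's linear discriminant on invented two-dimensional atrophy scores; the features and numbers are not the study's data:

```python
import numpy as np

def fit_fisher(x_pos, x_neg):
    """Fisher linear discriminant: weight vector and midpoint threshold."""
    mu1, mu0 = x_pos.mean(axis=0), x_neg.mean(axis=0)
    sw = np.cov(x_pos, rowvar=False) + np.cov(x_neg, rowvar=False)
    w = np.linalg.solve(sw, mu1 - mu0)
    return w, w @ (mu1 + mu0) / 2

rng = np.random.default_rng(2)
# Invented 2-D atrophy markers: hippocampal volume loss, horn widening
ad = rng.normal([2.0, 1.5], 0.5, size=(15, 2))
healthy = rng.normal([0.5, 0.4], 0.5, size=(15, 2))
w, thresh = fit_fisher(ad, healthy)

# Apply the AD-vs-healthy classifier to (invented) MCI baseline scores
mci = rng.normal([1.6, 1.2], 0.6, size=(10, 2))
flagged = (mci @ w) > thresh
print(flagged.sum(), "of", len(mci), "flagged as likely converters")
```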

  13. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

    The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its application in synthesis, viz. auto sampling and inline monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest a possible control strategy to be implemented so that it helps to reliably transfer the laboratory-scale synthesis strategy to pilot scale at its optimum conditions. Due to the vast literature on multistep synthesis, we have classified the literature and identified case studies based on a few criteria, viz. type of reaction, heating methods, processes involving in-line separation units, telescopic synthesis, processes involving in-line quenching, and processes with the smallest time scale of operation. This classification will cover the broader range of the multistep synthesis literature.

  14. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  15. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Science.gov (United States)

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C

    2014-09-01

    In cases of suspected child pornography, the age of the victim represents a crucial factor for legal prosecution. The conventional methods for age estimation provide unreliable age estimates, particularly if teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which has the ability to rapidly scan through large numbers of suspicious image data in order to trace juvenile faces. Age estimations were performed by experts, non-experts and the Demonstrator of a developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test the accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images. Decreased image quality had no significant impact on the performance and classification results. The experts delivered slightly less accurate MAE (1.63 years). Throughout the tested age range, both the manual and the automated approach led to reliable age estimates within the limits of natural biological variability. The visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, which is the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with the conventional methods for a preliminary age estimation of juveniles depicted on images.

  16. MIDAS: Automated Approach to Design Microwave Integrated Inductors and Transformers on Silicon

    Directory of Open Access Journals (Sweden)

    L. Aluigi

    2013-09-01

    The design of modern radiofrequency integrated circuits on silicon operating at microwave and millimeter-waves requires the integration of several spiral inductors and transformers that are not commonly available in the process design kits of the technologies. In this work we present an auxiliary CAD tool for Microwave Inductor (and transformer) Design Automation on Silicon (MIDAS) that exploits commercial simulators and allows the implementation of an automatic design flow, including three-dimensional layout editing and electromagnetic simulations. In detail, MIDAS allows the designer to derive a preliminary sizing of the inductor (transformer) on the basis of the design entries (specifications). It draws the inductor (transformer) layers for the specific process design kit, including vias and underpasses, with or without a patterned ground shield, and launches the electromagnetic simulations, achieving effective design automation with respect to the traditional design flow for RFICs. With the present software suite the complete design time is reduced significantly (typically 1 hour on a PC based on an Intel® Pentium® Dual 1.80 GHz CPU with 2 GB RAM). Afterwards both the device equivalent circuit and the layout are ready to be imported into the Cadence environment.

  17. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C{sub 8}MIM]NTf{sub 2}) is formed through the reaction between [C{sub 8}MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf{sub 2}) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL{sup −1}. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL{sup −1}. The proposed

  18. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C₈MIM]NTf₂) is formed through the reaction between [C₈MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf₂) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL⁻¹. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL⁻¹. The proposed method opens a new avenue

  19. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  20. Automated Detection of Sepsis Using Electronic Medical Record Data: A Systematic Review.

    Science.gov (United States)

    Despins, Laurel A

    Severe sepsis and septic shock are global issues with high mortality rates. Early recognition and intervention are essential to optimize patient outcomes. Automated detection using electronic medical record (EMR) data can assist this process. This review describes automated sepsis detection using EMR data. A PubMed search retrieved publications from January 1, 2005 through January 31, 2015. Thirteen studies met the study criteria: they described an automated detection approach with the potential to detect sepsis or sepsis-related deterioration in real or near-real time; focused on emergency department and hospitalized neonatal, pediatric, or adult patients; and provided performance measures or results indicating the impact of automated sepsis detection. Detection algorithms incorporated systemic inflammatory response and organ dysfunction criteria. Systems in nine studies generated study or care team alerts. Care team alerts did not consistently lead to earlier interventions. Earlier interventions did not consistently translate to improved patient outcomes. Performance measures were inconsistent. Automated sepsis detection is potentially a means to enable early sepsis-related therapy, but current performance variability highlights the need for further research.
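
    The systemic inflammatory response (SIRS) component of such detection algorithms is straightforward to make concrete: flag a patient when at least two of the four textbook criteria are met. A minimal sketch over EMR-style vitals follows; the field names are assumptions, and the respiratory criterion is simplified to respiratory rate alone:

```python
def sirs_score(obs):
    """obs: dict with temp_c, heart_rate, resp_rate, wbc (10^3 cells/uL).
    Returns the number of SIRS criteria met; >= 2 raises a screen alert.
    (Simplified: the full respiratory criterion also allows PaCO2 < 32 mmHg.)"""
    criteria = [
        obs["temp_c"] > 38.0 or obs["temp_c"] < 36.0,
        obs["heart_rate"] > 90,
        obs["resp_rate"] > 20,
        obs["wbc"] > 12.0 or obs["wbc"] < 4.0,
    ]
    return sum(criteria)

patient = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc": 13.1}
if sirs_score(patient) >= 2:
    print("SIRS alert: review for suspected sepsis")
```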

  1. Streamlined Approach for Environmental Restoration (SAFER) Plan for Corrective Action Unit 411. Double Tracks Plutonium Dispersion (Nellis), Nevada Test and Training Range, Nevada, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Patrick K. [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)

    2015-03-01

    This Streamlined Approach for Environmental Restoration (SAFER) Plan addresses the actions needed to achieve closure for Corrective Action Unit (CAU) 411, Double Tracks Plutonium Dispersion (Nellis). CAU 411 is located on the Nevada Test and Training Range and consists of a single corrective action site (CAS), NAFR-23-01, Pu Contaminated Soil. There is sufficient information and historical documentation from previous investigations and the 1996 interim corrective action to recommend closure of CAU 411 using the SAFER process. Based on existing data, the presumed corrective action for CAU 411 is clean closure. However, additional data will be obtained during a field investigation to document and verify the adequacy of existing information, and to determine whether the CAU 411 closure objectives have been achieved. This SAFER Plan provides the methodology to gather the necessary information for closing the CAU. The results of the field investigation will be presented in a closure report that will be prepared and submitted to the Nevada Division of Environmental Protection (NDEP) for review and approval. The site will be investigated based on the data quality objectives (DQOs) developed on November 20, 2014, by representatives of NDEP, the U.S. Air Force (USAF), and the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office. The DQO process was used to identify and define the type, amount, and quality of data needed to determine whether CAU 411 closure objectives have been achieved. The following text summarizes the SAFER activities that will support the closure of CAU 411: collect environmental samples from designated target populations to confirm or disprove the presence of contaminants of concern (COCs) as necessary to supplement existing information; if COCs are no longer present, establish clean closure as the corrective action; if COCs are present, the extent of contamination will be defined and further corrective actions

  2. Towards automated composition of convergent services: A survey

    OpenAIRE

    Ordónez, Armando; Alcazar, Vidal; Rendon, Oscar Mauricio Caicedo; Falcarin, Paolo; Corrales, Juan C.; Granville, Lisandro Zambenedetti

    2015-01-01

    A convergent service is defined as a service that exploits the convergence of communication networks and at the same time takes advantage of features of the Web. Nowadays, building up a convergent service is not trivial because, although there are significant approaches that aim to automate service composition at different levels in the Web and Telecom domains, selecting the most appropriate approach for specific case studies is complex due to the large number of involved ...

  3. Automated segmentation of murine lung tumors in x-ray micro-CT images

    Science.gov (United States)

    Swee, Joshua K. Y.; Sheridan, Clare; de Bruin, Elza; Downward, Julian; Lassailly, Francois; Pizarro, Luis

    2014-03-01

    Recent years have seen micro-CT emerge as a means of providing imaging analysis in pre-clinical studies, with in-vivo micro-CT shown to be particularly applicable to the examination of murine lung tumors. Despite this, existing studies have involved substantial human intervention during the image analysis process, with fully-automated aids almost non-existent. We present a new approach to automating the segmentation of murine lung tumors, designed specifically for in-vivo micro-CT-based pre-clinical lung cancer studies, that addresses the specific requirements of such studies as well as the limitations that human-centric segmentation approaches face when applied to such micro-CT data. Our approach consists of three distinct stages and begins by utilizing edge-enhancing and vessel-enhancing non-linear anisotropic diffusion filters to extract anatomy masks (lung/vessel structure) in a pre-processing stage. Initial candidate detection is then performed through ROI reduction utilizing the obtained masks and a two-step automated segmentation approach that aims to extract all disconnected objects within the ROI, consisting of Otsu thresholding, mathematical morphology, and marker-driven watershed. False-positive reduction is finally performed on the initial candidates through random-forest-driven classification using the shape, intensity, and spatial features of candidates. We provide validation of our approach using data from an associated lung cancer study, showing favorable results both in terms of detection (sensitivity = 86%, specificity = 89%) and structural recovery (Dice similarity = 0.88) when compared against manual specialist annotation.
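
    A compressed sketch of stages two and three of the pipeline described above, written against scikit-image and scikit-learn as stand-ins for the authors' tooling. Function names, parameters, and the feature choices are illustrative, and the random forest is assumed to have been trained beforehand on labeled candidates.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.morphology import binary_opening, ball
        from skimage.segmentation import watershed
        from skimage.measure import label, regionprops
        from sklearn.ensemble import RandomForestClassifier

        def candidate_segmentation(volume, lung_mask):
            """Stage 2: extract disconnected candidate objects within the lung ROI."""
            roi = np.where(lung_mask, volume, volume.min())    # restrict to lung ROI
            binary = roi > threshold_otsu(roi[lung_mask])      # Otsu thresholding
            binary = binary_opening(binary, ball(1))           # morphological clean-up
            markers = label(binary)                            # connected components as seeds
            distance = ndi.distance_transform_edt(binary)
            return watershed(-distance, markers, mask=binary)  # marker-driven watershed

        def false_positive_reduction(labels, volume, clf: RandomForestClassifier):
            """Stage 3: keep only candidates the (pre-trained) forest accepts."""
            kept = np.zeros_like(labels)
            for region in regionprops(labels, intensity_image=volume):
                feats = [region.area, region.solidity,   # shape features
                         region.mean_intensity,          # intensity feature
                         *region.centroid]               # spatial features
                if clf.predict([feats])[0] == 1:
                    kept[labels == region.label] = region.label
            return kept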

  4. From drafting guideline to error detection: Automating style checking for legislative texts

    OpenAIRE

    Höfler Stefan; Sugisaki Kyoko

    2012-01-01

    This paper reports on the development of methods for the automated detection of violations of style guidelines for legislative texts, and on their implementation in a prototypical tool. To this end, the approach of error modelling employed in automated style checkers for technical writing is extended to meet the requirements of legislative editing. The paper identifies and discusses the two main sets of challenges that have to be tackled in this process: (i) the provision of domain-specific NLP ...
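
    A toy illustration of the error-modelling idea: a drafting rule encoded as a pattern, with violations reported per rule. The rules shown (avoiding ambiguous "and/or", flagging overlong sentences) are common drafting guidelines, not necessarily those implemented in the paper's prototype.

        import re

        STYLE_RULES = {
            "ambiguous 'and/or'": re.compile(r"\band/or\b", re.IGNORECASE),
            "overlong sentence (>40 words)": re.compile(r"(?:\S+\s+){40,}\S+[.;]"),
        }

        def check_style(text: str) -> list:
            """Return the names of the drafting rules that the text violates."""
            return [name for name, pattern in STYLE_RULES.items()
                    if pattern.search(text)]

        print(check_style("The permit holder shall notify the agency and/or the board."))
        # -> ["ambiguous 'and/or'"]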

  5. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
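
    The abstract names identification trees and mitigation trees as AutSEC's core data structures. The sketch below shows one plausible shape for them; all field names and the matching logic are assumptions for illustration, not taken from the paper.

        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class MitigationNode:
            technique: str                      # e.g. "parameterized queries"
            cost: float                         # relative implementation cost
            children: List["MitigationNode"] = field(default_factory=list)

        @dataclass
        class IdentificationNode:
            condition: str                      # design property that must hold
            threat: Optional[str] = None        # threat raised when it does
            children: List["IdentificationNode"] = field(default_factory=list)
            mitigations: List[MitigationNode] = field(default_factory=list)

        def identify_threats(node, design_facts):
            """Walk an identification tree, collecting threats whose conditions hold."""
            if node.condition not in design_facts:
                return []
            found = [node.threat] if node.threat else []
            for child in node.children:
                found.extend(identify_threats(child, design_facts))
            return found

        root = IdentificationNode(
            "external input reaches database", threat="SQL injection",
            mitigations=[MitigationNode("parameterized queries", cost=1.0)])
        print(identify_threats(root, {"external input reaches database"}))
        # -> ['SQL injection']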

  6. Rapid Evidence Assessment of the Literature (REAL(©)): streamlining the systematic review process and creating utility for evidence-based health care.

    Science.gov (United States)

    Crawford, Cindy; Boyd, Courtney; Jain, Shamini; Khorsan, Raheleh; Jonas, Wayne

    2015-11-02

    Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias, and it can be difficult to interpret the results of a SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high-quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims at answering the question of "What works?" in health care. The REAL process (1) tailors a standardized search strategy to a specific and relevant research question developed with various stakeholders to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion; and (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice. Online tools and quality assurance processes are utilized at each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process. REAL is a rapid SR process developed to streamline and aid in the rigorous and reliable evaluation and review of claims in health care in order to make evidence-based, informed decisions, and it has been used by a variety of organizations aiming to gain insight into "what works" in health care. Using the REAL system allows for the facilitation of recommendations on appropriate next steps in policy, funding

  7. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    Science.gov (United States)

    Popa, L.; Popa, V.

    2017-08-01

    The article focuses on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation controller. The modeling and simulation procedures address specific problems in developing a new type of technical product in the field of robotics. In doing so, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions at a variety of high levels of complexity.

  8. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit method for a control-oriented, physics-based diesel engine combustion model. The method is based on the combination of a dedicated measurement procedure and a structured approach to fitting the required combustion model parameters. Only a data set is required that is
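
    The combustion model itself is not given here, so the sketch below substitutes a standard empirical stand-in (a Wiebe heat-release function) to show the structured parameter-fit idea: compute residuals against measured data and fit by least squares. All data and parameter values are synthetic.

        import numpy as np
        from scipy.optimize import least_squares

        def wiebe_burn_fraction(theta, theta0, delta, a=5.0, m=2.0):
            """Wiebe function: a standard empirical heat-release shape."""
            x = np.clip((theta - theta0) / delta, 0.0, None)
            return 1.0 - np.exp(-a * x ** (m + 1))

        # Synthetic "measurement" standing in for the dedicated test procedure.
        theta = np.linspace(-20.0, 60.0, 200)               # crank angle, deg
        rng = np.random.default_rng(0)
        measured = wiebe_burn_fraction(theta, -5.0, 45.0) + rng.normal(0, 0.01, theta.size)

        def residuals(params):
            theta0, delta = params
            return wiebe_burn_fraction(theta, theta0, delta) - measured

        fit = least_squares(residuals, x0=[0.0, 40.0])      # structured parameter fit
        print(fit.x)                                        # ~[-5.0, 45.0]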

  9. Automated model fit method for diesel engine control development

    NARCIS (Netherlands)

    Seykens, X.L.J.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.J.H.

    2014-01-01

    This paper presents an automated fit method for a control-oriented, physics-based diesel engine combustion model. The method is based on the combination of a dedicated measurement procedure and a structured approach to fitting the required combustion model parameters. Only a data set is required that is

  10. Use of automation and robotics for the Space Station

    Science.gov (United States)

    Cohen, Aaron

    1987-01-01

    An overview is presented of the various possible applications of automation and robotics technology to the Space Station system. The benefits of such technology to the private sector and the national economy are addressed. NASA's overall approach to incorporating advanced technology into the Space Station is examined.

  11. Identifying novel drug indications through automated reasoning.

    Directory of Open Access Journals (Sweden)

    Luis Tari

    With the large amount of pharmacological and biological knowledge available in the literature, finding novel drug indications for existing drugs using in silico approaches has become increasingly feasible. Typical literature-based approaches generate new hypotheses in the form of protein-protein interaction networks by linking concepts based on their co-occurrence within abstracts. However, such approaches tend to generate too many hypotheses, and identifying new drug indications from large networks can be a time-consuming process. In this work, we developed a method that acquires the necessary facts from literature and knowledge bases and identifies new drug indications through automated reasoning. This is achieved by encoding the molecular effects caused by drug-target interactions, and their links to various diseases and drug mechanisms, as domain knowledge in AnsProlog, a declarative language that is useful for automated reasoning, including reasoning with incomplete information. Unlike other literature-based approaches, our approach is more fine-grained, especially in identifying indirect relationships for drug indications. To evaluate the capability of our approach to infer novel drug indications, we applied our method to 943 drugs from DrugBank and asked whether any of these drugs have potential anti-cancer activities based on information about their targets and molecular interaction types alone. A total of 507 drugs were found to have the potential to be used for cancer treatments. Among the potential anti-cancer drugs, 67 of 81 drugs (a recall of 82.7%) are indeed known cancer drugs. In addition, 144 of 289 drugs (a recall of 49.8%) are non-cancer drugs currently tested in clinical trials for cancer treatments. These results suggest that our method is able to infer drug indications (original or alternative) based on their molecular targets and interactions alone and has the potential to discover novel drug indications for
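
    The paper encodes its domain knowledge in AnsProlog. As a rough Python stand-in, the sketch below forward-chains a single invented rule (a drug inhibiting a protein that is upregulated in a disease suggests a candidate indication) over toy facts; the relation names, facts, and rule are illustrative only.

        # Toy knowledge base: (relation, subject, object) triples.
        facts = {
            ("inhibits", "drugX", "proteinY"),
            ("upregulated_in", "proteinY", "cancerZ"),
            ("inhibits", "drugQ", "proteinR"),
        }

        def infer_indications(facts):
            """If a drug inhibits a protein upregulated in a disease,
            hypothesize the drug as a candidate treatment for that disease."""
            indications = set()
            for rel1, drug, protein in facts:
                if rel1 != "inhibits":
                    continue
                for rel2, prot, disease in facts:
                    if rel2 == "upregulated_in" and prot == protein:
                        indications.add((drug, disease))
            return indications

        print(infer_indications(facts))  # {('drugX', 'cancerZ')}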

  12. A Software Architecture for Simulation Support in Building Automation

    Directory of Open Access Journals (Sweden)

    Sergio Leal

    2014-07-01

    Building automation integrates the active components in a building and, thus, has to connect components from different industries. The goal is to provide reliable and efficient operation. This paper describes how simulation can support building automation and how the deployment process of simulation-assisted building control systems can be structured. We look at the process as a whole and map it to a set of formally described workflows that can be partly automated. A workbench environment supports the process execution by means of improved planning, collaboration and deployment. This framework allows the integration of existing tools as well as manual tasks and is, therefore, far more intricate than regular software deployment tools. The complex environment of building commissioning requires expertise in different domains, especially lighting, heating, ventilation, air conditioning, measurement and control technology, as well as energy efficiency; we therefore present a framework for building commissioning and describe a deployment process capable of supporting the various phases of this approach.
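
    A small sketch of the notion of formally described, partly automatable workflows: each step carries an optional automated action, and manual steps pause for sign-off. The step names and executor are invented, not drawn from the paper's framework.

        from dataclasses import dataclass
        from typing import Callable, Optional

        @dataclass
        class WorkflowStep:
            name: str
            automated: Optional[Callable[[], None]] = None  # None => manual task

        def run_workflow(steps):
            """Execute automated steps; pause for sign-off on manual ones."""
            for step in steps:
                if step.automated is not None:
                    step.automated()
                    print(f"[auto]   {step.name}: done")
                else:
                    print(f"[manual] {step.name}: awaiting engineer sign-off")

        run_workflow([
            WorkflowStep("import building model", lambda: None),
            WorkflowStep("co-simulate control strategy", lambda: None),
            WorkflowStep("commissioning walk-through"),  # manual task
        ])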

  13. The Molecular Industrial Revolution: Automated Synthesis of Small Molecules.

    Science.gov (United States)

    Trobe, Melanie; Burke, Martin D

    2018-04-09

    Today we are poised for a transition from the highly customized crafting of specific molecular targets by hand to the increasingly general and automated assembly of different types of molecules with the push of a button. Creating machines that are capable of making many different types of small molecules on demand, akin to that which has been achieved on the macroscale with 3D printers, is challenging. Yet important progress is being made toward this objective with two complementary approaches: 1) Automation of customized synthesis routes to different targets by machines that enable the use of many reactions and starting materials, and 2) automation of generalized platforms that make many different targets using common coupling chemistry and building blocks. Continued progress in these directions has the potential to shift the bottleneck in molecular innovation from synthesis to imagination, and thereby help drive a new industrial revolution on the molecular scale. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and the time required for measuring EFV (including software processing time and manual optimization time) were recorded for each method. Intraobserver and interobserver reliability was assessed on the prototype software measurements. The t test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001), and intraobserver and interobserver reliability was high (> 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)
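
    A brief sketch of the agreement analysis named in the abstract: Spearman's rho plus Bland-Altman bias and limits of agreement, run on synthetic volumes. Only the group means quoted above come from the study; everything else here is invented for illustration.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(1)
        manual = rng.normal(65.8, 37.0, 70).clip(min=5.0)    # EFV_M, mL
        automated = manual - 4.9 + rng.normal(0.0, 5.0, 70)  # EFV_A, mL

        rho, p = spearmanr(manual, automated)                # rank correlation
        diff = automated - manual
        bias = diff.mean()                                   # Bland-Altman bias
        loa = (bias - 1.96 * diff.std(ddof=1),               # limits of agreement
               bias + 1.96 * diff.std(ddof=1))
        print(f"rho={rho:.3f}, p={p:.2g}, bias={bias:.1f} mL, LoA={loa}")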

  15. Automated baseline change detection phase I. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.
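
    A minimal sketch of absolute change detection by image subtraction as described above, assuming grayscale images captured from the same registered camera pose (which the ABCD approach's precise repositioning provides). The noise threshold and minimum region size are invented tuning parameters.

        import numpy as np

        def detect_changes(baseline: np.ndarray, current: np.ndarray,
                           noise_threshold: int = 25, min_region_px: int = 50) -> bool:
            """Flag a barrel when the current image differs visibly from its baseline."""
            diff = np.abs(current.astype(np.int16) - baseline.astype(np.int16))
            changed = diff > noise_threshold            # suppress sensor noise
            return int(changed.sum()) >= min_region_px  # any visible physical change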

  16. Automated baseline change detection phase I. Final report

    International Nuclear Information System (INIS)

    1995-12-01

    The Automated Baseline Change Detection (ABCD) project is supported by the DOE Morgantown Energy Technology Center (METC) as part of its ER&WM cross-cutting technology program in robotics. Phase 1 of the Automated Baseline Change Detection project is summarized in this topical report. The primary objective of this project is to apply robotic and optical sensor technology to the operational inspection of mixed toxic and radioactive waste stored in barrels, using Automated Baseline Change Detection (ABCD), based on image subtraction. Absolute change detection is based on detecting any visible physical changes, regardless of cause, between a current inspection image of a barrel and an archived baseline image of the same barrel. Thus, in addition to rust, the ABCD system can also detect corrosion, leaks, dents, and bulges. The ABCD approach and method rely on precise camera positioning and repositioning relative to the barrel and on feature recognition in images. In support of this primary objective, there are secondary objectives to determine DOE operational inspection requirements and DOE system fielding requirements.

  17. Automated data mining: an innovative and efficient web-based approach to maintaining resident case logs.

    Science.gov (United States)

    Bhattacharya, Pratik; Van Stavern, Renee; Madhavan, Ramesh

    2010-12-01

    Use of resident case logs has been considered by the Residency Review Committee for Neurology of the Accreditation Council for Graduate Medical Education (ACGME). This study explores the effectiveness of a data-mining program for creating resident logs and compares the results to a manual data-entry system. Other potential applications of data mining for enhancing resident education are also explored. Patient notes dictated by residents were extracted from the Hospital Information System and analyzed using an unstructured-data mining program. History, examination, and ICD codes were gathered for a 30-day period and compared to manual case logs. The automated method extracted all resident dictations with the dates of encounter and transcription. The automated data miner processed information from all 19 residents, while only 4 residents logged manually. The manual method identified only broad categories of diseases; the major categories were stroke or vascular disorder, 53 (27.6%); epilepsy, 28 (14.7%); and pain syndromes, 26 (13.5%). With the automated method, epilepsy, 114 (21.1%); cerebral atherosclerosis, 114 (21.1%); and headache, 105 (19.4%) were the most frequent primary diagnoses, and headache, 89 (16.5%); seizures, 94 (17.4%); and low back pain, 47 (9%) were the most common chief complaints. More detailed patient information, such as tobacco use, 227 (42%); alcohol use, 205 (38%); and drug use, 38 (7%), was extracted by the data-mining method. Manual case logs are time-consuming, provide limited information, and may be unpopular with residents. Data mining is a time-effective tool that may aid in the assessment of resident experience or of the ACGME core competencies, or in resident clinical research. More study of this method in larger numbers of residency programs is needed.
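
    An illustrative sketch of mining a dictated note for ICD codes and documented risk-factor mentions. The study's unstructured mining program is not described in detail, so the patterns and output fields here are assumptions.

        import re

        ICD9_PATTERN = re.compile(r"\b\d{3}\.\d{1,2}\b")  # e.g. 345.90
        RISK_FACTORS = {"tobacco": r"tobacco|smok", "alcohol": r"alcohol|etoh"}

        def mine_dictation(note: str) -> dict:
            """Pull ICD codes and documented risk-factor mentions from one note."""
            return {
                "icd_codes": ICD9_PATTERN.findall(note),
                "risk_factors": [name for name, pat in RISK_FACTORS.items()
                                 if re.search(pat, note, re.IGNORECASE)],
            }

        note = "Chief complaint: seizure. Tobacco use: 1 pack/day. ICD 345.90."
        print(mine_dictation(note))
        # -> {'icd_codes': ['345.90'], 'risk_factors': ['tobacco']}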

  18. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  19. Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    Science.gov (United States)

    Feary, Michael S.; Roth, Emilie

    2014-01-01

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a new class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008; IAEA, 2001; FAA, 2013; ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being introduced. We describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, along with some approaches that could be used to address them, drawing on experience with the aviation, spaceflight and nuclear power domains.

  20. Automated concept and relationship extraction for the semi-automated ontology management (SEAM) system.

    Science.gov (United States)

    Doing-Harris, Kristina; Livnat, Yarden; Meystre, Stephane

    2015-01-01

    We develop medical-specialty-specific ontologies that contain the settled science and common term usage. We leverage current practices in information and relationship extraction to streamline the ontology development process. Our system combines different text types with information and relationship extraction techniques in a low-overhead, modifiable system. Our SEmi-Automated ontology Maintenance (SEAM) system features a natural language processing pipeline for information extraction. Synonym and hierarchical groups are identified using corpus-based semantics and lexico-syntactic patterns. The semantic vectors we use are term frequency-inverse document frequency (TF-IDF) vectors and context vectors. Clinical documents contain the terms we want in an ontology. They also contain idiosyncratic usage and are unlikely to contain the linguistic constructs associated with synonym and hierarchy identification. By including both clinical and biomedical texts, SEAM can recommend terms from those appearing in both document types. The set of recommended terms is then used to filter the synonyms and hierarchical relationships extracted from the biomedical corpus. We demonstrate the generality of the system across three use cases: ontologies for acute changes in mental status, Medically Unexplained Syndromes, and echocardiogram summary statements. Across the three use cases, we held the number of recommended terms relatively constant by changing SEAM's parameters, as experts seem to find more than 300 recommended terms overwhelming. The approval rate of recommended terms increased as the number and specificity of clinical documents in the corpus increased: it was 60% when there were 199 clinical documents not specific to the ontology domain and 90% when there were 2879 documents very specific to the target domain. Experts also preferred fewer than 100 recommended synonym groups. Approval rates for synonym recommendations remained low, varying from 43% to 25% as the
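
    A rough sketch of the corpus-based semantics step named above: TF-IDF vectors compared by cosine similarity to propose candidate synonym/hierarchy groupings for expert review. The corpus and threshold are invented, and SEAM's actual pipeline (context vectors, lexico-syntactic patterns) is richer than this.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["patient reports acute confusion and delirium",
                "delirium with fluctuating confusion noted overnight",
                "echocardiogram shows mildly reduced ejection fraction"]

        tfidf = TfidfVectorizer().fit_transform(docs)   # TF-IDF term weights
        sims = cosine_similarity(tfidf)                 # pairwise similarity

        # Documents above a similarity threshold share vocabulary and can seed
        # candidate synonym/hierarchy groups for expert review.
        THRESHOLD = 0.2
        pairs = [(i, j) for i in range(len(docs)) for j in range(i + 1, len(docs))
                 if sims[i, j] > THRESHOLD]
        print(pairs)  # -> [(0, 1)]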